https://jackrusher.com/strange-loop-2022/

My talk today is Stop Writing Dead Programs. This is sort of the thesis statement for the talk, even though it's 40 years old: this Seymour Papert quote saying that we're still digging ourselves into a kind of a pit by continuing to preserve practices that have no rational basis beyond being historical.

I will start with a somewhat personal journey in technology. I'm going to ask you for some feedback at different places, so first off, by applause: how many of you know what this is? Okay, okay, that's actually more than I expected. Now, how many of you actually used one of these? Okay. So what I can say is that I am part of the last generation of people who were forced to use punch cards at school. I still had to write Fortran programs with punch cards, and this thing is a card punch. It's like a keyboard, except when you press the keys you're actually making holes in a piece of paper, and then you feed them into this thing in the back. The pieces of paper look like this. Each one of these vertical columns is basically a byte, and you're stabbing through the different bits of the byte to indicate what letter it is. If you look at the top left corner, you see Z(1) = Y + W(1). A card is one line of code. Something to notice about this card: it's 80 columns wide. We're going to come back to that later.

Some commenters were confused that we still used punched cards in the 80s, when display terminals already existed.
This was in the context of a required class for engineering students to prepare them for the possibility that they would encounter punch cards in the wild. Most of us never did, beyond this one class.

This design dates from 1928. This is a Hollerith punch card, the same one used forever. Now, what does a program look like if you're programming like this? It looks like this: it's a deck. Notice the rubber band. When you're doing this, you live in terminal fear that you will drop the deck of cards. It is a terrible experience re-sorting the cards. That long diagonal stripe is there so that the person who made this particular deck could put it back together without having to look at every single line in the process. The words written on the top of the deck indicate where different subroutines are located within the program.

Now, to give you a sense of how long these programs can get, there's this picture (forgive me, it's a low quality picture). This is the actual reader I used, and in the front there is an actual program I wrote: the lower right hand corner one, which was a Fortran program to simulate rocket flight, because my particular school had a connection to NASA and we did a lot of rocket-y things. Can you imagine how long it took me to punch all these and put them in there? What we would do is give them to a system operator, who would feed them into a computer. In this case, the computer I personally used was this one: a VAX-11/780.
This machine cost nearly a million dollars, had 16 megabytes (that's megabytes) of RAM, and ran at 5 megahertz (that's megahertz!). This thing in front of me here is thousands of times more powerful than the machine that I was using then, the machine the whole campus was using to do these kinds of things. And what would the output look like from sending this enormous deck of cards in? Well, it would come out on a line printer that looks like this. And you wouldn't get it right away; an operator would give it to you later. Note the vintage haircuts. The fellow in the middle there is the actual operator who was handing me these outputs, and he's the person who gave me these photos of this equipment.

So this process, as you can imagine, was hard, but it was hard in a dumb way. Some things are hard because they have to be, and I really support the idea of overcoming challenges and doing hard things, but this was hard for reasons that had nothing to do with the actual problem you're trying to solve. You're working on something with a rocket and a simulation, and you're thinking about not dropping your punch card deck, and it's taking you forever to find out what happened. So it really hinges on your ability to emulate the computer in your head, because the computer's not going to help you in any way. There's no editor, there's nothing. And that in turn hinges on working memory, which is something that is not very well distributed among humans.
There were a small number of us for whom this whole thing came pretty naturally, and we were treated as special people, as kind of high priests with magical powers, and this is how we came to think of ourselves: that we're special because we can make it work. But the truth is we were less priests like this than we were monks like this, hitting ourselves in the head.

The problem is, as Peter Harkins mentions here, that programmers have this tendency, once they master something hard (often pointlessly hard), rather than then making it easy, to feel proud of themselves for having done it and just perpetuate the hard nonsense. And I'm going to argue that a lot of what we still do today is very much like what I was doing on that old VAX.

For one thing, there's a lot of batch processing going on. And what's wrong with batch processing? Hella long feedback loops. It's no good, it takes you forever. It took me 45 minutes to find out what a one-card change would do in the printout that I would get back, because that was the loop. You're thinking: well, it's not like that for us, right? We're not poking holes in paper cards; we have display terminals! But how many of you have compile cycles that can take 45 minutes? Famously, the Go team wrote Go because they were so angry about waiting an hour to see what was going to happen with some C++ code they were running on some horrible giant Google codebase.
Maybe you want to deploy your stuff and see if it works, because we're all running web apps now. So do you stuff it in a Docker container, and then ship it out to the cloud and wait for a CI job? How long does that take? Two hours for this guy! I mean, why do we tolerate this? This is crazy! Docker shouldn't exist. It exists only because everything else is so terribly complicated that they added another layer of complexity to make it work. It's like they thought: if deployment is bad, we should make development bad too. It's just... it's not good.

So, what kind of things do we inherit from this way of thinking about the world? We get funny ideas about time and state built into programming. Ideas like: there should be a compile/run cycle. This is a terrible idea, but it's an ancient one, that you're going to compile the thing, get an artifact, and run the artifact over there, and those two things are completely different phases of your process. There's going to be linear execution: most programming languages assume that there's only one thread and you're going to run straight through from the beginning to the end; that your program is going to start up from a blank state and then run to termination. Now, how many programs that we actually write do that? We'll revisit that in a moment. This really only works if your program is some kind of input/output transformer.
So there's no runtime introspection, because runtime is happening over there and your actual work is happening over here, and you just have to kind of guess from what happened how it might be related to your code. And if there's a bug, well, sorry: failures just halt your program. You get maybe a core dump, or you get a log message somewhere with a stack trace in it.

Now, what kind of programs do we really write? Mostly long-lived servers. I've got server processes with uptimes of a thousand days. They don't work the same way /usr/bin/sort works. I don't want a process optimized for writing that. We also write GUI programs, which are even more intense than this. You've got all of these different kinds of input coming into the program: it's talking to the keyboard, it's talking to the mouse, it's talking to the network; if it's Zoom, it's talking to the camera and the microphone. It's crazy. So this approach to programming just doesn't work well for the things we actually build.

It also infected programming language theory. If the program is a static artifact, what does that mean? It means we're mostly going to concentrate on algebraics, so we're going to talk about syntax and semantics and very little else.
There's going to be no concern, really, for pragmatics. What I mean here by pragmatics is what it's actually like to interact with your programming environment. And this leads to mathematics envy and a real fixation on theorem proving.

So, to give an example of what happens when people actually concentrate on a part of programming and make progress, we're going to take a quick tour through syntax and semantics. We're going to do a simple transformation here: we've got 1 through 4, and we want it to be 2 through 5. We want it to be relatively general. I've written some example programs that do this in a variety of programming languages. The first one here is in ARM64 machine language, because my laptop happens to run this processor now. As you can plainly see from this code, it starts off... oh wait! Does everyone here understand ARM64? Okay, all right, it's a little easier if I do this, so you can see where the instructions are within these different words. This is a cool instruction set. It's not like x86, where the instructions are all different lengths. In ARM64 they're all the same length, because it's RISC. But we'll do it in assembly language; it'll be easier. So we'll start with this label here, add one, and we've got the signature of what it would be as a C program after that. What am I actually doing when I write this program? Well, the first thing I'm doing is moving things from registers onto the stack. Why am I doing this?
I'm doing this because the ABI says I have to. No other reason. It has nothing to do with my problem. And then I want to call malloc, because I have to allocate some memory to return the new array with the new stuff in it. So what I have to do... I'm doing crazy things. Look down here: you see the registers are all called with X names? That's because the 64-bit registers are X, but I get down here to set up for malloc and now I'm using W names. Why? Well, I just have to know that I have to do something special if it's a 32-bit number, and it'll mask off 32 of the bits and still work great. Now I have to stuff things in these registers. I have to multiply one of the variables. Do I use a multiply for that? No, I do it with a bit-shifting operation, because that's what's faster on this processor. And then I call malloc, and I get back what I want. Great. Now I want a loop. This is what a loop looks like. Notice we're on the second page, and all I'm doing is incrementing some numbers. So I come through and I do a comparison: okay, is this register that I put this value into zero? If it's less/equal, then I jump to return. You can't see return; it's on another page. There's a third page. So I move zero into this other register and I go through here, and bang, bang... I'm not going to bore you with the whole thing. I'm bored just talking about it. Imagine how I felt writing it!
And then at the end I have to do the reverse of the things I did at the beginning, to set everything back into the registers from the stack where I saved them. Why? Because I have to have the right return address to give this thing back. I have to do this like a voodoo incantation, because it's what the processor wants. Nothing to do with the problem I'm trying to solve.

How can we do it better? Hey, look, it's C! This is exactly the same program. Many fewer lines of code. However, it has a load of problems that have nothing to do with what I'm trying to accomplish as well. For one, I have to pass two things: I have to pass the length of the array separately from the array. Why? Because there's no sequence type in C. Great work, guys! Then, from there, I want to return this value, this modified sequence. And what do I have to do? Well, I had to do this in assembly too, but this is crazy: I have to allocate memory, give it back, and then hope that the other guy is going to free that memory later. This has nothing to do with what I'm trying to accomplish. I want to increment each of these numbers. I do it with a for loop that counts from one to the length of the array. Is counting to the length of the array relevant? No. No, this is not relevant. In fact, essentially one line of code in this whole thing, the actual increment, is the only thing that actually matters.
On the other hand, I can compliment C as a portable assembly language, because you see I don't have to do the stack nonsense by hand, and instead of telling it that the element is four bytes wide I can actually use sizeof to know that. But that's about the only way it's really an improvement.

Now let's look at Lisp. Note that Lisp is about 10 years older than C. Here I have a sequence abstraction: I have four numbers, and I can use a higher-order function to go over them and add one to each. This is a tremendous improvement, achieved by going back in time.

But we can do better than this notation. We can go to Haskell. So, in Haskell, what do we have? This is really lovely. We have this thing where we auto-curry the (+ 1), and we get a function that adds one. This is getting pretty concise. Can anybody here quickly name for me a language in which this exact operation is even more concise? I'll give you a moment. I hear APL, and indeed, APL! So here we have rank polymorphism. I have a single number, a scalar, and I have a set of numbers. Note that there's no stupid junk: I don't have to put commas between everything, I don't have to wrap anything in any special delimiters or anything of this nature. I just say add one to these numbers, and I get what I was after. So if we start from the assembly language and come to the APL, which is, again, around eight years older than C, we find that syntax and semantics can take us a long way.
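The examples in the talk are in ARM64 assembly, C, Lisp, Haskell, and APL, which I can't reproduce here; purely as an illustration of the same climb in abstraction, here is the transformation written twice in Python: once in the C style, where bookkeeping dominates, and once with a higher-order function, where only the increment survives.

```python
# The same "add one to each element" transformation at two levels of
# abstraction, using the talk's example input: [1, 2, 3, 4] -> [2, 3, 4, 5].

def add_one_c_style(xs):
    # C-style: allocate a result buffer and count to the length of the
    # sequence by hand. None of this bookkeeping is part of the problem.
    result = [0] * len(xs)
    for i in range(len(xs)):
        result[i] = xs[i] + 1
    return result

def add_one_higher_order(xs):
    # Lisp/Haskell style: a higher-order function over a sequence
    # abstraction. The only thing left is the increment itself.
    return list(map(lambda x: x + 1, xs))

print(add_one_c_style([1, 2, 3, 4]))       # [2, 3, 4, 5]
print(add_one_higher_order([1, 2, 3, 4]))  # [2, 3, 4, 5]
```

Neither version needs a separate length parameter or manual memory management; that is the point about sequence abstractions.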
But there are other things that we care about where no one has put in this much effort, and one of those things is state and time. Almost every programming language does nothing to help us with managing state over time from multiple sources. There are some notable exceptions, and I will talk about them now.

So, Clojure. Because Rich Hickey really cared about concurrency, he included immutable data structures. So now you don't have constant banging on the same things and crushing each other's data. This is very helpful. What else? He's got atoms. These are synchronized mutable boxes with functional update semantics. Everybody uses these. These are great. He also has a full Software Transactional Memory implementation that frankly nobody uses, but it's still great. It just has a more complicated API, and the lesson from this is probably: if you want people to do the right thing, you have to give them an API simple enough that they really will.

Then on top of this, we have core.async. Now, I have less nice things to say about core.async. I like Communicating Sequential Processes the way everybody else does, but this is implemented as a macro, and as a consequence, when it compiles your CSP code you end up with something that you can't really look into anymore. Like, you can't ask a channel how many things are in that channel. You can't really know much about what's happening there.
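Clojure's real atoms are built on compare-and-swap, not locks; purely as a sketch of what "a synchronized mutable box with functional update semantics" means, here is a minimal lock-based stand-in in Python (the Atom class and its method names are mine, not Clojure's API).

```python
import threading

class Atom:
    """A synchronized mutable box: reads are plain, and every update is a
    pure function applied to the current value under a lock (a lock-based
    stand-in for Clojure's compare-and-swap retry loop)."""
    def __init__(self, value):
        self._value = value
        self._lock = threading.Lock()

    def deref(self):
        return self._value

    def swap(self, fn, *args):
        # Apply a pure function to the current value, atomically.
        with self._lock:
            self._value = fn(self._value, *args)
            return self._value

counter = Atom(0)
threads = [threading.Thread(target=lambda: [counter.swap(lambda v: v + 1)
                                            for _ in range(1000)])
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(counter.deref())  # 4000: no lost updates despite four writers
```

Because callers can only touch the value through a function, there is no window in which two threads can read-modify-write over each other.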
And I would say that on the JVM, I agree with what Rich said the year before he created core.async, which is that you should probably just use the built-in concurrent queues. Now, in ClojureScript, of course, these things were more useful, because everyone was trapped in callback hell. We'll see what happens moving on, now that we have async/await in JavaScript.

Moving on to another implementation of CSP: Go. Go actually did something good here, and I'm not going to say much else that's great about Go.

The Go team includes several all-time great programmers. I respect them all. But I do feel that they had a chance to be more ambitious than they were with Go, which, with the weight of their reputations and the might of Google behind it, could have shifted the culture in a better direction.

They built a fantastic runtime for this stuff. It's really lightweight, and it does a great job. The bad news is that Go is a completely static language, so even though you should be able to go in and ask all of these questions at runtime while you're developing, from within your editor, like a civilized person, you can't. You end up with a static artifact. Well, that's a bummer. Okay.

And I would say, actually, before I move on, that anytime you have this kind of abstraction, where you have a bunch of threads running, where you have processes doing things, you really want ps and you really want kill. And, unfortunately, neither Go nor Clojure can provide these, because their runtimes don't believe in them.
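Since part of the complaint is opacity — you can't ask a macro-compiled channel how many things are in it — it's worth noting that an ordinary concurrent queue answers exactly that question. A hedged sketch of CSP-style communication over Python's standard-library queue (all names here are mine):

```python
import queue, threading

channel = queue.Queue(maxsize=8)   # a bounded channel between two processes

def producer():
    for i in range(5):
        channel.put(i)             # blocks if the channel is full
    channel.put(None)              # sentinel: no more values

received = []

def consumer():
    while (item := channel.get()) is not None:
        received.append(item)

# Unlike an opaque compiled channel, a plain queue is introspectable:
# you can ask how many items are waiting at any moment.
print(channel.qsize())             # 0 before anything has been sent

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start(); t1.join(); t2.join()
print(received)                    # [0, 1, 2, 3, 4]
```

This is roughly the "just use the built-in concurrent queues" position: less elegant than CSP notation, but the running system stays inspectable.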
The JVM runtime itself thinks that if you kill a thread you're going to leak some resources, and that the resources you leak may include locks that you need to free up other threads running elsewhere, so they've just forbidden the whole thing. And in Go you have to send it a message, open a separate channel, blah blah blah.

Erlang, on the other hand, gets almost everything right in this area. They've implemented the actor model, and they've done it in a way where you have a live interactive runtime, and because they're using shared-nothing state and supervision trees, you can kill anything anytime and your system will just keep running. This is fantastic. This is great. Why doesn't everything work like this? It also comes with introspection tools, like Observer, that should make anyone using any other platform to build a long-running server thing fairly jealous. Now, when I say this, I'm not telling you you should use Erlang. What I'm telling you is that whatever you use should be at least as good as Erlang at doing this, and if you're developing a new language, for God's sake, please take notice.

I can talk now about something that I worked on with my colleague Matt Huebert. This is something that I particularly like. This is a hack in ClojureScript.

The cells project was Matt's baby. He did almost all the coding. I worked with him as a mentor because I had already implemented a number of dataflow systems.

We call it cells, and it takes spreadsheet-like dataflow and adds it to ClojureScript. This resulted in a paper that was delivered at the PX16 workshop at ECOOP in Rome in 2016. You've got things like this: you say, here's an interval, every 300 milliseconds give me another random integer, and it does. And then you can have another thing refer to that, in this case consing them on, and now we build a history of all the random integers that have happened. What else can you do? Well, you can refer to that, (take 10) with normal Clojure semantics, and then map that out as a bar chart. What do you get? A nice graph. A graph that moves in real time. Or we can move on to this: we added sort of Bret Victor-style scrubbers, so that you could do these kinds of things. I'll show you instead of telling you, because it's obvious what's going on if you look at it. We did this partially to show people that you can really just program with systems that have all those features that Bret was demoing. The source code's still out there; anybody who wants to do that, you can do that.

We moved on from that to maria.cloud, which takes all of that code we wrote for cells and turns it into a notebook. We actually did this for learners.

Maria was a joint project of Matt, Dave Liepmann, and myself. We wanted a good teaching environment that requires no installfest for ClojureBridge.

Take a look at this. This is a computational notebook. It has the cells, it gives you much better error messages than default Clojure, and so on.
We used this to teach. It was a great experience, and currently, this year, thanks to Clojurists Together, we have some additional funding to bring it up to date and keep it running. I encourage everybody to check it out.

The last thing here on this list is the propagators. The propagators come from Sussman. This is Sussman's project from around the same time that actors were happening and Alan Kay was first getting interested in Smalltalk. This was a really fertile scene at MIT in the early 70s. It was actually the project for which he originally hired Richard Stallman, of the GNU project, as a grad student, and he later did some additional work with Alexey Radul, which expanded the whole thing. I can't tell you all about it here; there's just too much to say. But I can tell you there was a fantastic talk at the 2011 Strange Loop called We Really Don't Know How to Compute!, and I recommend that you watch it when you get out of Strange Loop. Just go home and watch that talk. It's amazing.

A side note is that the propagator model was used by one of the grad students at MIT at the time to make the very first spreadsheet: VisiCalc was based on this model. This is a really useful abstraction that everyone should know about. It's dataflow based, it does truth maintenance, and it keeps provenance of where all of the truth maintenance system's conclusions came from, which means it's probably going to be very valuable for explainable AI later.
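Sussman and Radul's propagator model is far richer than anything that fits here; as a hedged sketch of just two of its ideas — cells that accumulate information from any direction, with provenance recording where each conclusion came from — here is a minimal Python toy (all names mine, and no real truth maintenance):

```python
class PCell:
    """A propagator-network cell: holds a value (None means 'nothing yet')
    plus the provenance of how that value was concluded."""
    def __init__(self, name):
        self.name, self.value, self.provenance = name, None, None
        self.watchers = []

    def add_content(self, value, provenance):
        if self.value is None:
            self.value, self.provenance = value, provenance
            for w in self.watchers:
                w()
        elif self.value != value:
            raise ValueError(f"contradiction in {self.name}")

def propagator(inputs, output, fn, label):
    # Fire whenever every input has content, recording which rule and
    # which inputs the conclusion came from (the provenance).
    def watcher():
        if all(c.value is not None for c in inputs):
            output.add_content(fn(*[c.value for c in inputs]),
                               (label, [c.name for c in inputs]))
    for c in inputs:
        c.watchers.append(watcher)
    watcher()

# Fahrenheit <-> Celsius, running in whichever direction has data.
f, c = PCell("fahrenheit"), PCell("celsius")
propagator([f], c, lambda x: (x - 32) * 5 / 9, "f->c")
propagator([c], f, lambda x: x * 9 / 5 + 32, "c->f")

c.add_content(100.0, ("user", []))
print(f.value)       # 212.0
print(f.provenance)  # ('c->f', ['celsius']): how this was concluded
```

Note that the network is multidirectional: feeding either cell fills in the other, and each cell can say where its answer came from, which is the property that makes the model interesting for explainability.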
There are a number of other approaches that I really like, but which I didn't have time to get into here. FrTime, from the Racket community, is great. In terms of formalisms for reasoning about this sort of thing, I really like the π-calculus.

We'll move now to another area where there's been even less progress. In fact, we're getting to the absolute nadir of progress here, and that's program representation. Let's look at that punch card again: 80 columns, there it is. Now look at this. This is the output of a teletype. Notice that it is fixed width and approximately 80 columns. Notice that the fonts are all fixed width. This is the teletype in question. It looks like it should be in a museum, and it should be in a museum, and, in fact, it is in a museum.

Then we got these. This is the terminal on which I did a lot of hacking on that VAX you saw earlier (when I wasn't forced to use punch cards), and a lot of that was in languages like VAX Pascal, but also BLISS, which was pretty cool. You'll notice that this is a VT100 terminal. All of you are using machines today that have terminal emulators that pretend to be this terminal; that's why they have VT100 escape codes, because those escape codes first shipped on this terminal.

Now we'll move on to another terminal. This is the one that I used when I was doing all of my early Unix hacking back in the 80s. It's called an ADM-3A. Now, by applause, how many of you use an editor that has vi key bindings? Come on! Yeah, all right, yeah.
So then you might be interested in the keyboard of the ADM-3A, which was the one that Bill Joy had at home to connect to school through a modem while he was writing vi. So here it is. Note the arrow keys on h-j-k-l. They are there because those are the ASCII control codes to move the roller and the printhead on the old teletype that you saw a moment ago. You'd hit control plus those keys to control a teletype.

We used to use CTRL-h to back up over printed characters and then type a sequence of dashes as a strikethrough on these old printers. We also used the same trick on display terminals to make fancy spinning cursors.

It happened to have the arrow keys, so he used them. Look where the control key is: for all you Unix people, it's right next to the A. To this day, on this supercomputer here, I bind the caps lock key to control, because it makes my life easier on the Unix machine that it is. Look up there, where the escape key is, by the Q. That's why we use escape to get into command mode in vi: because it was easily accessible. Now scan across the top row, just right of the 0. What's that? The * key, unshifted, gives you the :. That's why : does what it does in vi: because it was right there. And now the last one, for all the Unix people in the audience. In the upper right hand corner there's a button where, when you hit control and that button, it clears the screen and takes the cursor to the home position. If you did not hit control, but instead hit shift, you got the ~. Notice that tilde is right under HOME.
If 00:19:01.54 you're wondering why your home directory 00:19:02.93 is tilde whatever username, it's 00:19:04.31 because of this keyboard. 00:19:07.97 Now here is Terminal.app on my mega 00:19:09.71 supercomputer. Notice 80 columns of fixed 00:19:11.21 width type. Notice that when I look at 00:19:13.19 the processes they have ttys - that stands 00:19:14.99 for teletype. 00:19:15.00 This machine is cosplaying as a PDP-11. 00:19:19.54 Now, whenever I get exercised about this 00:19:23.45 and talk about it, somebody sends me this 00:19:25.78 blog post from Graydon Hoare. He's 00:19:27.77 talking [about how] he'll bet on text. He 00:19:29.57 makes good arguments. I love text. I use 00:19:35.73 text every day. Text is good! The thing 00:19:35.74 about it, though, is that the people who 00:19:42.40 send me this in support of text always 00:19:45.11 mean text like this - text like it came 00:19:46.54 out of a teletype - and never text like 00:19:48.04 Newton's Principia, never text like this 00:19:50.39 from Wolfgang Weingart. That is, these 00:19:52.43 people don't even know what text is 00:19:55.25 capable of! They're denying the 00:19:56.93 possibilities of the medium! 00:19:58.90 This is how I feel about that. I've 00:20:01.01 promised Alex I will not say anything 00:20:02.69 profane during this talk, so you will be 00:20:05.27 seeing this emoji again. 00:20:07.66 The reason I disagree with this position 00:20:08.87 is that the visual cortex exists, okay? 00:20:10.49 So this guy, this adorable little 00:20:12.65 fella, he branched off from our lineage 00:20:14.69 about 60 million years ago. Note the 00:20:16.66 little touchy fingers and the giant eyes, 00:20:18.47 just like we have. We've had a long time 00:20:21.28 with the visual cortex. It is very 00:20:23.09 powerful.
It is like a GPU accelerated 00:20:25.90 supercomputer of the brain, whereas the 00:20:28.37 part that takes in the words is like a 00:20:30.11 very serial, slow, single-thread CPU, and I 00:20:32.63 will give you all a demonstration right 00:20:34.07 now. 00:20:36.65 Take a look at this teletype-compatible 00:20:38.87 text of this data and tell me if any 00:20:42.47 sort of pattern emerges. Do you see 00:20:44.57 anything interesting? 00:20:46.31 Here it is plotted X/Y. Your brain knew 00:20:48.47 this was a dinosaur before you knew that 00:20:50.27 your brain knew this was a dinosaur. This dataset is Alberto Cairo's Datasaurus. 00:20:50.28 That is how powerful the visual 00:20:51.11 cortex is, and there are loads of people 00:20:53.81 who have spent literally hundreds of 00:20:56.45 years getting very good at this: data 00:20:58.66 visualization. If I gave you a table 00:20:59.99 talking about the troop strength of 00:21:03.35 Napoleon's march to and from Moscow, 00:21:05.09 you'd get kind of a picture. But if you 00:21:06.49 look at it like this, you know what kind 00:21:09.28 of tragedy it was. You can see right away. 00:21:11.39 This was 175 years ago, and we're still 00:21:12.77 doing paper tape. 00:21:14.57 Graphic designers - they know something. 00:21:16.31 They know a few things. For instance, they 00:21:18.52 know that these are all channels. These 00:21:20.45 different things: point, line, plane, 00:21:22.61 organization, asymmetry - these things 00:21:24.11 are all channels that get directly to 00:21:25.73 our brain, and there is no need to eschew 00:21:28.54 these forms of representation when we're 00:21:31.78 talking about program representation. 00:21:33.71 I recommend everyone in this audience 00:21:35.69 who hasn't already done so, go get 00:21:37.25 this 100-year-old book from Kandinsky 00:21:38.69 and get a sense of what's possible. 00:21:40.61 Here's one of his students working on 00:21:42.59 some notation. Look how cool that is!
Come 00:21:44.93 on! All right, so another thing with text 00:21:47.14 is that it's really bad at doing graphs 00:21:49.31 with cycles, and our world is full of 00:21:50.57 graphs with cycles. Here's a Clojure 00:21:52.13 notation idea of the taxonomy of 00:21:54.35 animals, including us and that cute little 00:21:56.02 tarsier. And it works fine because 00:21:57.35 it's a tree, and trees are really good at 00:21:59.81 containment - they can express containment in a 00:22:02.45 single acyclic manner. Now this 00:22:04.07 sucks to write down as text. This is the 00:22:05.57 Krebs cycle. Hopefully, all of you learned 00:22:07.43 this at school. If not, maybe read up on 00:22:09.52 it. 00:22:10.90 If you imagine trying to explain this 00:22:13.61 with paragraphs of text you would never 00:22:15.40 get anywhere. Our doctors would all fail. 00:22:17.75 We would all be dead. So instead, we draw 00:22:21.11 a picture. We should be able to draw 00:22:23.63 pictures when we're coding as well. 00:22:25.07 Here's the Periodic Table of the Elements. Look how 00:22:27.28 beautiful this is. This is 1976. We've 00:22:28.97 got all these channels working together to 00:22:30.59 tell us things about all these 00:22:32.02 different elements, and how they interact 00:22:33.59 with each other. 00:22:35.51 Another area that we've pretty much 00:22:36.71 ignored is pragmatics, and what I mean by 00:22:37.97 that - I'm borrowing it from linguistics, 00:22:39.77 because we've borrowed syntax and 00:22:41.93 semantics from linguistics - pragmatics is 00:22:43.49 the study of the relationship between a 00:22:45.35 language and the users of the language, 00:22:47.51 and I'm using it here to talk about 00:22:48.52 programming environments. 00:22:49.90 Specifically, I want to talk about 00:22:51.64 interactive programming, which is, I think, 00:22:53.14 the only kind of programming we should 00:22:55.01 really be doing.
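Back to the trees-versus-cycles point for a moment. A tree nests naturally as a text literal, but a cycle forces you into explicit names plus an edge list - which is exactly where plain text starts to fight you. A toy Python sketch, with grossly simplified biochemistry (just the main loop of intermediates):

```python
# A tree nests as a literal: containment alone carries the structure.
taxonomy = {"Primates": {"Tarsiers": {}, "Simians": {"Humans": {}}}}

# A cyclic graph cannot be written as pure nesting; text needs names
# plus edges. The Krebs cycle's main loop of intermediates as successors:
krebs = {
    "citrate": "isocitrate",
    "isocitrate": "alpha-ketoglutarate",
    "alpha-ketoglutarate": "succinyl-CoA",
    "succinyl-CoA": "succinate",
    "succinate": "fumarate",
    "fumarate": "malate",
    "malate": "oxaloacetate",
    "oxaloacetate": "citrate",   # ...and back around: a cycle
}

def loops(graph, start):
    """Follow successors from `start`; True if we ever come back around."""
    seen, node = set(), start
    while node not in seen:
        seen.add(node)
        node = graph.get(node)
        if node is None:
            return False
    return True

assert loops(krebs, "citrate")
```

The diagram shows the loop at a glance; the text version makes you trace names by hand, which is the whole complaint.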
Some people call it live 00:22:56.57 coding, mainly in the art community, and 00:22:58.43 this is when you code with what Dan 00:22:59.63 Ingalls refers to as liveness. It is the 00:23:01.13 opposite of batch processing. Instead, 00:23:02.93 there is a programming environment, 00:23:04.73 and the environment and the program are 00:23:06.28 combined during development. So what does 00:23:07.61 this do for us? Well, there's no compile 00:23:09.52 and run cycle. You're compiling inside 00:23:11.93 your running program, so you no longer 00:23:13.31 have that slow feedback loop. It 00:23:16.37 doesn't start with a blank slate and run 00:23:19.07 to termination. Instead, all of your 00:23:20.57 program state is still there while 00:23:22.01 you're working on it. This means that you 00:23:24.04 can debug. You can add things to it. You 00:23:26.99 can find out what's going on, all while 00:23:29.14 your program is running. 00:23:31.07 Of course, there's runtime introspection, 00:23:33.23 and failures don't halt the 00:23:34.61 program. They give you some kind of 00:23:36.16 option to fix what's happening and continue. 00:23:37.73 This combination of 00:23:39.16 attributes, I would say, is most of what 00:23:41.21 makes spreadsheets so productive. 00:23:43.49 And it gives you these incredibly short 00:23:44.87 feedback loops, of which we'll now have 00:23:46.90 some examples. If you're compiling some 00:23:48.95 code, say, in Common Lisp, you can compile 00:23:50.51 the code and disassemble it and see 00:23:52.37 exactly what you got. Now the program is 00:23:54.23 running. The program is alive right now, 00:23:56.09 and I'm asking questions of that runtime. 00:23:58.49 And I look at this and I say, okay, 00:23:59.81 36 bytes - that's too much - so I'll 00:24:01.66 go through and I'll add some, you 00:24:03.47 know, optimizations to it, recompile, and get 00:24:06.23 16 bytes - that's about as many 00:24:07.43 instructions as I want to spend on this.
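Python can't show you native code the way SBCL's `disassemble` can, but its standard `dis` module supports the same inspect-tweak-recompile loop against the live process. A small sketch of the workflow (counting disassembly lines as a rough stand-in for instruction count):

```python
import dis, io

def f(x):
    return x * 2 + x * 2          # first, naive attempt

def instruction_count(fn):
    """Disassemble a live function and count the lines of output."""
    buf = io.StringIO()
    dis.dis(fn, file=buf)
    return len(buf.getvalue().splitlines())

before = instruction_count(f)

# No separate compile-and-run cycle: rebind the definition inside the
# running process and ask the runtime again.
def f(x):
    return x * 4                  # hand-optimized version

after = instruction_count(f)
assert after < before             # fewer instructions after the rewrite
```

The point is not the arithmetic; it is that the question "what did the compiler actually give me?" gets answered by the running program itself, immediately.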
00:24:10.78 So I know a bunch of you are probably 00:24:12.89 allergic to S-expressions. Here's Julia. 00:24:14.45 You can do exactly the same thing in 00:24:17.51 Julia. Look at this. You get the native 00:24:19.07 code back for the thing that you just 00:24:20.87 made, and you can change it while it's 00:24:22.43 running. 00:24:24.59 Now what about types? This is where half 00:24:24.60 of you storm off in anger. So, 00:24:25.90 I'm going to show you this tweet, and I 00:24:28.13 wouldn't be quite this uncharitable, but 00:24:29.93 I broadly agree with this position. 00:24:31.19 It's a lot of fun. Look, I have 00:24:32.81 been programming for 45 years. I have 00:24:34.01 shipped OCaml. I have shipped Haskell. I 00:24:35.39 love Haskell, actually. I think it's great. 00:24:35.40 But I would say that over those many 00:24:37.66 decades, I have not really seen 00:24:39.83 programs in these languages have any 00:24:42.40 fewer defects than programs in any other 00:24:43.90 programming language that I use, modulo 00:24:46.01 the ones with really bad memory 00:24:48.77 allocation behavior. 00:24:51.28 And there has been considerable 00:24:52.97 empirical study of this question, and 00:24:55.31 there has been no evidence. It really 00:24:56.87 doesn't seem to matter. I was going to do a little literature review here to show that development speed claims for dynamic languages and code quality/maintenance claims for static languages appear to have no empirical evidence, but Dan Luu has already done a great job of that, so I'll just link to his page on the topic. Summary: "[U]nder the specific set of circumstances described in the studies, any effect, if it exists at all, is small. [...] If the strongest statement you can make for your position is that there's no empirical evidence against the position, that's not much of a position." So if you like 00:24:59.14 programming in those languages, that's 00:25:01.90 great! I encourage you to do it!
You 00:25:03.35 should program in whatever you enjoy, but 00:25:05.02 you shouldn't pretend that you have a 00:25:07.01 moral high ground because you've chosen 00:25:08.75 this particular language. And I would say, 00:25:10.31 really, that if what you care about is 00:25:12.16 systems that are highly fault tolerant, 00:25:14.45 you should be using something like 00:25:16.37 Erlang over something like Haskell, 00:25:18.47 because the facilities Erlang provides 00:25:19.78 are more likely to give you working 00:25:21.04 programs. Imagine that you were about to take a transatlantic flight. If some engineers from the company that built the aircraft told you that they had not tested the engines, but had proven them correct by construction, would you board the plane? I most certainly would not. Real engineering involves testing the components of a system and using them within their tolerances, along with backup systems in case of failure. Erlang's supervision trees resemble what we would do for critical systems in other engineering disciplines. 00:25:22.85 You can throw rotten fruit 00:25:24.47 at me later. You can find me in the 00:25:26.45 hallway track to tell me how wrong I am. 00:25:28.43 So, I've said that, but I'll also 00:25:30.23 show you probably the most beautiful 00:25:32.51 piece of [code] that I've ever seen. 00:25:33.64 Like, the best source code in the world. 00:25:35.33 And that's McIlroy's Power Serious, which 00:25:37.19 happens to be written in Haskell. So, this 00:25:38.75 is a mutually recursive definition of 00:25:38.76 the power series of sine and cosine in two 00:25:40.13 lines of code. I want to cry when I look 00:25:42.11 at this because of how beautiful it is. 00:25:43.25 But that has nothing to do with software 00:25:44.63 engineering. Do you understand what I'm 00:25:47.33 saying? That's a different question. The 00:25:48.89 beauty of the language is not always 00:25:50.87 what gets you to where you need to go.
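For the curious: McIlroy's two lines are, in essence, `sins = int coss` and `coss = 1 - int sins`, where `int` integrates a power series termwise. Here is a rough transliteration into Python generators - nowhere near as beautiful as the Haskell, which is rather the point, but it shows the same lazy, mutually recursive definition:

```python
from fractions import Fraction
from itertools import count, islice

def integral(series):
    """Termwise integration of a power series, given as a generator
    function of coefficients; the constant of integration is 0."""
    def gen():
        yield Fraction(0)
        for n, a in zip(count(1), series()):
            yield a / n
    return gen

# McIlroy:  sins = int coss ;  coss = 1 - int sins
def sins():
    yield from integral(coss)()

def coss():
    it = integral(sins)()
    yield 1 - next(it)            # constant term: 1 - 0
    for a in it:
        yield -a                  # remaining terms of (1 - int sins)

# Coefficients of sin x: 0, 1, 0, -1/6, 0, 1/120, ...
print(list(islice(sins(), 6)))
```

Laziness does all the work: each definition only ever demands coefficients the other has already produced, so the mutual recursion bottoms out term by term.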
00:25:52.13 I will make an exception here for 00:25:53.87 model checkers, because 00:25:55.97 protocols are super hard! It's a good 00:25:57.52 idea to try to verify them. I've used 00:25:59.02 Coq and Teapot [for example] for these kinds of 00:26:00.95 things in the past, and some systems do 00:26:02.14 have such a high cost of failure that it 00:26:03.64 makes sense to use them. If you're 00:26:05.14 doing some kind of, you know, horrible 00:26:07.01 cryptocurrency thing, where you're likely 00:26:08.75 to lose a billion dollars worth of 00:26:11.09 SomethingCoin(tm), then, yeah, you 00:26:14.39 maybe want to use some kind of verifier 00:26:15.83 to make sure you're not going to screw 00:26:17.33 it up. But, that said, space 00:26:18.71 probes written in Lisp and FORTH 00:26:20.21 have been debugged while off world. Had I had more time, I would have done an entire series of slides on FORTH. It's a tiny language that combines interactive development, expressive metaprogramming, and tremendous machine sympathy. I've shipped embedded systems, bootloaders, and other close-to-the-metal software in FORTH. 00:26:22.25 If they had proven their 00:26:23.81 programs correct by construction, 00:26:25.07 shipped them into space, and then found out 00:26:26.63 their spec was wrong, they would have 00:26:28.43 just had some dead junk on Mars. But what 00:26:30.35 these guys had was the ability to fix 00:26:33.47 things while they were running on space 00:26:34.85 probes. (In fact, they did prove their program correct by construction. But there was still human error!) I think that's actually more 00:26:35.93 valuable. Again, throw the rotten fruit 00:26:37.90 later. Meet me in the hallway track. 00:26:40.19 I would say overall that part of 00:26:42.40 this is because programming is actually 00:26:44.75 a design discipline. It - oh, we're losing 00:26:46.19 somebody - somebody's leaving now, probably 00:26:47.51 out of anger about static types.
This was an improvised joke about someone leaving to eat lunch or use the bathroom or something. I've since heard that that person felt embarrassed and called out by the joke, so I'd like to leave an apology here. It was meant to be funny in context! 00:26:48.95 As a design discipline, you find that you 00:26:50.93 will figure out what you're building as 00:26:52.43 you build it. You don't actually 00:26:54.83 know when you start, even if you think 00:26:57.04 you do, so it's important that we build 00:26:58.90 buggy approximations on the way, and I 00:27:00.89 think it's not the best use of your time 00:27:02.93 to prove theorems about code that you're 00:27:04.31 going to throw away anyway. In addition, 00:27:07.43 the spec is always wrong! It doesn't 00:27:08.57 matter where you got it, or who said it, 00:27:12.76 the only complete spec for any 00:27:14.87 non-trivial system is the source code of 00:27:16.25 the system itself. We learn through 00:27:18.28 iteration, and when the spec's right, it's 00:27:20.02 still wrong! Because the software will 00:27:21.71 change tomorrow. All software is 00:27:24.23 continuous change. The spec today is not 00:27:25.73 the spec tomorrow. Which leads me to 00:27:27.23 say that, overall, debuggability is in my 00:27:29.26 opinion more important than correctness 00:27:32.14 by construction. So let's talk about 00:27:33.76 debugging! 00:27:35.51 I would say that actually most 00:27:37.25 programming is debugging. What do we 00:27:38.63 spend our time doing these 00:27:41.02 days? Well, we're spending a lot of time 00:27:43.25 with other people's libraries. We're 00:27:44.87 dealing with API endpoints. We're dealing 00:27:46.97 with huge legacy code bases, and we're 00:27:48.71 spending all our time like this robot 00:27:50.93 detective, trying to find out what's 00:27:52.37 actually happening in the code.
And we do 00:27:54.16 that with exploratory programming, 00:27:54.17 because it reduces the amount of 00:27:55.66 suffering involved. So, for example, in a 00:27:57.64 dead coding language, I will have to run 00:27:58.90 a separate debugger, load in the program, 00:28:00.35 run it, set a breakpoint, and get it 00:28:01.31 here. Now, if I've had a fault in 00:28:02.75 production, this is not actually so 00:28:04.49 helpful to me. Maybe I have a core dump, 00:28:06.35 and the core dump has some information 00:28:07.73 that I could use, but it doesn't show me 00:28:08.99 the state of things while it's running. 00:28:11.21 Now here's some Common Lisp. Look, I set 00:28:12.83 this variable. Look, I inspect this 00:28:14.51 variable; on the bottom I see the value 00:28:16.49 of the variable. This is valuable to me. 00:28:18.16 I like this, and here we 00:28:20.33 have a way to look at a whole set of 00:28:22.49 nested data structures graphically. We 00:28:24.04 can actually see things - note in 00:28:25.85 particular the complex double float at 00:28:27.28 the bottom that shows you a geometric 00:28:28.37 interpretation. This object inspector is called Clouseau. You can see a video about it here. 00:28:30.23 This is amazing! This is also 1980s 00:28:31.49 technology. You should be ashamed if 00:28:33.89 you're using a programming language that 00:28:35.57 doesn't give you this at run time. 00:28:37.31 Speaking of programming languages that 00:28:39.64 do give you this at runtime, here is a 00:28:43.07 modern version in Clojure. Here's somebody 00:28:45.95 doing a Datalog query and getting back 00:28:47.93 some information and graphing it as they 00:28:49.31 go.
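In lieu of the screenshot, here is a toy Python sketch of the idea behind per-type presentations in an inspector like Clouseau: a complex number gets its geometric (polar) reading alongside its printed form. The `present` helper is hypothetical, purely for illustration, not anyone's actual API:

```python
import cmath
from pprint import pformat

def present(value):
    """A toy presentation rule in the spirit of an object inspector:
    complex numbers also get their geometric (polar) interpretation."""
    if isinstance(value, complex):
        r, theta = cmath.polar(value)
        return f"{value} (magnitude {r:.3f}, angle {theta:.3f} rad)"
    return pformat(value)

nested = {"samples": [1 + 1j, {"label": "origin", "point": 0j}]}
print(present(nested["samples"][0]))
# (1+1j) (magnitude 1.414, angle 0.785 rad)
```

A real inspector dispatches presentations like this per type, recursively, over a live object graph; that is what makes "see the thing, not its print syntax" possible.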
I will say that Clojure is slightly 00:28:51.64 less good at this than Common Lisp, at 00:28:53.02 present, in part because the Common Lisp 00:28:53.03 Object System (CLOS) makes it particularly easy 00:28:54.16 to have good presentations for different 00:28:56.51 kinds of things, but at least it's in the 00:28:58.19 right direction. 00:28:59.14 As we talk about this, one of the 00:29:00.52 things in these kinds of programming 00:29:02.33 languages, like Lisp, is that you have an 00:29:04.31 editor and you're evaluating forms - all the 00:29:06.16 Clojure programmers here are going to 00:29:08.14 know this right off - 00:29:10.19 you're evaluating forms and they're 00:29:12.04 being added to the runtime as you go. And 00:29:13.43 this is great. It's a fantastic way to 00:29:14.69 build up a program, but there's a real 00:29:16.49 problem with it, which is that if you 00:29:18.35 delete some of that code, the thing 00:29:19.90 that you just evaluated earlier is still 00:29:20.93 in the runtime. So it would be great if 00:29:23.14 there were a way that we could know what 00:29:24.71 is current rather than having, say, a text 00:29:26.81 file that grows gradually out of sync 00:29:28.49 with the running system. And that's 00:29:29.69 called Smalltalk, and it has been around 00:29:30.88 since at least the 70s. So this is the 00:29:32.14 Smalltalk object browser. We're 00:29:34.85 looking at Dijkstra's algorithm, 00:29:36.47 specifically we're looking at 00:29:37.78 backtracking in the shortest path 00:29:39.11 algorithm, and if I change this, I know I 00:29:41.26 changed it. I know what's happening. If I 00:29:43.07 delete this method, the method is gone. 00:29:44.63 It's no longer visible. So there is a 00:29:45.95 direct correspondence between what I'm 00:29:48.40 doing, what the system knows, and 00:29:50.57 what I'm seeing in front of me, and 00:29:52.01 this is very powerful. And here we have 00:29:53.87 the Glamorous toolkit.
This is 00:29:56.38 Tudor Girba and feenk's thing. They embrace this 00:29:57.83 philosophy completely. They have built an 00:29:58.90 enormous suite of visualizations that 00:29:59.99 allow you to find out things about your 00:30:01.85 program while it's running. We should all 00:30:04.85 take inspiration from this. This is an 00:30:06.35 ancient tradition, and they have kind of 00:30:07.90 taken this old thing of Smalltalkers 00:30:10.37 and Lispers building their own tools as 00:30:12.40 they go to understand their own codebases, 00:30:13.97 and they have sort of pushed it - 00:30:15.16 they've pushed the pedal all the way to 00:30:17.63 the floor, and they're rushing forward 00:30:19.90 into the future and we should follow 00:30:21.76 them. 00:30:23.81 Another thing that is very useful in 00:30:25.19 these situations is error handling. If 00:30:26.81 your error handling is 'the program stops', 00:30:29.02 then it's pretty hard to recover. 00:30:31.07 But in a Common Lisp program like this - 00:30:33.04 this is an incredibly stupid toy example - 00:30:34.85 but I have a version function. I have not 00:30:36.04 actually evaluated the function yet. I'm 00:30:37.54 going to try to call it. So, what's going 00:30:39.28 to happen, well, the CL people here know 00:30:40.73 what's going to happen, it's going to pop 00:30:41.81 up the condition handler. So this is 00:30:43.07 something that - programming in Clojure - 00:30:43.08 I actually really miss from Common Lisp. 00:30:43.90 It comes up, and I have options here. I 00:30:45.88 can type in the value of a specific 00:30:47.51 function, say 'hey call this one instead' 00:30:49.13 for the missing function. I can try again, 00:30:50.99 which - if I don't change anything - will 00:30:52.90 just give me the same condition handler. 00:30:54.64 Or, I can change the state of the running 00:30:57.04 image and then try again. 
So, for example, 00:30:58.54 if I go down and evaluate the function 00:31:00.16 so that it's now defined and hit retry, 00:31:01.90 it just works. This is pretty amazing. We 00:31:02.81 should all expect this from our 00:31:05.09 programming environments. Again, when I 00:31:06.23 talk about Smalltalk and Lisp, people say 00:31:07.61 'well, I don't want to use Smalltalk or Lisp'. I'm 00:31:09.64 not telling you to use Smalltalk or 00:31:11.45 Lisp. I'm telling you that you should have 00:31:13.01 programming languages that are at least 00:31:14.99 as good as Smalltalk and Lisp. 00:31:16.37 Some people, when I show them all this 00:31:17.81 stuff - all this interactive stuff, they're, 00:31:19.61 like, 'Well, what if I just had a real fast 00:31:21.71 compiler, man? You know I can just 00:31:23.45 just change and hit a key and then the 00:31:25.19 things that -' Well, we're back to that 00:31:28.01 again, because if you have a fast 00:31:29.81 compiler you still have all the problems 00:31:31.97 with the blank slate/run-to-termination 00:31:33.64 style. Data science workloads 00:31:35.09 take a long time to initialize. You might 00:31:36.23 have a big data load and you don't want 00:31:37.43 to have to do that every single time you 00:31:38.81 make a change to your code. And the data 00:31:41.21 science people know this! This is why R 00:31:42.76 is interactive. This is why we have 00:31:43.97 notebooks for Python and other languages, 00:31:45.71 because they know it's crazy to work 00:31:48.23 this other way. Also, GUI State - oh my word! 00:31:49.54 It can be incredibly tedious to click 00:31:50.87 your way back down to some sub-sub-menu 00:31:53.38 so that you can get to the part where 00:31:55.13 the problem is. You want to just keep it 00:31:56.57 right where it is and go in and see 00:31:58.78 what's happening behind the scenes, and 00:32:00.35 fix it while it's running. 
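The fix-and-continue restart shown above can be imitated, coarsely, in any language with exceptions. This Python sketch only retries from the top rather than resuming the interrupted computation in place the way Common Lisp restarts do, but it makes the key property visible: the program state survives the failure (`frobnicate` is a made-up name for the missing definition):

```python
state = {"progress": 7}      # stand-in for expensive state: a big data
                             # load, GUI navigation, a game position...
env = {}                     # our tiny "image" of definitions

def run():
    # frobnicate is not defined yet, so the first call fails.
    return env["frobnicate"](state["progress"])

while True:
    try:
        result = run()
        break
    except KeyError:
        # The "condition handler" fires: supply the missing
        # definition in the live environment and try again.
        env["frobnicate"] = lambda n: n * 6

print(result, state["progress"])   # 42 7: nothing was lost along the way
```

Contrast this with the batch alternative: crash, edit, restart from a blank slate, and rebuild all of that state before you can even reproduce the problem.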
Someone came up to me after the talk and described a situation where he was working on a big, fancy commercial video game. He had to play the same section of the game for 30 minutes to get back to where the error occurred each time. 00:32:01.78 Also, you should be able to attach to 00:32:03.35 long-running servers and debug them 00:32:05.45 while they're in production. This is 00:32:07.13 actually good! It's scary to people who 00:32:08.75 are easily frightened, but it is very 00:32:10.78 powerful. 00:32:13.19 I'll say after all of this about 00:32:16.37 interactive programming, about escaping 00:32:18.40 batch mode, that almost all programming 00:32:20.21 today is still batch mode. And how do we 00:32:21.52 feel about that? I kind of feel like Licklider 00:32:23.09 did. Licklider funded 00:32:24.88 almost all of the work that created the 00:32:26.26 world we live in today, and Engelbart 00:32:27.95 built half of it, and one of the things 00:32:29.69 that Licklider said that I found - I 00:32:31.07 just love the phrase - is 'getting into 00:32:33.23 position to think'. That is, all of the 00:32:34.97 ceremony that you have to go through to 00:32:36.59 get ready to do your work should go away, 00:32:36.60 and that was their whole mission in the 00:32:38.99 60s. 00:32:40.73 We almost got there, but then we have 00:32:42.40 languages like C++. 00:32:43.78 I could say a lot of mean things 00:32:46.31 about C++, but I used to work at the 00:32:47.57 same facility that Bjarne did, and I kind 00:32:49.66 of know him a little bit, so I'm not 00:32:51.23 going to do that. Instead, 00:32:52.97 I'm just going to quote Ken Thompson 00:32:56.99 This is a really funny situation, 00:32:59.21 because I worked [using] some of the early C++ 00:33:01.31 compilers because I was 00:33:03.28 excited about the idea of having decent 00:33:05.69 abstractions in a low-level language 00:33:08.14 that I could use [at work]. 
But I will say that it 00:33:08.15 was never great, and that it has gotten 00:33:09.40 worse over time, paradoxically, by adding 00:33:11.51 good features to the language. But 00:33:14.26 if you keep adding every feature that 00:33:16.13 you possibly want, you end up 00:33:17.69 with a language that is not in any way 00:33:19.66 principled. There is no way to reason 00:33:20.99 about it. It has too much junk in it. And 00:33:22.97 if you'd like to see this happening in 00:33:26.45 real time to another language, I 00:33:31.43 recommend that you read what's going on 00:33:33.40 in TC39 with JavaScript, where they are 00:33:34.85 adding every possible feature and 00:33:36.52 muddying an already difficult language 00:33:38.81 further. In all fairness, TC39 is in a terrible position. They can't remove features from the language because there's such a large corpus already in the world. At the same time, the language has a bunch of ergonomic problems that they want to fix. I wish they had frozen a primitive version of JS and added a marker at the beginning of scripts to switch out which language is used, much in the way #lang does in Racket. 00:33:40.07 So, what about Go? Well, I admire the 00:33:42.76 runtime and the goroutines, the garbage 00:33:44.81 collector, but it's really another punch 00:33:47.21 card compatible compile/run language. It 00:33:49.25 also shares with C++ 00:33:50.87 the problem that it's not a great 00:33:52.19 library language, because if you want to 00:33:53.45 write a library in Go and then use it 00:33:54.88 from, say, a C program, or whatever, you 00:33:57.28 have to bring in the entire Go runtime, 00:33:58.54 which is a couple [of megabytes] - not what I 00:34:00.04 want, mostly. So what about Rust? Well, I mean, it's 00:34:01.61 a nice thing that Rust is a good library 00:34:03.59 language. I like that about it. But it's 00:34:04.90 also a huge missed opportunity in terms 00:34:06.35 of interactive programming.
They just 00:34:06.36 went straight for the punch cards again. 00:34:07.61 And it's a super, super complicated 00:34:10.01 language, so it would be nice, when 00:34:11.51 trying to figure out which of the 40 00:34:12.95 different memory allocation keywords 00:34:15.29 you're going to use to tell it how to do 00:34:16.60 its thing, if you could explore that 00:34:17.99 interactively instead of going through a 00:34:19.55 compile/test cycle. And another way that 00:34:21.40 I feel about it - I have to quote Deech 00:34:23.38 here - is that, you know, some people 00:34:25.07 hate stop-the-world GC; I really hate 00:34:27.23 stop-the-world type checkers. If 00:34:29.57 it's going to take me an hour to compile 00:34:30.95 my thing, I just want to give up. I'm 00:34:32.81 going to become a carpenter or something. 00:34:35.03 In this family of languages, 00:34:36.34 I'll say that Zig is more to my taste. I 00:34:37.60 actually like Zig more than I like 00:34:40.12 Rust. This will anger all of the 00:34:41.99 Rustaceans. I apologize, but it is true. 00:34:43.31 But, Zig people - for goodness sake - why is 00:34:45.10 there no interactive story there either? 00:34:46.66 You've got this nice little language 00:34:47.93 that has multi-stage compilation. It can 00:34:49.60 learn a lot from Lisp, and it just sort 00:34:52.43 of ignores all that and goes straight 00:34:53.57 to the 1970s or before. 00:34:55.43 So what do future directions that don't 00:34:57.34 suck look like? Well, I'll give you some 00:34:58.73 examples that try to use some 00:34:59.75 of the things I've talked about as 00:35:01.55 underexplored areas. So, this 00:35:02.75 is a structure editor for Racket, which 00:35:04.84 is a dialect of Scheme, and it was built 00:35:07.01 by a fellow called Andrew Blinn, and it's 00:35:08.51 still Racket underneath.
That is, it's 00:35:10.01 still a lot of parentheses - it's still 00:35:11.99 S-expressions - but when you're editing it, 00:35:14.56 you have this completely different 00:35:15.89 feeling where you're modifying this 00:35:17.39 living structure, and it's quite colorful 00:35:19.13 and beautiful - probably for some of you 00:35:20.56 garish - but I like it. 00:35:22.13 And I recommend having a peek at how 00:35:25.67 that works, and comparing it to how you're 00:35:27.53 editing code now. Another example that I 00:35:29.27 think will be more accessible to this 00:35:32.21 audience is this one from Leif Andersen. 00:35:33.17 This is also Racket, and this is doing a 00:35:35.27 define using pattern matching for a red-black 00:35:37.25 tree balancing algorithm. And it is 00:35:39.47 an ancient practice of many years to 00:35:42.29 document gnarly code like this with a 00:35:43.67 comment block over it, but you have a 00:35:46.13 couple of problems: (1) the comment block 00:35:47.51 is ugly and not completely obvious in 00:35:49.01 its meaning; but 00:35:50.45 also (2) it can grow out of sync with the 00:35:52.43 code itself. So Leif has made this fine 00:35:53.81 thing that reads the code and produces 00:35:55.37 these diagrams, and you can switch the 00:35:57.82 diagram view on or off. So this is 00:35:59.45 what I mean - if we want to talk about 00:36:01.43 self-documenting code, I would say 00:36:02.56 something like this, that can actually 00:36:04.55 show you what the code does, is better 00:36:07.49 than what most things do. 00:36:09.10 In the same vein, we've got this piece. 00:36:11.27 This is called Data Rabbit. Data 00:36:13.84 Rabbit is a crazy data visualization 00:36:15.71 thing written in Clojure.
Each one of 00:36:17.08 these little blocks that are connected 00:36:18.29 by these tubes is actually a little 00:36:20.51 piece of Clojure code, and they can do 00:36:22.67 data visualization, they can do 00:36:24.53 refinement, they can do all of these nice 00:36:26.21 things. I'm not a huge, you know, box 00:36:28.19 and arrow programming language guy, but I 00:36:29.63 think that Ryan has done great work here 00:36:32.27 and that everybody should take a look at 00:36:33.13 it. 00:36:34.91 There's also Clerk. I'm a bit biased 00:36:36.05 here. This is something I work on. This is 00:36:38.03 something I've been working on for the 00:36:39.95 last year with the team at Nextjournal, 00:36:42.17 but I think it is actually very good, so 00:36:44.63 I'm going to tell you a little something 00:36:46.67 about it. 00:36:48.17 This is what it looks like 00:36:49.43 when you're working with Clerk. You've got 00:36:51.34 whatever editor you want on one side, and 00:36:52.97 then you've got a view onto the contents 00:36:54.17 of the namespace you're working on off 00:36:55.49 to the side. This has some special 00:36:57.53 properties. It means, for one thing, that 00:36:59.08 you can put these notebooks into version 00:37:01.19 control. You can ship these notebooks. 00:37:02.51 These can be libraries that you use. You 00:37:02.52 don't have this separation between your 00:37:03.47 notebook code and your production code. 00:37:05.45 They can be the same thing, and it 00:37:06.82 encourages a kind of literate 00:37:07.84 programming approach where every comment 00:37:09.17 along the way - or every comment block 00:37:11.03 along the way - is interpreted as Markdown, 00:37:11.93 with LaTeX and other features. 00:37:13.01 It's a very nice way to work. I 00:37:14.99 encourage the Clojure people here to 00:37:16.06 check it out. It is of no use to you if 00:37:18.10 you're not a Clojure person, because it's 00:37:20.15 very Clojure-specific.
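One hazard that Clerk-style tools guard against is the stale-definition drift described earlier: forms you evaluated and later deleted from the file live on in the runtime. A toy Python illustration of both the problem and the kind of check that catches it (this is an illustration only, not how Clerk or a Smalltalk browser is actually implemented):

```python
import ast

source_v1 = "def helper(): return 41\ndef main(): return helper() + 1\n"
runtime = {}
exec(source_v1, runtime)          # "evaluate the forms" into the live image

# Later we delete helper from the file and re-evaluate the buffer...
source_v2 = "def main(): return helper() + 1\n"
exec(source_v2, runtime)
assert runtime["main"]() == 42    # still works: helper lingers in the image

def stale(source, runtime):
    """Names alive in the runtime that the current source no longer defines."""
    defined = {node.name for node in ast.walk(ast.parse(source))
               if isinstance(node, (ast.FunctionDef, ast.ClassDef))}
    return {name for name, val in runtime.items()
            if callable(val) and name not in defined}

print(stale(source_v2, runtime))  # {'helper'}
```

Save that file, restart, and `main` breaks; a tool that reports the stale name before you save is exactly the text-versus-image reconciliation being argued for here.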
And I'll show you 00:37:21.89 a couple of other screenshots here, like 00:37:23.56 this one, where we're doing some data science and 00:37:26.08 you've got - that's my emacs on the 00:37:27.89 right hand side, and I'm able to do all 00:37:30.23 of the things, like pretty printing data 00:37:31.97 structures, and inspecting them, and then 00:37:34.19 sending things over and seeing them in 00:37:35.63 Clerk. It is a very cozy way to work. 00:37:37.67 There's also, for instance, this example 00:37:38.81 where in around six lines of code I do a 00:37:40.37 query for some bioinformatic information 00:37:42.29 that shows me 00:37:43.91 what drugs affect what genes that are 00:37:45.34 known to be correlated with what 00:37:47.39 diseases, so we can see what drugs 00:37:48.77 might be interesting targets for genetic 00:37:50.08 disorders of differing types. Twenty 00:37:51.29 years ago, if you had told people 00:37:53.56 they'd be able to do a single query like 00:37:55.73 this and find these kinds of things out, 00:37:57.10 they would have looked at you like you had 00:37:58.79 two heads, but here it is, and it's almost no 00:38:00.34 code at all. Or this, which is a port of 00:38:02.15 Sussman's Structure and Interpretation 00:38:03.71 of Classical Mechanics library into 00:38:05.27 Clojure that you can use inside of Clerk. This is very nice work by Sam Ritchie. In addition to porting the libraries, he's working on an open edition of Sussman's textbooks using Clojure. 00:38:07.73 And then [you can] do things with physics - 00:38:09.95 real things. This is emulating a chaotic 00:38:11.87 system, and you can actually - you can't 00:38:13.43 see it here - but you can actually grab 00:38:15.77 sliders and move them around and change 00:38:17.75 the state of the system in real time. 00:38:18.82 It'll show you what's happening. 00:38:20.15 Or this. Martin here in the front row 00:38:23.56 wrote this.
This is an example of Rule 00:38:25.49 30, which is a cellular automaton, and he's 00:38:26.56 written a viewer for it, so instead of 00:38:27.89 looking at 1s and 0s, you can 00:38:28.91 actually see the thing he's working on. 00:38:29.87 And the amount of code this takes is 00:38:31.73 almost none. 00:38:34.55 This is a regular expression dictionary 00:38:36.10 that I wrote. One of the 00:38:38.87 nice things about Clerk is you have all 00:38:40.67 the groovy visualization [and] interactive 00:38:42.17 things that come from having a browser, 00:38:43.91 but you also have all the power of 00:38:45.41 Clojure running on the JVM on the other 00:38:46.60 side. So you can do things like talk to a 00:38:47.93 database on the file system, which is a 00:38:49.19 revelation compared to what you can 00:38:51.34 normally do with a browser. 00:38:53.15 With this kind of thing you 00:38:55.13 can do rapid application development. You 00:38:57.17 can do all kinds of things, and I will 00:38:58.60 add that Clerk actually improves on the 00:38:59.69 execution semantics that you normally 00:39:00.95 get with emacs and Clojure. This is inside 00:39:02.32 baseball for the Clojure people, sorry 00:39:04.49 for everybody else, but that thing I was 00:39:06.23 talking about - about how you can add 00:39:08.08 things to the running image and then 00:39:09.29 delete the code, so they're still in 00:39:11.15 the image but not in your file, and you don't know it, and maybe 00:39:12.65 you save your program and it doesn't work 00:39:14.03 the next time you start it - Clerk will not 00:39:15.34 use things that you've removed from the 00:39:16.84 file. It actually reports that, so you get 00:39:18.65 errors when you have gotten your text 00:39:19.97 out of sync with your running image. 00:39:21.53 Now, obviously, I have a huge Lisp bias. I 00:39:23.51 happen to love Lisp, but it's not just 00:39:25.49 Lisps. There are other people doing good 00:39:27.89 things. This is called Hazel.
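To make the Rule 30 example above concrete: each new cell is computed from its left, center, and right neighbors, and for Rule 30 the update reduces to left XOR (center OR right). Martin's viewer is Clojure/Clerk; this is just a minimal, language-neutral sketch in Python of the automaton itself (the function names and the fixed-zero boundary are assumptions of the sketch, not his code).

```python
# Rule 30: each new cell is left XOR (center OR right).
# This reproduces the automaton's well-known triangle from a single live cell.

def rule30_step(row):
    """Compute one generation; cells outside the row are treated as 0."""
    n = len(row)
    return [
        (row[i - 1] if i > 0 else 0) ^ (row[i] | (row[i + 1] if i < n - 1 else 0))
        for i in range(n)
    ]

def rule30(width=11, steps=5):
    """Run the automaton from a single live cell in the middle of the row."""
    row = [0] * width
    row[width // 2] = 1
    generations = [row]
    for _ in range(steps):
        row = rule30_step(row)
        generations.append(row)
    return generations
```

A viewer like the one described above then only has to draw each generation as a row of filled and empty squares instead of printing the 1s and 0s.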
This is 00:39:29.99 from Cyrus Omar's team. You see those 00:39:31.25 little question marks after the function 00:39:32.87 there? This is an OCaml or Elm-like 00:39:34.01 language, and they do something called 00:39:35.81 typed holes, where they're actually 00:39:37.25 running their type 00:39:38.75 inference interactively and using it for what is, in my 00:39:40.06 opinion, its strongest purpose, which is 00:39:41.27 improving the user interface. So here, when 00:39:42.34 you go to put something into one of 00:39:44.75 these typed holes, it knows what type 00:39:46.67 it's going to be, and it's going to give 00:39:48.71 you hints, and it's going to help you do 00:39:50.39 it, and they've taken that to build this 00:39:52.25 nice student interface. If you're 00:39:54.71 going to teach students through design 00:39:55.73 recipes that involve type-based thinking, 00:39:56.87 then you should have a thing like this 00:39:58.31 that actually helps them in some way, and 00:40:00.41 the one they've made is very good. I 00:40:01.79 recommend reading the papers. [Cyrus] has 00:40:03.34 a student called David Moon who has made 00:40:04.84 this. This is called Tylr. I can't really 00:40:06.41 show you this in a good way without 00:40:07.97 [many videos]. So I recommend that you go to 00:40:10.19 David Moon's Twitter and scroll 00:40:11.27 through and look at some of these 00:40:13.49 things. It's got a beautiful 00:40:15.23 structure editing component that 00:40:16.43 prevents you from screwing up your code 00:40:17.51 syntactically while you're working on it, 00:40:18.71 and gives you advice based on type 00:40:20.39 information. 00:40:22.13 Here is my absolute favorite 00:40:23.56 from Cyrus's group. This is also by 00:40:25.37 David Moon, who did the structure editor, 00:40:26.81 and Andrew Blinn, who did the nice editor 00:40:28.73 for Scheme that we saw at the beginning 00:40:30.17 of this section.
Here we have, again, an 00:40:32.27 OCaml or Elm-like language, but you can 00:40:34.13 put these little widgets in. 00:40:36.65 These are called livelits, with the 00:40:37.79 syntactical affordance here [that] they 00:40:39.71 begin with a dollar sign. 00:40:41.51 He's got some data here, and the data 00:40:42.58 is shown as a data frame. It's 00:40:43.97 actually a convenient, nice-to-edit thing, 00:40:45.41 and it's in-line with the source code. 00:40:47.56 This is a thing where you can have 00:40:49.13 more expressive source code by 00:40:50.99 overlaying different views onto the 00:40:51.00 source code. You can also see there's a 00:40:52.49 slider in there, and the slider is [live]. 00:40:54.23 [It] immediately computes. The rest of the 00:40:56.08 values are immediately recomputed when 00:40:57.41 the slider slides, in a data-flow kind of 00:40:59.03 way. This is a great project. I hope they 00:41:00.23 do more of it. Here's something a little 00:41:02.75 crazier. This is Enso. Enso is groovy 00:41:05.03 because it is a functional programming 00:41:07.06 language that has two representations. It 00:41:09.82 is projectional, so it is not just this 00:41:11.08 kind of lines-between-boxes thing. 00:41:13.55 It's lines between boxes, and then you 00:41:15.10 can flip it over and see the code that 00:41:18.23 corresponds to those things. You can edit 00:41:19.79 either side and it keeps both in sync. 00:41:21.34 And now we'll go on to our last example 00:41:22.97 from this section, which is also the 00:41:24.65 craziest one. And that is Hest, by Ivan Reese. 00:41:27.10 Here we're computing factorial, 00:41:28.37 but we're doing it with animation, so we 00:41:29.69 see these values flowing through the 00:41:31.60 system in this way and splitting based 00:41:33.10 on criteria that are 00:41:34.60 specified in the code, and we're working 00:41:36.53 up to a higher and higher factorial now.
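The slider behavior described above - change one value and everything downstream recomputes - is the essence of data-flow evaluation, and it can be sketched in a few lines. This is a hypothetical Python illustration, not Hazel's implementation; the `Cell` class and its naive push-based propagation are assumptions of the sketch.

```python
# Minimal data-flow sketch: cells hold either a plain value or a formula over
# other cells; changing an input pushes recomputation through its dependents.

class Cell:
    def __init__(self, value=None, formula=None, inputs=()):
        self.formula, self.inputs, self.dependents = formula, list(inputs), []
        for cell in self.inputs:
            cell.dependents.append(self)
        self.value = value if formula is None else formula(*[c.value for c in self.inputs])

    def set(self, value):
        """Change an input cell (like moving the slider) and propagate."""
        self.value = value
        for d in self.dependents:
            d.recompute()

    def recompute(self):
        self.value = self.formula(*[c.value for c in self.inputs])
        for d in self.dependents:
            d.recompute()

# A 'slider' input and two values derived from it.
slider = Cell(value=3)
doubled = Cell(formula=lambda x: 2 * x, inputs=[slider])
total = Cell(formula=lambda x, y: x + y, inputs=[slider, doubled])

slider.set(10)  # moving the slider recomputes everything downstream
```

This naive push scheme can recompute a shared dependent more than once per change; a real system would topologically sort the graph first, but the user-visible behavior - slide, and the rest of the values follow - is the same.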
00:41:38.45 I look at this, and I don't say 'yeah, 00:41:41.45 that's how I want to program; I 00:41:42.89 want to spend every day in this thing', 00:41:44.81 but what I've learned - if nothing else - 00:41:47.15 over the very long career that I've had, 00:41:49.67 is that if you see something that looks 00:41:50.99 completely insane and a little bit like 00:41:52.31 outsider art, you're probably looking at 00:41:53.87 something that has good ideas. So, whether 00:41:55.60 or not we ever want to work like this, we 00:41:58.01 shouldn't ignore it. 00:41:59.21 This was my last example for today. I had to stop because I was already slightly over time, but there are a number of other systems that I would like to have mentioned:
* Ink & Switch has funded a number of pieces of infrastructure that could be building blocks for new environments. They've also funded a team that has produced one of the more interesting tablet interface experiments in recent years, shown in this video by Szymon Kaliski.
* Subtext, by Jonathan Edwards, is full of interesting ideas.
* Natto, by Paul Shen, is another take on node/arrow systems.
* This collection of videos contains interesting ideas for visualizing programs and their execution.
* Tree-edit, a structural editor that combines Treesitter and miniKanren to bring to programming languages with more complicated syntax the same sort of experience that Lispers have long taken for granted.
* Darklang has many features that I find admirable.
* JetBrains MPS provides an environment for building tooling for programming languages.
In this talk, I stayed away from artistic livecoding systems because many programmers can't see themselves in what artists are doing. However, I would be remiss not to show you these systems:
* Andrew Sorensen's Extempore (video).
* Sonic Pi, by Sam Aaron. A Ruby dialect on an Erlang runtime for teaching programming through musical composition (and more).
* Olivia Jack's Hydra, a collaborative environment for livecoding.
* Orca, by Hundred Rabbits. A 2D programming language with a built-in clock for livecoding music. The rest of their projects are also worth your time!
* I livecode most of my own artwork in Clojure and Scheme.
00:42:01.91 I have some thank-yous to do. First, 00:42:04.31 I'd like to thank Alex for inviting me 00:42:07.25 to give this talk. I'd like to thank Nextjournal 00:42:08.69 for sponsoring my work, including 00:42:10.91 the writing of this talk. And I would 00:42:13.25 like to thank all of you for watching! 00:42:15.41 Thank you very much!