From: Robin Sloan
To: the lab
Sent: October 2021

The slab and the permacomputer

Obsidian revetment slab fragment, 1st century A.D., Roman

This is a note intended to lay out something that's lately clicked for me. Here are three glimpses of the future of computing that all seem to "rhyme":

1. Cloud functions

I wrote about my experience with Google Cloud Functions back in the spring; for me, these represent the "perfection" of the AWS/GCP model. Their utility snuck up on me! At this point, I have about a dozen cloud functions running -- or, I guess it's more accurate to say, waiting to run. My little platoon of terra cotta warriors, dozing in their data centers until called upon.

2. Colab notebooks

I'd heard about these forever, but it's only in the past year that I've used them, and they have in that time become indispensable. The fusion of "document" with "program" AND "environment" is frankly dizzying; using Colab feels more futuristic than just about anything else I do in a browser. (I should add that I'm terrible at Python, but part of the appeal is that you can be terrible at Python and still get a lot done in these notebooks.)

You might reply, "surely, Robin, you're just saying that you admire IPython and Jupyter notebooks generally" -- but I'm not, really. The astonishment of the Colab notebook is that it comes with a powerful computer attached, instantly, for free! You can buy a subscription to make that computer even more powerful, which I do, happily.

Of course, I do recognize Colab's lineage, and I'm grateful for all the labor behind IPython and Jupyter. This recent interview with IPython's creator, on the occasion of its 20th anniversary, is a wonderful story of invention.
Fernando Pérez says:

[My mentor] had the patience to let me "productively procrastinate" by building IPython, something that I could somewhat justify as a tool for finishing that dissertation. I regained some much needed confidence, I got attracted to building something, and it turned out to be really important.

An understatement! For me, the magic is in the specific combination of Jupyter's affordances with Google's largesse: you open a new tab, and poof, it's a document with a powerful computer attached.

3. World computers

I can't get behind Web3 in theory or in practice; at the same time, I'll concede that Ethereum's conjuring of a "world computer" is deeply evocative. The Ethereum blockchain is one entity, shared globally, agreed upon by all its participants: that's what makes it useful as a ledger. The Ethereum Virtual Machine, a kind of computer -- simultaneously sophisticated and primitive -- is likewise one logical entity, even if it's distributed in space and time.

As with a lot of things in crypto, the feeling is as much mystical as it is technical. I understand why people get excited when they deploy an Ethereum contract: it feels like you are programming not just a computer, but THE computer. That feeling is technically wrong; it is definitely just a computer; but since when did the technical wrongness of feelings prevent them from being motivating?

---------------------------------------------------------------------

I think these are glimpses of an accelerating reformulation of "computers" -- the individual machines like my laptop, or your phone, or the server whirring in the corner of my office -- into "compute", a seamless slab of digital capability. I like "slab" better than "cloud", both for its sense of a smooth, opaque surface and its suggestion of real mass and weight.
That's the twist, of course: cloud functions and Colab notebooks and Ethereum contracts DO run on "computers", vast armadas of individual machines taking up real physical space, venting real hot air. A responsible user of these systems ought to remember that, but ... only sometimes. Power outlets also conceal gnarly infrastructural realities, real mass and weight, and a person ought to be aware of those, too -- but not, perhaps, every time they plug in the vacuum.

The idea that "computers" might melt into "compute", a utility as unremarkable as electricity or water, isn't new. But I do feel like it's suddenly melting faster!

For me, a more useful analogy than electricity is textile manufacturing, which was, a couple centuries ago, THE high-tech industry; innovations in mechanical weaving were close to the core of the industrial revolution. Today, aside from the weird technical fabrics that are like, bullet-proof and opaque to cosmic rays, textile manufacturing isn't considered high-tech: it's just ... industry, I suppose. Textiles are produced with extreme efficiency in huge, matter-of-fact facilities. Move along! Nothing to see here.

I recently read David Macaulay's book Mill, illustrating the construction and growth of a textile mill in Rhode Island in the early 1800s, and, I've got to tell you: Macaulay's mill looks and feels like a data center. They put data centers near rivers, too!

For me, this raises the analogical question:

Textiles in 1800 : textiles in 2020 :: computers in 2020 : ???

I mean, I am betting the ??? is a slab -- but I don't know exactly what kind, nor do I know how it will be built or operated or accessed.

---------------------------------------------------------------------

The dutifully critical part of me wants to shout: you shouldn't trust these slabs! Their operators, G -- and A -- and M -- and the rest, will surely betray you. The very signature of the corporate internet is the way it slips from your grasp.
The leviathans swim off in pursuit of new markets, and what do they leave you with? Deprecation notices. There are other endings, too: even now, the slabs occasionally flicker offline, and it's not difficult to imagine a seriously hard crash, one that lasts a long time, caused by either an accident or an attack. So much for my terra cotta warriors.

Then again ... internet trunk lines run alongside railroad tracks. Won't the slab operators and their infrastructure still be with us in a hundred years, in SOME form, just as the railroads are today? I would guess yes, probably.

So, I think maybe we -- that's the "we" of people interested in the futures, near and far, of computers -- ought to go in two directions at once.

First, if somebody offers you a seamless slab of compute and says, here, take a bite: sure, go for it. See what you can make. Solve problems for yourself and for others. Explore, invent, play.

At the same time, think further and more pointedly ahead. There's an idea simmering out there, still fringe, coaxed forward by a network of artists and hobbyists: it's called "permacomputing" and it asks the question, what would computers look like if they were really engineered to last, on serious time scales?

You already know the answers! They'd use less power; they'd be hardy against the elements; they'd be repairable -- that's crucial -- and they'd be comprehensible. The whole stack, from the hardware to the boot loader to the OS (if there is one) to the application, would be something that a person could hold in their head.

Plenty of computers were like that, up until the 1980s or so; but permacomputing doesn't mean we have to go backwards. The permacomputers of the future could be totally sophisticated, super fast; they could use all the tricks that engineers and programmers have learned in the decades since the Altair 8800. They would just deploy them toward different ends.
As a concrete-ish example, I think this project from Alexander Mordvintsev is lovely, and totally permacomputing:

Alexander is the discoverer, in 2015, of the "DeepDream" technique, an early -- now iconic -- fountain of AI-generated art. You have surely seen examples: images that boil with strange details; whorls of eyeballs where eyeballs should not whorl.

Earlier this year, Alexander released a stripped-down implementation of DeepDream written in a vintage dialect of C, his code carefully commented. This version runs on a CPU, not a GPU. It does so very slowly. Who cares? It whorls its eyeballs eventually, even on the humblest hardware. You could run Alexander's deepdream.c on a Raspberry Pi. You could probably run it on a smart refrigerator.

The implementation does depend on a single pre-trained model file, produced at (then-)great expense by many computers with very fast GPUs. I find this totally evocative: it's easy to imagine future permacomputers that rely, for some of their functions, on artifacts from a time before permacomputing. It would be impossible, or at least forbiddingly difficult, to produce new model files, so the old ones would be ferried around like precious grimoires ...

(For the record, I already feel this way about some ML model files: whenever I find one that's interesting or useful, I diligently save my own copy.)

Even if it turns out you never need a permacomputer, you'll be glad you thought about them. Powerful forces are pushing computing toward vast, brittle, energy-hungry systems that are incomprehensible even to their own makers; I should know, because I am a small constituent part of these forces. Given such pressure, even a faint countervailing wind is precious.

The sailing/computing duo Hundred Rabbits are pilgrim-poets of permacomputing. Their Uxn project is a clever 8-bit computer design that can be built or emulated in a variety of ways, including on old, recycled hardware.
Of Uxn, they write:

With only 64kb of memory, it will never run Chrome, TensorFlow or a blockchain. It sucks at doing most modern computing, but it's also sort of the point. It's more about finding what new things could be made in such a small system.

Where does this leave us? I'm perfectly comfortable in the both/and. I accept the invitation of the slab; I benefit daily from the leverage it grants me. I am, at the same time, certain my functions and notebooks will be blown away before the decade is out; maybe just by the leviathan's restlessness, or maybe by something more dire.

I'd like a permacomputer of my own.

October 2021, Oakland

I'm Robin Sloan, a fiction writer. You can sign up for my lab newsletter.

This website doesn't collect any information about you -- not even basic analytics. It aspires to the speed and privacy of the printed page. Don't miss the colophon.

Hony soyt qui mal pence