Copyright 1992 by Mary Eisenhart. All rights reserved.

Nanotechnology: Separating Myth From Reality
Preparing For The Future
An Interview With Eric Drexler
By Mary Eisenhart
Photos: Glenn Matsumura

I expect in the ultimate destiny of this that there's going to be an appliance that sits on your desktop that can make anything. I don't expect that to happen within the lifetime of AutoCAD Release 13 or 14. But that's where it's all going. This week we're shipping HyperChem, which lets you design objects at the molecular level. The technology is going to go on scaling down until we have the same level of fine-grained control over the structure of matter that biology has been using for the last billion years or so. And at that point, we're going to be able to make *anything* from a digital model. That's going to change the world, beyond comprehension, I think.
--John Walker, MicroTimes #95

Imagine that many of the key developments of twentieth-century technology were consequences of something that could be figured out in the nineteenth century, but would still take many years to implement. So in the nineteenth century, someone was able, not to understand the whole of the twentieth century, but to understand that it would be possible to develop antibiotics, nuclear weapons, and farm technologies that made it possible to feed the population with only a few percent of the population working in agriculture. Let's say that this was determined at a time when most people were farmers, when infectious disease was a dominant cause of death, and when no one had any concept that it was possible to destroy the planet.

One could imagine a range of scenarios where the ability to have food without people working on the farm produces unimagined wealth and leisure and is portrayed as Utopia. One can imagine the existence of nuclear weapons being portrayed as a sure recipe for our immediate obliteration, and the idea of penicillin eliminating a wide range of infectious diseases as a fantasy--"a pill that cures all these things?" The reality has been much more complex than any of those views, and much of what has happened would have been rejected as a fantasy at the time. Indeed, many correct predictions at various times were rejected in just that manner. I think that's a useful comparison for judging what's being said now, and how it may compare to the eventual reality.
--Eric Drexler

Back in the 1970s, while an undergraduate at MIT, Eric Drexler read about the advancing state of research in molecular biology, where, he says, "researchers kept discovering new molecular machines within organisms and studying how they worked. I began to wonder what kinds of things we'll be able to build when we have tools we don't have yet, tools able to build molecular machines."

Not without a fair amount of struggle for acceptance in academic circles, Drexler ultimately received a Ph.D. in Molecular Nanotechnology from MIT, and is currently a Research Fellow of the Institute for Molecular Manufacturing. He is also the president of the Foresight Institute, a Palo Alto-based nonprofit educational organization founded, as he explains it, "to gather, digest and distribute information on molecular nanotechnology and related advanced technologies, discussing what's happening in the laboratory today, what the pathways are, what the long-term technological capabilities are, and the consequences for policy and decisions that need to be made."
Since Drexler's first book, Engines of Creation (Doubleday), appeared in 1986, nanotechnology has garnered enthusiastic interest (and equally vigorous nay-saying) in the technical and scientific communities. It has also brought on a bizarre assortment of half-baked speculation, snake oil, and pie-in-the-sky schemes. This, Drexler points out, is only natural for discussions of products likely to be decades away. And while the inevitable onslaught of flakes and charlatans is a known road hazard on the technological highways--particularly when sober scientists discuss nanomachines dedicated to killing cancer cells or reversing the aging process, or food production reminiscent of the have-it-your-way gizmos on Star Trek--it need not detract from useful study. "It's very important to distinguish between what is near-term and what is long-term," says Drexler, "and also between what's real and unreal. And to not confuse 'long term' with 'unreal,' which is a classic American disease."

While in the United States there is little funding, public or private, committed to research in the field, the situation in Japan is quite different: MITI announced last year a commitment of $185 million to nanotechnology research, and the Science and Technology Agency is funding research projects aimed at building complex structures, atom by atom. One such project, the Aono Atomcraft Project, has been taking a hands-on approach that parallels theoretical studies by Drexler and Xerox PARC researcher Ralph Merkle, but, says Drexler, "Aono has funding and a laboratory. Ralph and I are both doing theoretical studies--studying what kinds of systems can be built in the long term, and how to get there. Which is different from taking steps in a laboratory along one of these paths. And also much less expensive!" he laughs.

"I don't know of any people who have looked closely at molecular manufacturing, and understand the physical mechanisms involved and the analysis behind it, who are not of the opinion that, yes, this is going to be a dominant manufacturing technology in the twenty-first century. Certainly that it's going to be used to make computers. Quite likely that it's going to be used to make a much broader range of products," says Drexler. And the Japanese, historically more concerned with the long term than the quarterly bottom line, are putting significant resources into nanotechnology. "Rather than seeing twenty years as forever, they see it as a fraction of a human lifetime, as planning for one's own personal future, for that of one's friends and relatives and children."

So, for now, Drexler and his colleagues are engaged in research and information dissemination, taking the small steps along the path to a radically different future. In November, the Foresight Institute will present the First General Conference on Nanotechnology: Development, Applications, and Opportunities in Palo Alto. The three-day conference features discussions ranging from "Intelligent Computation in the Age of Molecular Manufacturing" by MIT's Marvin Minsky to "Designing Molecular Components" by Ted Kaehler of Apple Computer to "Nanotechnology R&D Sponsorship" by Neil Jacobstein of Cimflex Teknowledge.

For the last four and a half years, Drexler has been working on the just-published Nanosystems: Molecular Machinery, Manufacturing, and Computation (Wiley-Interscience), a college-textbook-level work intended to serve as a foundation for the studies of future researchers in the field.
In 1991, he and coauthors Chris Peterson and Gayle Pergamit published Unbounding the Future (Morrow), which explores, at a level accessible to a nontechnical audience, the basic concepts and social implications of nanotechnology. With a view to similar exploration, we recently paid Drexler a visit.

What do you mean by "nanotechnology"?

Since 1986, it's become a buzzword in science and technology, but it has been applied to anything on a nanometer scale, rather than to what I've been terming here "molecular manufacturing."

There are two basically different approaches to making small structures. One is the top-down approach that's been followed in the semiconductor industry, where you use large machines to make small features, and you strive to make smaller and smaller features every year, while still keeping enough control of the shape of those little irregular structures that they do what you want. The payoff from this has been enormous; that path has a considerable distance to go, and the computer capacity that results will contribute greatly to the development of molecular nanotechnology.

But molecular nanotechnology itself moves in the opposite direction. It starts with small, precisely structured molecules, which people have been making for a hundred years or more through organic chemistry. And the challenge, rather than making things smaller, is to make things larger, while keeping that precise control. So molecular nanotechnology is a bottom-up path. Conventional microtechnology, which has recently been relabeled with the sexy term nanotechnology, is a fundamentally different top-down path that I don't think leads to the same destination. I see them as moving together on a size scale, and having some synergy. I can imagine hybrid systems for a time. But then the molecular approach, I believe, will displace the other, because of its advantages of precision, range of control, and cost. Nobody's ever suggested to me how anything else could substitute for it.

Molecular manufacturing, which is at the heart of molecular nanotechnology, will be based on the use of molecular machines that use molecules as building blocks to build complex structures.

Explain "molecular machines."

A conventional machine is a large collection of atoms with parts that move and do useful things. A molecular machine is one that has very few atoms, relatively speaking--thousands or millions or billions are typical numbers--and that has all its atoms in precisely defined locations as an aspect of its design and manufacture. So they're devices that have scales measured in nanometers instead of millimeters, which makes them roughly a factor of a million smaller than conventional mechanical devices.

The things that molecular machines do in biology include taking molecules and putting them together in complex patterns to make other molecular machines. A wide range of products--everything from protein molecules to redwood trees--is made by biological molecular machines. Studying what could be done with artificial molecular machines led to the conclusion that they could be used to build better molecular machines that could be used to build better molecular machines that could be used to make--it's a slight exaggeration to say "anything that's physically possible by arranging atoms," but surely a much, much wider range of physical structures than can be made any other way.

In the 1970s, genetic engineering technology was just getting underway.
This made it possible to synthesize, more conveniently than before, new protein molecules. Proteins serve as components of molecular machines in biology, and that meant that the problem was one of design. We had the tools to make building blocks for molecular machines; the challenge was to design them. So in 1981 I published a paper in the Proceedings of the National Academy of Sciences, arguing that protein design was in fact feasible--many people at the time thought that it was impossible because of the complexity of different ways that a long, floppy molecule could fold up--and then presenting an analysis that indicated that with protein-based molecular machines, you could start this process of building better molecular machines, and eventually have very sophisticated molecular manufacturing systems.

And in fact in 1988, a protein molecule was successfully designed from scratch, by Bill DeGrado at DuPont. A number of groups have done likewise since. Unfortunately, this work has not been done by people with a systems engineering perspective, and so little or no effort has gone into making building blocks for larger systems. It's been done by biologists and chemists, who are chiefly interested in studying natural systems by imitating them, with small differences, and in making devices that do what we see in nature, perhaps under different conditions. For example, proteins have been engineered for greater stability, to enable them to sit in bottles of enzyme detergent on shelves in grocery stores for a long time.

So what do engineers contribute?

Engineers bring an understanding of the principle that a set of building blocks and a way of putting them together can be used to make systems of indefinitely large size and complexity. We see in computers that knowing how to make a few kinds of transistors and connections between them is sufficient to make the most sophisticated supercomputers in the world, or at least the CPUs for them. We see in software that a few instructions in a RISC machine can be used to implement absolutely any computational process. And that understanding, applied to the molecular domain, can open a new world of engineering.

In some ways the development of molecular manufacturing will be like the development of the digital computer. Once upon a time, there were many different special-purpose devices for processing information--adding machines, and analog computers, and so on and so forth. Then there was the development of a general-purpose device for processing information that worked by handling information in what could be seen as its fundamental units--bits--performing repetitive operations in a controlled sequence at high speed, using smaller and smaller devices. In conventional manufacturing today there is a wide range of special-purpose devices--milling machines and lathes and injection molding machines and so forth--but what we can look forward to is the development of a general-purpose device that can make virtually anything by working with matter in terms of its smallest building blocks, atoms and molecules. And again, working at high speeds, using small devices performing simple repetitive operations in a controlled sequence.
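[An illustration of the point about a few instructions sufficing: the toy register machine below is our sketch, not anything Drexler or Walker describes, and not a real RISC instruction set. It has only set, add, subtract, and branch-if-nonzero, yet composes those primitives into multiplication; any computation can be reached the same way.]

```python
# A toy register machine: four primitive instructions composed into
# multiplication by repeated addition. (Editorial illustration only.)
def run(program, regs):
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "set":
            regs[args[0]] = args[1]
        elif op == "add":
            regs[args[0]] += regs[args[1]]
        elif op == "sub":
            regs[args[0]] -= regs[args[1]]
        elif op == "jnz":  # jump to args[1] if register args[0] is nonzero
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return regs

# Multiply a * b with no multiply instruction in the machine.
program = [
    ("set", "acc", 0),
    ("set", "one", 1),
    ("add", "acc", "a"),   # acc += a
    ("sub", "b", "one"),   # b -= 1
    ("jnz", "b", 2),       # loop while b != 0
]
print(run(program, {"a": 6, "b": 7})["acc"])  # 42
```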
What kinds of products and technologies might come out of this?

The easiest products to imagine just offer large performance improvements over ones that we already have. For example, we've seen computers for many years now become smaller and faster and more energy-efficient. With molecular manufacturing, it will be possible to make machines that are somewhat faster--billion-cycle-per-second clock rates--that are smaller than a bacterium and consume so little power that a billion times the capacity of a modern supercomputer could be put in a desktop box and cooled with a fan. The reason that these devices are only somewhat faster than modern electronics is that this particular calculation is based on a simple mechanical scheme, a hybrid between Babbage's Analytical Engine (laughs), modern VLSI architectures, and molecular components. Ralph Merkle at Xerox PARC and I have been looking at electronic schemes for computation that would likewise exploit nanometer-scale components built with molecular precision.

Most of the work that Ralph and I have been doing, though, uses the standard computational modeling tools that chemists have developed to predict the behavior of the molecules they're working with, and applies those tools to describe large structures that are designed to work like macroscopic machine components. Things like robot arms in factories--for an example that's highly relevant to molecular manufacturing--have gears and bearings and a wide range of moving parts. So Ralph and I have designed a set of bearings with hundreds to thousands of atoms--the principles scale up to larger structures--and more recently have designed the first nanometer-scale planetary gear system, a device that has a shaft coming in one side spinning at one speed, and a shaft coming out the other side that spins at a different speed, and that transmits power from one to the other.

What might this gizmo be used for?

This particular device is primarily a demonstration of what can be designed and modeled with available tools. But planetary gears are found, in the macroscopic world, in automobile transmissions, and in nanometer-scale mechanical systems they can serve similar functions.
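[For readers unfamiliar with planetary gears: the speed difference Drexler describes follows ordinary macroscopic gear kinematics, which is his point--the same relations carry over to a nanometer-scale version. A minimal sketch, with tooth counts of our own choosing, not from the actual design:]

```python
# Standard planetary-gear speed relation (ordinary gear theory; the tooth
# counts below are illustrative assumptions, not Drexler and Merkle's design).
def carrier_rpm(sun_rpm: float, n_sun: int, n_ring: int) -> float:
    """Output (carrier) speed when the ring gear is held fixed and the sun gear is driven."""
    return sun_rpm / (1 + n_ring / n_sun)

print(carrier_rpm(1000, n_sun=12, n_ring=36))  # 250.0 -- a 4:1 speed reduction
```

Power in roughly equals power out (minus friction), so the slower shaft turns with proportionally greater torque--the sense in which the device "transmits power from one to the other."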
What separates the designing and modeling, which you can do with current molecular modeling technology, as I understand it, from the ability to actually build the things?

What we've been designing are things that can't be built with anything like today's tools. They can be modeled, in part, because they contain very few atoms and therefore are among the simplest mechanical devices that are possible in the physical world. But to make them will require tools that can position very reactive molecules precisely in three-dimensional space, to build up these structures in more or less a building-block fashion. Those tools don't exist, but we can see how to go about developing them in a step-by-step fashion. It will take many years.

What is the current state of tools? It is now possible for people at IBM to line up a bunch of atoms to spell IBM. But as I understand it, this is done at very, very low temperatures, and when you get to room temperature the atoms go about their normal business...

So this is transitory and expensive and not currently useful. The next major step on the pathway that currently seems most attractive involves using a relative of the scanning tunneling microscope (which was what Don Eigler's group used at IBM Almaden), the atomic force microscope, as a positioning mechanism for reactive molecules--molecules that, when put in place, will form strong, stable bonds to their neighbors. Markus Krummenacker, a researcher supported by the Institute for Molecular Manufacturing, is presently studying the design of molecular building blocks with those properties. And research is getting underway here in the Bay Area aimed directly at building an instrument able to position such building blocks--an adaptation of the atomic force microscope.

The atomic force microscope can act like a robot arm, able to position things in three degrees of freedom--not rotating, but X-Y-Z. What has been lacking is the functional equivalent of a hand at the end of the arm. It's like a robot arm that ends in a rough rock, rather than a selective gripper. A suitably attached protein molecule, in particular an antibody fragment, can serve as a selective gripper for other molecules.

This family of technologies can initially be used as an improved atomic force microscope--rather than bumping a blunt, rough object against something to probe its shape, you can bump a sharp, precisely defined molecule against it, and can in fact probe it with several different tips, one after another. This promises to be a powerful tool for biological studies, and for the pharmaceutical industry, as soon as it can be developed, and without building anything else. So that will, I believe, help to fund the development of the technology base. But that same instrument, by positioning reactive molecules, can build things. And the question is, what?

The first set of applications will be molecular instruments, made by building a complex molecular structure that's comparable in size to a protein molecule, or a set of protein molecules--one, for example, that reads DNA: the DNA molecule binds, and feeds through the structure, and is destroyed one step at a time. If you can probe the DNA molecule as it goes by, so that you are reading out the sequence, you have a molecular DNA reader. We know that molecules can read DNA, because that's how DNA is copied; the challenge is to build a device that reads DNA and, instead of producing a copy, produces, through a suitable interface, a pattern in computer memory. This data could be used to learn about the molecular mechanisms of healthy tissue and of disease processes, and would be an aid in medical studies and pharmaceutical design, as well as in basic biology. If you can easily map all of the genes in a normal cell, then all of the genes in a cancer cell, you're going to more easily learn what the genetic changes were with that particular cancer. And from there, you'll have made at least a step in learning how to deal with it.
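[To make "a pattern in computer memory" concrete: a read-out base sequence can be packed at two bits per base. The sketch below is our illustration of an output format, not a description of the instrument itself:]

```python
# A minimal sketch (editorial illustration, not Drexler's design) of a DNA
# reader's output packed into memory: two bits per base, four bases per byte.
BASE_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(sequence: str) -> bytearray:
    packed = bytearray((len(sequence) + 3) // 4)
    for i, base in enumerate(sequence):
        packed[i // 4] |= BASE_BITS[base] << (2 * (i % 4))
    return packed

print(pack("GATTACA").hex())
# At 2 bits/base, the ~3e9 base pairs of a human genome fit in ~750 megabytes:
print(f"{2 * 3e9 / 8 / 1e6:.0f} MB")
```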
One of the scenarios in Unbounding is that of taking a pill of little nanomachines that attack only your cancer cells. How realistic is that? How do you separate the snake oil from the reality when you're at such an early stage? Some people would have it, for instance, that nanotechnology will enable us to live forever...

Well, first, it's clear that the ability to build sensors and medical instruments at a molecular scale, and to guide them using computers that are themselves smaller than a cell, will open a new world of medical capabilities. It will bring surgical control to a molecular level. What that can mean for treating disease is extremely broad. What it can mean for reversing the molecular and cellular changes of aging is also very broad. But when people speak of immortality, they're speaking of something that can't be achieved within physical law as we understand it, because nothing is permanent and indestructible.

This spring I heard you say that when you first got interested in this, it seemed like a very, very long-term thing, and then relatively recently enough progress has been made to make you think that some of it, at least, is fairly short-term.

Over the years we've seen an improvement in the technology base--the ability to make molecules and the ability to maneuver physical objects with subatomic precision. That's come out of laboratory work in the rest of the world. In my own work over the last several years I have begun focusing on the question of the first steps. My initial studies were of long-term capabilities, which in many ways are simpler than first steps, in much the same way that digital logic is simpler than complex nonlinear transistor circuitry where you're paying attention to all the messiness of what's going on in those circuits.

Studies of those questions, taking advantage of recent developments in the technology base, indicate that there are impressive advances that can be made with something on the order of one year of laboratory effort, involving just a handful of people and relatively inexpensive equipment. And there are a series of steps of increasing expense and difficulty as one builds more complex systems that lead from there to advanced molecular manufacturing of a wide range of products. With payoffs at every step along the way.

It appears that a first generation of scientific instruments based on improved atomic force microscopes can be developed in a time frame and cost range that is standard for venture capital: a few years, and a few million dollars. Those instruments would then be very useful in biomedical research and pharmaceuticals, and would have a market that's substantial compared to that initial investment.

That same technology base, with moderate extensions, can be used to build complex molecular structures by positioning reactive molecules. Development can be rapid, because the time required to do an experiment, find out what's wrong, and try again is short--on the order of a day. And the instruments required to do that are expected to be in the cost range of a high-end workstation. And so it's reasonable to expect that a wide range of moderately complex molecular mechanical devices can be built in a moderate length of time with a moderate development cost. That range of devices includes a yet broader class of scientific instruments--I gave you the example earlier of a DNA reader. That's the most obvious one, and I suspect that as we get closer to that time and look more closely, a wider range of molecular-scale instruments will appear attractive.

The reason for focusing on scientific instruments is that the value of information is independent of the size of the mechanism that produces it. Information about molecules that are found in the human body or that are used in industry in the million-ton-per-year range can be very valuable, even if that information was gained by studying just one molecule. So building a single molecular instrument can yield information that's worth thousands, millions, in some instances billions of dollars.

Pursuing that development path leads to a technology base able to build fairly complex molecular machines. Design analysis indicates that a mechanism with roughly one hundred moving parts is complex enough that, with broadcast acoustic instructions to make it step through a series of states, it could be used much as the AFM mechanism could be used--as a construction mechanism, to build things.
And since it itself would have been built by the AFM mechanism, one of the kinds of things it could build is more devices of the same sort. At that point, you're in a position to make products from chemicals that have some cost that doesn't look ridiculous when expressed in dollars per gram, and is extremely small when expressed in dollars per molecule, since there are many billions of billions of typical molecules in a gram. At a thousand dollars per gram, you're talking about much less than a nanodollar per molecule. (laughs) If you can produce complex structures from those building blocks for that kind of cost per gram, the cost per device is many orders of magnitude smaller than what is possible with semiconductor lithography and the integrated-circuit technology on which computer electronics is based.

At that point it's reasonable to expect that someone will design devices that can serve as RAM storage cells, and will do the engineering work necessary to build RAM systems in which these molecular devices serve as storage cells. A reasonable thing to anticipate is memory devices that use a semiconductor chip as an addressing mechanism, with a slab consisting of many layers of repetitive molecular devices on top of it, such that bits could be read from any of the molecular blocks in the interior of the slab. We're not talking a few atoms, or a few thousand atoms, but perhaps a few million per block. Nonetheless, reasonable storage capacities for a device like that are in excess of 10^15 bits in a chip-size package. Compare that to what's currently available, or planned: there is talk today of billion-bit chips. This would be a million times that capacity. And it's possible to do much, much better than that.

What I described is a sketch of what one might see as an early-generation hybrid system. The direction from there leads to building larger and more complex structures with molecular precision, including CPUs, and denser memory systems that are not limited by addressing through a chip, but instead have all of the required devices implemented on the molecular scale. And at that point the hardware technology base for computers has been replaced, and we're clearly speaking of a many-billion-dollar, world-class industry. In that same time frame, a wide range of other applications can be expected, but I think that example is sufficient to show the scale.
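[The figures above are easy to check. The sketch below assumes a building block of 500 daltons--our assumption; Drexler says only "typical molecules":]

```python
# Back-of-the-envelope check of the figures above. The 500 g/mol building
# block is an assumption; Drexler doesn't specify a molecular weight.
AVOGADRO = 6.022e23          # molecules per mole
molar_mass = 500.0           # g/mol, assumed size of a building block

molecules_per_gram = AVOGADRO / molar_mass          # ~1.2e21 per gram
dollars_per_molecule = 1000.0 / molecules_per_gram  # at $1000/gram
print(f"{dollars_per_molecule:.1e} $/molecule")     # ~8.3e-19: roughly a
                                                    # billion times below a
                                                    # nanodollar (1e-9)
# Proposed molecular memory vs. "billion-bit chips":
print(f"{1e15 / 1e9:.0e}x capacity")                # 1e+06 -> a million times
```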
What time frame do you think that's likely to be? Two years? Five years? Twenty years?

Dates are difficult to estimate. It's much easier to calculate a good approximation to the behavior of a molecule than it is to make even a rough estimate of what a world full of researchers will do over a span of years when they're developing new tools and pursuing diverse goals. But keeping that in mind, it is easy to see how to get a first-generation molecular manipulator with what a number of laboratory researchers have independently told me looks like a few months of work, for a crude prototype that demonstrates the principles. Developing that into a capable system and learning to use it well for molecular construction might take five years. At that point it's plausible that you could make machines with a hundred moving parts.

With machines with a hundred moving parts, you can now build a molecular-scale device--a hundred-nanometer-scale (tenth-micron-scale) device--that is doing molecular construction, and you're in a position to begin a multi-year development effort that leads to making macroscopic quantities of increasingly complex products. To reach the threshold of advanced molecular manufacturing, able to build a very wide range of kinds of materials and structures, will require another layer of advances on top of that. Again with fast, inexpensive experiments at each step along these paths, it's plausible that another five years--with, at the later dates, large resources being poured into what is already a very useful technology--could lead to these kinds of advanced results. If you add up that sequence of steps, it adds up to fifteen years, but I wouldn't place any great confidence on that number.

Where do computers as we currently know them fit in with this, as tools or otherwise?

Microcomputers are used to control scanning tunneling microscopes and atomic force microscopes, to capture the data from them, and to process that data into images. Increasingly in coming years, I would expect to see more powerful computer capabilities being applied to the interpretation of those images. That will be important as molecular nanotechnology is developed, because the cycle of experimentation will involve trying to perform an operation and then probing the product with the point to determine what happened. Software for very rapidly and reliably interpreting those images will play a major role.

The other major application is in molecular CAD. Ralph Merkle has been developing software that can be used together with the chemists' standard modeling packages to help design the structures that the other package models, and at this point he has quite a long wish list of capabilities there that they're beginning to work on.

What is the distinction between designing and modeling?

In molecular design you decide what atoms go where--what pattern of atoms and bonds makes the shape that makes the structure of the component. In modeling you take that design, which is basically just a drawing, and then you turn on physics--or at least an approximation to the physics of real molecules--and as a result things move around, and it all relaxes to a minimum-energy state. In other applications of modeling you can turn on thermal motion, and now the atoms move around subject to forces that approximate the ones they would experience in a real molecule, so you can watch the dynamical behavior of the system.
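[A toy illustration of "turning on physics": relax a handful of atoms toward a minimum-energy state by stepping each one along the force from a Lennard-Jones pair potential. Real molecular mechanics packages use full force fields with bond, angle, torsion, and electrostatic terms; everything here, including the starting geometry, is our own minimal example:]

```python
# A toy "turn on physics" step: steepest-descent relaxation of five atoms
# under a Lennard-Jones pair potential, U(r) = 4*eps*((s/r)**12 - (s/r)**6).
import numpy as np

EPS, SIGMA = 1.0, 1.0  # well depth and length scale (reduced units)

def forces(pos):
    """Force on each atom: -grad U, summed over all pairs."""
    f = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            d = pos[i] - pos[j]
            sr6 = (SIGMA**2 / (d @ d)) ** 3
            coeff = 24 * EPS * (2 * sr6**2 - sr6) / (d @ d)  # >0 repels, <0 attracts
            f[i] += coeff * d
            f[j] -= coeff * d
    return f

def relax(pos, step=1e-3, iters=5000):
    """Slide atoms downhill; a reasonable starting geometry settles near a local minimum."""
    for _ in range(iters):
        pos = pos + step * forces(pos)
    return pos

rng = np.random.default_rng(0)
lattice = np.array([[0, 0, 0], [1.2, 0, 0], [0, 1.2, 0],
                    [0, 0, 1.2], [1.2, 1.2, 0]], dtype=float)
print(relax(lattice + 0.05 * rng.standard_normal(lattice.shape)))
# Pair distances settle toward the LJ minimum at ~1.12 * SIGMA.
```

Turning on thermal motion, as Drexler mentions, amounts to integrating Newton's equations with these forces plus random thermal kicks--molecular dynamics rather than minimization.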
So this is in a sense mechanical engineering on a completely different scale from what we're used to.

This is precisely mechanical engineering, in which something like finite-element approaches come naturally because there are a distinct number of atoms, though it's mathematically rather different from finite-element analysis.

One thing I've picked up over the last few months is that all this molecule-manipulating is happening at very cold temperatures, and once you get to something like room temperature all the molecules go on about their usual business. So how do you deal with things like turning off heat? How do you isolate what you want to do?

That's a set of questions that has different answers in different contexts. The first in a long series of steps will involve having a lot of stray molecules around that you don't want--having all the problems that chemists have with choosing molecules that will react and trying to surround them by others that won't react if they're not supposed to. Today chemists successfully make things by what is the equivalent of throwing parts in a box and shaking. What these AFM-based molecular manipulators will provide is the equivalent of a hand to put parts together. You'll still have other parts shaking around, but you'll add a hand to the process. There are a lot of limits to what you can do with that, but it's a big advance over where we are.

At the end of a long development pathway, what is described in Nanosystems is macroscopic systems that have a gas-tight wall around them. There are no stray molecules inside; the only molecules inside are ones that were brought in after several steps of sorting and testing. They're bound firmly, and they go where you want them to, subject only to very small deviations due to thermal vibration. They can't wander; they can only vibrate within a small range, and that range can be small enough that they end up where you want them. This is at the end of a long series of developments. This is where you are many years from now, with a lot of work.

In one of your papers you talk about how there is not, at this point, a formal academic discipline called nanotechnology, and you talk about what somebody can study at this time to acquire a good background in the field.

I would say that some of the more relevant standard academic backgrounds are in physics, chemistry, mechanical engineering, and computer science. Molecular biology has some substantial relevance, particularly if there's been a focus on the molecular mechanisms involved; a lot of molecular biology is at some remove of abstraction from the nuts-and-bolts physical level. I just spent four and a half years working on Nanosystems, and giving it the content that it has, for the purpose of enabling people with backgrounds in those areas to learn what is necessary to analyze and design molecular machine systems, or to develop software to aid in design and analysis, or to gain enough understanding of their capabilities that it's possible to think more concretely about what can be done with them. My hope is that it will be useful for people who are interested in working in the field.

Molecular nanotechnology is a strongly interdisciplinary field. The fundamental interactions are chemical, but they need to be analyzed in a way that has more in common with the way physicists think of things than the way most chemists think of things. But the systems to be built, and therefore much of the analysis of what kinds of things to design, are primarily a matter of mechanical engineering. This has led to two problems in the academic world. In the United States at least, chemistry--including synthetic organic chemistry, which is about making new molecules--isn't taught as a kind of manufacturing, or indeed as any branch of engineering and construction; it's instead taught as a natural science. At the University of Tokyo, in contrast, the Department of Synthetic Chemistry is in the school of engineering. So this is an engineering discipline founded on the molecular sciences, but people in the molecular sciences, by and large, do not know how to think as engineers. Some do, but it's far outside the mainstream of the culture. That has led to difficulties.
And also, deeply rooted in chemistry as practiced today is the assumption that molecules are small, that they move around freely and run into each other in all possible positions and orientations, and that if a reaction is possible in any of those positions and orientations, it will happen. That tremendously limits the building blocks you can use and what you can do with them. A physical system in which those same molecules move in a controlled way, only encounter each other as planned, and can then be pushed together--that seemingly simple change upsets a tremendous number of usually unexamined assumptions in the field, even though there is nothing fundamentally novel about it in a scientific sense.

And that's led to a whole pattern of superficial misconceptions. For example, looking at a proposed structure and saying, "That structure is unstable." What's often meant is, "If that structure were to run into another identical structure in a particular position and orientation, then those two copies of the same structure would react." That doesn't arise in the context of advanced molecular machine systems, and in that context such a structure is stable. But a quick glance would suggest to a practicing chemist that whoever wrote down the structure doesn't know any chemistry, and that's enough to block the further study necessary to understand the system context in which the proposal makes sense. That's one of the more striking cases in my experience. In fact, the specific chemist I have in mind who made that observation was quoted as saying that in an article. I subsequently talked with him and put him on my doctoral committee. (laughs) At the end of which, I gather, he decided that I did know some chemistry.

So do you find that as time has progressed, the academic community is more receptive than it was initially? One of the points in Unbounding that really struck home was the statement that any new technology is initially going to attract as many flakes and charlatans as reputable practitioners.

And it will do that in particular if it has large consequences. What you're describing is a pattern that also works with scientists--society trusts scientists to judge new technical proposals, rightly or wrongly (laughs). Many scientists, however, judge new proposals based not on how they look when they're examined, but on how they smell culturally. And if a set of proposals has striking, exciting human consequences, it will stir a fuss, and the fuss will create a stink, and the scientists will turn up their noses. And therefore we have a systematic mechanism which is very good at rejecting nonsense, but which also rejects some of the more important things that are sensible. One of the consequences is that some of the most important long-range developments will be systematically scoffed at, for a while.

It's in part a matter of asking the wrong experts. Asking chemists about molecular manufacturing is like asking transistor designers about computer architectures. Organic chemists work with structures that typically have a few to a few hundred atoms. A simple low-end molecular manufacturing system might have a billion atoms. The gap there is comparable to the gap between a single transistor--an isolated transistor and package, that level of technology in the 1950s--and the kind of VLSI that we're only getting toward today. So, go to the 1950s.
Find someone who is working on growing germanium crystals to make the old germanium transistors, and ask that person to judge proposals for ten-million-transistor-on-a-chip computers, and whether RISC or CISC architectures are preferred. You'll get a blank expression, and the notion of having ten million devices on one piece of semiconductor would seem to be a fantasy.

There's also a major cultural gap between the United States and Japan regarding setting goals, articulating goals. Many Japanese research projects are given extremely ambitious goals. For example, the Kunitake Molecular Architecture Project speaks of self-organizing, intelligent, self-repairing, etc., etc., etc. molecular systems. That project is over; it was a five-year project. Did they achieve that? No. On the other hand, what they did was set the best direction they could think of--very ambitious, a direction that, if achieved, would lead to enormous payoffs--and in that project they took steps in that direction, because that was the direction they'd articulated. If you judge their achievements relative to the end point of the direction they were pointing, they fall short, and you call it a failure. If instead you ask, "Did they direct their resources better because they chose those goals?" I think the answer is yes.

In the United States we are afraid to articulate ambitious long-term goals, for whatever reasons--in part, the tendency of people to come back two years later and say, "You haven't done it yet, and since you couldn't possibly have been planning ten or twenty years in advance, you must have failed." Because of those sorts of attitudes we don't articulate ambitious long-term goals. As a result, I believe, we tend to put our research efforts into directions that have, on the average, less long-term payoff, because we're not looking. I'm not saying that we can predict the outcome of research programs, or that we can predict what will pay off in the future. I'm saying that we do better if we try than if we don't--better if we try intelligently than if we don't. Otherwise you're throwing away information, soft information, in favor of hard laboratory data on something that may be uninteresting in the long term.

It's safe to assume that for every glorious life-saving application of this technology that might come along, there's almost sure to be an equally sinister and destructive one, perhaps developed in some secret government research lab. Is that an accurate perception, and if it is, how can people who are interested in nanotechnology keep it from going off to the dark side before it even gets out of the lab?

That's been a subject of a lot of discussion among people who are really concerned with these issues. The policy that people seem to have settled on as making the most sense is to make it clear that yes, among the applications of this manufacturing technology is the manufacture of systems that are tremendously subject to abuse--things that are inherently tremendously dangerous. But it's important to more than balance that by focusing on how these capabilities can be used for medicine, for increasing wealth per capita while cleaning up the environment, and for a whole range of beneficial applications that would naturally occur in the civilian sector, and would naturally encourage international cooperation.
Because at this point, with all that we know of the potential dangers of technology and all the discussion there has been of this one, the potential nightmare scenario is not that we plunge ahead blindly, ignoring the dangers, and have them take us by surprise. The nightmare scenario is that a focus on the destructive potential of new technologies leads to their being classified as military technology, leading to international rivalry with a military flavor, and precipitating a war. The optimistic scenario is one in which we keep in mind, as has happened so far, the potential for abuse, and the things that would be dangerous if you did them, and consider how we can avoid doing anything like that, while focusing on goals that encourage cooperation.

Nanotechnology Resources: Where To Learn More

First General Conference on Nanotechnology: Development, Applications, and Opportunities
November 11-14, 1992, Palo Alto, California
Sponsored by the Foresight Institute
Box 61058, Palo Alto, CA 94306, USA
(415) 324-2490; fax (415) 324-2497
Internet: foresight@cup.portal.com

Institute for Molecular Manufacturing
A nonprofit foundation supporting nanotechnology research
555 Bryant Street, Suite 253, Palo Alto, CA 94301
(415) 852-1244; fax (415) 852-9098
Internet: kshatter@tmn.com

"The Assembler Multitude"
An open discussion of new technical developments in nanotechnology and of its social consequences and dangers
Meets every other Monday, 7:30-9:30 PM
University Lutheran Church, 1611 Stanford Avenue, Palo Alto
Organized by Ted Kaehler
Net: Kaehler2@AppleLink.Apple.com
Mailing address: 20525 Mariani Ave., MS 301-3D, Cupertino, CA 95014

American Information Exchange (AMIX)
Extensive library of nanotechnology-related documents available for online purchase
1881 Landings Avenue, Mountain View, CA 94043
(415) 903-1000; fax (415) 903-1094