https://spectrum.ieee.org/the-surprising-story-of-the-first-microprocessors

The Surprising Story of the First Microprocessors
You thought it started with the Intel 4004, but the tale is more complicated
By Ken Shirriff | 30 Aug 2016 | 12 min read
History of Technology | Feature

[Photo: Intel] The Die is Cast: Intel's 4-bit 4004 chip is widely regarded as the world's first microprocessor. But it was not without rivals for that title.
Transistors, the electronic amplifiers and switches found at the heart of everything from pocket radios to warehouse-size supercomputers, were invented in 1947. Early devices were of a type called bipolar transistors, which are still in use. By the 1960s, engineers had figured out how to combine multiple bipolar transistors into single integrated circuits. But because of the complex structure of these transistors, an integrated circuit could contain only a small number of them. So although a minicomputer built from bipolar integrated circuits was much smaller than earlier computers, it still required multiple boards with hundreds of chips.

In 1960, a new type of transistor was demonstrated: the metal-oxide-semiconductor (MOS) transistor. At first this technology wasn't all that promising. These transistors were slower, less reliable, and more expensive than their bipolar counterparts. But by 1964, integrated circuits based on MOS transistors boasted higher densities and lower manufacturing costs than those of the bipolar competition. Integrated circuits continued to increase in complexity, as described by Moore's Law, but now MOS technology took the lead. By the end of the 1960s, a single MOS integrated circuit could contain 100 or more logic gates, each containing multiple transistors, making the technology particularly attractive for building computers. These chips with their many components were given the label LSI, for large-scale integration.

Engineers recognized that the increasing density of MOS transistors would eventually allow a complete computer processor to be put on a single chip. But because MOS transistors were slower than bipolar ones, a computer based on MOS chips made sense only when relatively low performance was required or when the apparatus had to be small and lightweight--such as for data terminals, calculators, or avionics. So those were the kinds of computing applications that ushered in the microprocessor revolution.

Most engineers today are under the impression that the revolution began in 1971 with Intel's 4-bit 4004 and was immediately and logically followed by the company's 8-bit 8008 chip. In fact, the story of the birth of the microprocessor is far richer and more surprising. In particular, some newly uncovered documents illuminate how a long-forgotten chip--Texas Instruments' TMX 1795--beat the Intel 8008 to become the first 8-bit microprocessor, only to slip into obscurity.

What opened the door for the first microprocessors, then, was the application of MOS integrated circuits to computing. The first computer to be fashioned out of MOS-LSI chips was something called the D200, created in 1967 by Autonetics, a division of North American Aviation, located in Anaheim, Calif.

[Photo: Paul Sakuma/AP Photos] Three Proud Parents: Posing during induction ceremonies for the National Inventors Hall of Fame in 1996, Federico Faggin, Marcian "Ted" Hoff Jr., and Stanley Mazor (from left) show off the pioneering microprocessor they created in the early 1970s, the Intel 4004.

This compact, 24-bit general-purpose computer was designed for aviation and navigation. Its central processing unit was built from 24 MOS chips and benefited from a design technique called four-phase logic, which used four separate clock signals, each with a different on-off pattern, or phase, to drive changes in the states of the transistors, allowing the circuitry to be substantially simplified.
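To get a feel for that clocking scheme, here is a minimal sketch in Python of the timing idea only; it is purely illustrative and not based on Autonetics' actual dynamic MOS circuits. It models four non-overlapping clock signals, each high during a different quarter of the cycle, with each group of logic allowed to change state only while its own phase is active; the "logic" in each group is just a placeholder inverter.

def phase_levels(t):
    """Return (phi1, phi2, phi3, phi4) for time step t; the period is 4 steps."""
    return tuple(1 if t % 4 == p else 0 for p in range(4))

def run(input_bits):
    # One group of logic per phase; each group simply inverts the output of
    # the previous group, standing in for real combinational logic.
    stages = [0, 0, 0, 0]
    outputs = []
    for t, bit in enumerate(input_bits):
        phases = phase_levels(t)
        for i in range(4):
            if phases[i]:                      # only the active group may switch
                prev = bit if i == 0 else stages[i - 1]
                stages[i] = 1 - prev           # placeholder "logic": an inverter
        outputs.append(stages[3])
    return outputs

if __name__ == "__main__":
    print(phase_levels(0), phase_levels(1), phase_levels(2), phase_levels(3))
    print(run([1, 1, 1, 1, 0, 0, 0, 0]))

Because the phases never overlap, only one group of gates can change at a time, so signals move through the chain in lockstep without races between stages.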
Weighing only a few kilograms, the computer was used for guidance on the Poseidon submarine-launched ballistic missile and for fuel management on the B-1 bomber. It was even considered for the space shuttle.

The D200 was followed shortly by another avionics computer that contained three CPUs and used a total of 28 chips: the Central Air Data Computer, built by Garrett AiResearch (now part of Honeywell). The computer, a flight-control system designed for the F-14 fighter, used the MP944 MOS-LSI chipset, which Garrett AiResearch developed between 1968 and 1970. The 20-bit computer processed information from sensors and generated outputs for instrumentation and aircraft control.

The architecture of the F-14 computer was unusual. It had three functional units operating in parallel: one for multiplication, one for division, and one for special logic functions (which included clamping a value between upper and lower limits). Each functional unit was composed of several different kinds of MOS chips, such as a read-only memory (ROM) chip, which contained the data that determined how the unit would operate; a data-steering chip; various arithmetic chips; and a RAM chip for temporary storage.

Because the F-14 computer was classified, few people ever knew about the MP944 chipset. But Autonetics widely publicized its D200, which then inspired an even more compact MOS-based computer: the System IV. That computer was the brainchild of Lee Boysel, who left Fairchild Semiconductor in 1968 to cofound Four-Phase Systems, naming his new company after Autonetics' four-phase logic. The CPU of the 24-bit System IV was constructed from as few as nine MOS chips: three arithmetic-logic-unit (ALU) chips of a design dubbed the AL1 (which performed arithmetic operations like adding and subtracting, along with logical operations like AND, OR, and NOT), three ROM chips, and three random-logic chips.

Everything's Bigger in Texas

Although Texas Instruments' TMX 1795 and Intel's 8008 had a similar number of transistors, the former required a much larger silicon die. Indeed, the TMX 1795 was larger than the Intel 8008 and 4004 combined. Intel's engineers believed that its large size made the TI chip impractical to produce in commercial quantities, but TI's very successful TMS 0100 calculator chip, introduced at about the same time, had an even larger die. So the connection between die size and commercial viability must not have been straightforward. [Die photos: TMX 1795, 3,078 transistors; 4004, 2,300 transistors; 8008, 3,098 transistors. Images: Computer History Museum]

Almost simultaneously, a Massachusetts-based startup called Viatron Computer Systems got into the game. Just a year after its launch in November 1967, the company announced its System 21, a 16-bit minicomputer with various accessories, all built from custom MOS chips. We can thank someone at Viatron for coining the word "microprocessor." The company first used it in an October 1968 announcement of a product it called the 2101. But this microprocessor wasn't a chip. In Viatron's lexicon, the word referred to part of a smart terminal, one that came complete with keyboard and tape drives and connected to a separate minicomputer. Viatron's "microprocessor" controlled the terminal and consisted of 18 custom MOS chips on three separate boards.
Amid these goings-on at the end of the 1960s, the Japanese calculator maker Business Computer Corp. (better known as Busicom) contracted with Intel for custom chips for a multiple-chip calculator. The final product was simplified to a single-chip CPU, the now-famous Intel 4004, along with companion chips for storage and input/output (I/O). The 4-bit 4004 (meaning that it manipulated data words that were only 4 bits wide) is often considered the first microprocessor.

The calculator containing the 4004 first came together at the start of 1971. By this time, it had plenty of competition. A semiconductor company called Mostek had produced the first calculator-on-a-chip, the MK6010. And Pico Electronics and General Instrument also had their G250 calculator-on-a-chip working. Within six months, the Texas Instruments TMS 1802 calculator-on-a-chip was also operational; it was the first chip in TI's hugely successful 0100 line. While these circuits worked fine as calculators, they couldn't do anything else, whereas the 4004 operated by carrying out instructions stored in external ROM. Thus it could serve in a general-purpose computer.

This was a fast-moving time for the electronic-calculator business, and after running into financial difficulties, Busicom gave up its exclusive rights to the 4004 chip. In November 1971 Intel began marketing it and its associated support chips as a stand-alone product intended for general computing applications. Within a few months, the 4004 was eclipsed by more powerful microprocessors, however, so it found few commercial applications. They included a couple of pinball machines, a word processor, and a system for tallying votes.

[Photo: History-computer.com] CPU Flip-Flops: Makers of the Datapoint 2200 terminal sought a single-chip CPU for it from both Intel and Texas Instruments. Neither TI's nor Intel's CPU chips saw use in the Datapoint 2200, but they led a wave of 8-bit microprocessors that powered the microcomputer revolution.

In this sense, it was an electronic calculator that begot the first microprocessor, Intel's 4-bit 4004. But the 8-bit microprocessors that quickly succeeded it had a very different genesis. That story starts in 1969 with the development of the Datapoint 2200 "programmable terminal" by a company called Computer Terminal Corp. (CTC), based in San Antonio, Texas.

The Datapoint 2200 was really a general-purpose computer, not just a terminal. Its 8-bit processor was initially built out of about 100 bipolar chips. Its designers were looking for ways to have the processor consume less power and generate less heat. So in early 1970, CTC arranged for Intel to build a single MOS chip to replace the Datapoint processor board, although it's unclear whether the idea of using a single chip came from Intel or CTC.

[Photo: Computer History Museum] Guy From TI: Gary Boone led the development of the TMX 1795, along with other important digital chips.

By June 1970, Intel had developed a functional specification for a chip based on the architecture of the Datapoint 2200 and then put the project on hold for six months. This is the design that would become the Intel 8008. So whether you consider the calculator-inspired 4004 or the terminal-inspired 8008 to be the first truly useful single-chip, general-purpose microprocessor, you'd have to credit its creation to Intel, right? Not really. You see, in 1970, when Intel began working on the 8008, it was a startup with about 100 employees.
After learning of Intel's processor project, Texas Instruments, or TI--a behemoth of a company, with 45,000 employees--asked CTC whether it, too, could build a processor for the Datapoint 2200. CTC gave engineers at TI the computer's specifications and told them to go ahead. When they returned with a three-chip design, CTC pointedly asked whether TI could build it on one chip, as Intel was doing. TI then started working on a single-chip CPU for CTC around April 1970.

That design, completed the next year, was first called the TMX 1795 (X for "experimental"), a name that morphed into TMC 1795 when it was time for the chip to shed its prototype status. In June 1971, TI launched a media campaign for the TMC 1795 describing how this "central processor on a chip" would make the new Datapoint 2200 "a powerful computer with features the original one couldn't offer." That didn't happen, though: After testing the TMC 1795, CTC rejected it, opting to continue building its processor using a board of bipolar chips. Intel's chip wouldn't be ready until the end of that year.

Many historians of technology believe that the TMC 1795 died then and there. But newly surfaced documents from the late Gary Boone, the chip's lead developer, show that after CTC's rejection, TI tried to sell the chip (which after some minor improvements became known as the TMC 1795A) to various companies. Ford Motor Co. showed interest in using the chip as an engine controller in 1971, causing Boone to write, "I think we have walked into the mass market our 'CPU-on-a-chip' desperately needs." Alas, these efforts failed, and TI ceased marketing the TMC 1795, focusing on its more profitable calculator chips instead. Nevertheless, if you want to assign credit for the first 8-bit microprocessor, you should give that honor to TI, never mind that it fumbled the opportunity.

[Images: Steve Golson] Engines of Change: These memos reveal that Ford Motor Co. considered using TI's pioneering microprocessor as an engine controller.

By the time Intel had the 8008 working, at the end of 1971, CTC had lost interest in single-chip CPUs and gave up its exclusive rights to the design. Intel went on to commercialize the 8008, announcing it in April 1972 and ultimately producing hundreds of thousands of them. Two years later, the 8008 spawned Intel's 8080 microprocessor, which heavily influenced the 8086, which in turn opened the floodgates for Intel's current line of x86 chips. So if you're sitting at a PC with an x86 processor right now, you're using a computer based on a design that dates all the way back to Datapoint's 2200 programmable terminal of 1969.

As this history makes clear, the evolution of the microprocessor followed anything but a straight line. Much was the result of chance and the outcome of various business decisions that might easily have gone otherwise. Consider how the 8-bit processor architecture that CTC designed for the Datapoint 2200 was implemented in four distinct ways. CTC did it twice using a board stuffed with bipolar chips, first in an arrangement that communicated data serially and later using a parallel design that was much faster. Both TI and Intel met CTC's requirements with single chips having almost identical instruction sets, but the packaging, control signals, instruction timing, and internal circuitry of the two chips were entirely different. Intel used more advanced technology than did TI, most notably self-aligned gates made of polysilicon, which made the transistors faster and improved yields.
This approach also allowed the transistors to be packed more densely. As a result, the 4004 and 8008, even combined, were smaller than the TMC 1795. Indeed, Intel engineers considered the TI chip too big to be practical, but that really wasn't the case: TI's highly successful TMS 0100 calculator chip, introduced soon afterward, was even larger than the TMC 1795.

Given all this, whom should we credit with the invention of the microprocessor? One answer is that the microprocessor wasn't really an invention but rather something that everyone knew would happen. It was just a matter of waiting for the technology and market to line up. I find this perspective the most compelling. Another way to look at things is that "microprocessor" is basically a marketing term driven by the need of Intel, TI, and other chip companies to brand their new products.

Boone, despite being the developer of the TMC 1795, later credited Intel for its commitment to turning the microprocessor into a viable product. In an undated letter, apparently part of a legal discussion over who should get credit for the microprocessor, he wrote: "The dominant theme in the development of the microprocessor is the corporate commitment made by Intel in the 1972-75 period.... Their innovations in design, software and marketing made possible this industry, or at least hurried it along."

[Photos, left: Intel; right: Computer History Museum] The First Microprocessor: Credit normally goes to the Intel 4004, a 4-bit chip designed to serve in a calculator [left]. But there are other possible firsts, depending on your definitions. One was the AL1 arithmetic-logic-unit chip from Four-Phase Systems [right], which predates the 4004 and was used to demonstrate a working computer in a dispute over an early patent for the microprocessor.

Honors for creating the first microprocessor also depend on how you define the word. Some define a microprocessor as a CPU on a chip. Others say all that's required is an arithmetic logic unit on a chip. Still others would allow these functions to be packaged in a few chips, which would collectively make up the microprocessor.

In my view, the key features of a microprocessor are that it provides a CPU on a single chip (including ALU, control functions, and registers such as a program counter) and that it is programmable. But a microprocessor isn't a complete computer: Additional chips are typically needed for memory, I/O, and other support functions. Using such a definition, most people consider the Intel 4004 to be the first microprocessor because it contains all the components of the central processing unit on a single chip. Both Boone and Federico Faggin (of Intel's 4004 team) agree that the 4004 beat the earliest TMX 1795 prototypes by a month or two. The latter would then represent the first 8-bit microprocessor, and the Intel 8008 the first commercially successful 8-bit microprocessor.

But if you adopt a less-restrictive definition of "microprocessor," many systems could be considered the first. Those who consider an ALU-on-a-chip to be a microprocessor credit Boysel for making the first one at Fairchild in 1968, shortly before he left to cofound Four-Phase Systems. The AL1 from Four-Phase Systems is also a candidate because it combined registers and ALU on a single chip, while having the control circuitry external. If you allow that a microprocessor can consist of multiple LSI chips, the Autonetics D200 would qualify as first.

Patents provide a different angle on the invention of the microprocessor.
TI was quick to realize the profitability of patents. It obtained multiple patents on the TMX 1795 and TMS 0100 and made heavy use of these patents in litigation and licensing agreements. Based on its patents, TI could be considered the inventor of both the microprocessor and the microcontroller, a single-chip packaging of CPU, memory, and various support functions.

Or maybe not. That's because Gilbert Hyatt obtained a patent for the single-chip processor in 1990, based on a 16-bit serial computer he built in 1969 from boards of bipolar chips. This led to claims that Hyatt was the inventor of the microprocessor, until TI defeated Hyatt's patent in 1996 after a complex legal battle.

Another possible inventor to credit would be Boysel. In 1995, during a legal proceeding that Gordon Bell later mockingly called "TI versus Everybody," Boysel countered TI's single-chip processor patents by using a single AL1 ALU chip from 1969 to demonstrate a working computer to the court. His move effectively torpedoed TI's case, although I don't see his demo as particularly convincing, because he used some technical tricks to pull it off.

Regardless of what you consider the first microprocessor, you have to agree that there was no lack of contenders for this title. It's a shame, really, that most people seek to recognize just one winner in the race and that many fascinating runners-up are now almost entirely forgotten. But for those of us with an interest in the earliest days of microcomputing, this rich history will live on.

About the Author: Ken Shirriff worked as a programmer for Google before retiring in June 2016. A computer history buff, he's fascinated with the earliest CPU chips. At the time of publication of this article, he was helping to restore a 1973 Xerox Alto microcomputer, the computer that introduced the graphical user interface and the mouse. (For more on the restoration, see Shirriff's blog, www.righto.com.)


Q&A With Co-Creator of the 6502 Processor
Bill Mensch on the microprocessor that powered the Atari 2600 and Commodore 64
By Stephen Cass | The Institute | 16 Sep 2021 | 5 min read
History of Technology | Interview

Few people have seen their handiwork influence the world more than Bill Mensch.
He helped create the legendary 8-bit 6502 microprocessor, launched in 1975, which was the heart of groundbreaking systems including the Atari 2600, Apple II, and Commodore 64. Mensch also created the VIA 65C22 input/output chip--noted for its rich features and crucial to the 6502's overall popularity--and the second-generation 65C816, a 16-bit processor that powered machines such as the Apple IIGS and the Super Nintendo console.

Many of the 65x series of chips are still in production. The processors and their variants are used as microcontrollers in commercial products, and they remain popular among hobbyists who build home-brewed computers. The surge of interest in retrocomputing has led to folks once again swapping tips on how to write polished games using 6502 assembly code, with new titles being released for the Atari, BBC Micro, and other machines.

Mensch, an IEEE senior life member, splits his time between Arizona and Colorado, but folks in the Northeast of the United States will have the opportunity to see him as a keynote speaker at the Vintage Computer Festival in Wall, N.J., on the weekend of 8 October. In advance of Mensch's appearance, The Institute caught up with him via Zoom to talk about his career. This interview has been condensed and edited for clarity.

The Institute: What drew you into engineering?

Bill Mensch: I went to Temple University [in Philadelphia] on the recommendation of a guidance counselor. When I got there I found they only had an associate degree in engineering technology. But I didn't know what I was doing, so I thought: Let's finish up that associate degree. Then I got a job [in 1967] as a technician at [Pennsylvania TV maker] Philco-Ford and noticed that the engineers were making about twice as much money. I also noticed I was helping the engineers figure out what Motorola was doing in high-voltage circuits--which meant that Motorola was the leader and Philco was the follower. So I went to the University of Arizona, close to where Motorola was, got my engineering degree [in 1971] and went to work for Motorola.

TI: How did you end up developing the 6502?

BM: Chuck Peddle approached me. He arrived at Motorola two years after I started. Now, this has not been written up anywhere that I'm aware of, but I think his intention was to raid Motorola for engineers. He worked with me on the peripheral interface adapter (PIA) chip and got to see me in action. He decided I was a young, egotistical engineer who was just the right kind to go with his ego. So Chuck and I formed a partnership of sorts. He was the system engineer, and I was the semiconductor engineer. We tried to start our own company [with some other Motorola engineers] and when that didn't happen, we joined an existing [semiconductor design] company, called MOS Technology, in Pennsylvania in 1974. That's where we created the 6501 and 6502 [in 1975], and I designed the input/output chips that went with it. The intention was to [develop a US $20 microprocessor to] compete with the Intel 4040 microcontroller chipset, which sold for about $29 at the time. We weren't trying to compete with the 6800 or the 8080 [chips designed for more complex microcomputer systems].

TI: The 6502 did become the basis of a lot of microcomputer systems, and if you look at contemporary programmer books, they often talk about the quirks of the 6502's architecture and instruction set compared with other processors. What drove those design decisions?
BM: Rod Orgill and I had completed the designs of a few microprocessors before the 6501/6502. In other words, Rod and I already knew what was successful in an instruction set. And lower cost was key. So we looked at what instructions we really needed. And we figured out how to have addressable registers by using zero page [the first 256 bytes in RAM]. So you can have one byte for the op code and one byte for the address, and [the code is compact and fast]. There are limitations, but compared to other processors, zero page was a big deal. [A brief illustration of the zero-page encoding appears at the end of this interview.]

TI: A lot of pages in those programming books are devoted to explaining how to use the versatile interface adapter (VIA) chip and its two I/O ports, on-board timers, a serial shift register, and so on. Why so many features?

BM: I had worked on the earlier PIA chip at Motorola. That meant I understood the needs of real systems in real-world implementations. [While working at MOS] Chuck, Wil Mathis, our applications guy, and I were eating at an Arby's one day, and we talked about doing something beyond the PIA. And they were saying, "We'd like to put a couple of timers on it. We'd like a serial port," and I said, "Okay, we're going to need more register select lines." And our notes are on an Arby's napkin. And I went off and designed it. Then I had to redesign it to make it more compatible with the PIA. I also made a few changes at Apple's request. What's interesting about the VIA is that it's the most popular chip we sell today. I'm finding out more and more about how it was used in different applications.

TI: After MOS Technology, in 1978 you founded The Western Design Center, where you created the 65C816 CPU. The creators of the ARM processor credit a visit to WDC as giving them the confidence to design their own chip. Do you remember that visit?

BM: Vividly! Sophie Wilson and Steve Furber visited me and talked to me about developing a 32-bit chip. They wanted to leapfrog what Apple was rumored to be up to. But I was just finishing up the '816, and I didn't want to change horses. So when they [had success with the ARM] I was cheering them on because it wasn't something I wanted to do. But I did leave them with the idea of, "Look, if I can do it here ... there are two of you; there's one of me."

TI: The 6502 and '816 are often found today in other forms, either as the physical core of a system-on-a-chip or running on an FPGA. What are some of the latest developments?

BM: I'm excited about what's going on right now. It's more exciting than ever. I was just given these flexible 6502s printed with thin films by PragmatIC! Our chips are in IoT devices, and we have new educational boards coming out.

TI: Why do you think the original 65x series is still popular, especially among people building their own personal computers?

BM: There is a love for this little processor that's undeniable. And the reason is we packed it with love while we were designing it. We knew what we were doing. Rod and I knew from our previous experience with the Olivetti CPU and other chips. And from my work with I/O chips, I knew [how computers were used] in the real world. People want to work with the 65x chips because they are accessible. You can trust the technology.
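To make the zero-page point concrete: a 6502 instruction that references zero page encodes its operand in a single byte (the high address byte is implicitly $00), while the same operation with a full 16-bit "absolute" address needs two address bytes and an extra clock cycle. The short Python sketch below simply builds the two machine-code encodings using the standard published 6502 opcodes and cycle counts; the helper functions are invented for this illustration and aren't from any particular assembler.

def lda_zero_page(addr):
    """LDA $nn  -- opcode $A5, 2 bytes total, 3 cycles."""
    assert 0x00 <= addr <= 0xFF
    return bytes([0xA5, addr])

def lda_absolute(addr):
    """LDA $nnnn -- opcode $AD, 3 bytes total, 4 cycles (little-endian address)."""
    assert 0x0000 <= addr <= 0xFFFF
    return bytes([0xAD, addr & 0xFF, addr >> 8])

if __name__ == "__main__":
    zp = lda_zero_page(0x10)       # load from $0010 via zero page
    ab = lda_absolute(0x0010)      # load from $0010 via absolute addressing
    print(f"zero page: {zp.hex()}   ({len(zp)} bytes, 3 cycles)")
    print(f"absolute:  {ab.hex()} ({len(ab)} bytes, 4 cycles)")

Running it shows the zero-page form is one byte shorter and one cycle faster, which is exactly the compactness and speed Mensch describes.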
Spot's 3.0 Update Adds Increased Autonomy, New Door Tricks
Boston Dynamics' Spot can now handle push-bar doors and dynamically replan in complex environments
By Evan Ackerman | 15 Sep 2021 | 5 min read
Robotics | News

While Boston Dynamics' Atlas humanoid spends its time learning how to dance and do parkour, the company's Spot quadruped is quietly getting much better at doing useful, valuable tasks in commercial environments. Solving tasks like dynamic path planning and door manipulation in a way that's robust enough that someone can buy your robot and not regret it is, I would argue, just as difficult as (if not more difficult than) getting a robot to do a backflip.

With a short blog post today, Boston Dynamics is announcing Spot Release 3.0, representing more than a year of software improvements over Release 2.0, which we covered back in May of 2020. The highlights of Release 3.0 include autonomous dynamic replanning, cloud integration, some clever camera tricks, and a new ability to handle push-bar doors, and earlier today we spoke with Spot chief engineer Zachary Jackowski to learn more about what Spot's been up to.

Here are some highlights from Spot's Release 3.0 software upgrade, lifted from the blog post, which has the entire list:

* Mission planning: Save time by selecting which inspection actions you want Spot to perform, and it will take the shortest path to collect your data.
* Dynamic replanning: Don't miss inspections due to changes on site. Spot will replan around blocked paths to make sure you get the data you need.
* Repeatable image capture: Capture the same image from the same angle every time with scene-based camera alignment for the Spot CAM+ pan-tilt-zoom (PTZ) camera.
* Cloud-compatible: Connect Spot to AWS, Azure, IBM Maximo, and other systems with existing or easy-to-build integrations.
* Manipulation: Remotely operate the Spot Arm with ease through rear Spot CAM integration and split-screen view. Arm improvements also include added functionality for push-bar doors, revamped grasping UX, and updated SDK.
* Sounds: Keep trained bystanders aware of Spot with configurable warning sounds.

The focus here is not just making Spot more autonomous, but making Spot more autonomous in some very specific ways that are targeted towards commercial usefulness. It's tempting to look at this stuff and say that it doesn't represent any massive new capabilities. But remember that Spot is a product, and its job is to make money, which is an enormous challenge for any robot, much less a relatively expensive quadruped.

For more details on the new release and a general update about Spot, we spoke with Zachary Jackowski, Spot chief engineer at Boston Dynamics.

IEEE Spectrum: So what's new with Spot 3.0, and why is this release important?

Zachary Jackowski: We've been focusing heavily on flexible autonomy that really works for our industrial customers.
The thing that may not quite come through in the blog post is how iceberg-y making autonomy work on real customer sites is. Our blog post has some bullet points about "dynamic replanning" in maybe 20 words, but in doing that, we actually reengineered almost our entire autonomy system based on the failure modes of what we were seeing on our customer sites.

The biggest thing that changed is that previously, our robot mission paradigm was a linear mission where you would take the robot around your site and record a path. Obviously, that was a little bit fragile on complex sites--if you're on a construction site and someone puts a pallet in your path, you can't follow that path anymore. So we ended up engineering our autonomy system to do building-scale mapping, which is a big part of why we're calling it Spot 3.0. This is state-of-the-art from an academic perspective, except that it's volume shipping in a real product, which to me represents a little bit of our insanity.

And one super cool technical nugget in this release is that we have a powerful pan/tilt/zoom camera on the robot that our customers use to take images of gauges and panels. We've added scene-based alignment and also computer vision model-based alignment so that the robot can capture the images from the same perspective, every time, perfectly framed. In pictures of the robot, you can see that there's this crash cage around the camera, but the image alignment stuff actually does inverse kinematics to command the robot's body to shift a little bit if the cage is blocking anything important in the frame.

When Spot is dynamically replanning around obstacles, how much flexibility does it have in where it goes?

There are a bunch of tricks to figuring out when to give up on a blocked path, and then it's very simple, run-of-the-mill route planning within an existing map. [A generic sketch of that replan-within-a-map step appears at the end of this article.] One of the really big design points of our system, which we spent a lot of time talking about during the design phase, is that it turns out in these high-value facilities people really value predictability. So it's not desired that the robot starts wandering around trying to find its way somewhere.

Do you think that over time, your customers will begin to trust the robot with more autonomy and less predictability?

I think so, but there's a lot of trust to be built there. Our customers have to see the robot do the job well for a significant amount of time, and that will come.

Can you talk a bit more about trying to do state-of-the-art work on a robot that's being deployed commercially?

I can tell you about how big the gap is. When we talk about features like this, our engineers are like, "Oh yeah, I could read this paper and pull this algorithm and code something up over a weekend and see it work." It's easy to get a feature to work once, make a really cool GIF, and post it to the engineering group chat room. But if you take a look at what it takes to actually ship a feature at product level, we're talking person-years to have it reach the level of quality that someone is accustomed to buying an iPhone and just having it work perfectly all the time. You have to write all the code to product standards, implement all your tests, and get everything right there, and then you also have to visit a lot of customers, because the thing that's different about mobile robotics as a product is that it's all about how the system responds to environments that it hasn't seen before.
The blog post calls Spot 3.0 "A Sensing Solution for the Real World." What is the real world for Spot at this point, and how will that change going forward?

For Spot, "real world" means power plants, electrical switch yards, chemical plants, breweries, automotive plants, and other living and breathing industrial facilities that have never considered the fact that a robot might one day be walking around in them. It's indoors, it's outdoors, in the dark and in direct sunlight. When you're talking about the geometric aspect of sites, that complexity we're getting pretty comfortable with. I think the frontiers of complexity for us are things like, how do you work in a busy place with lots of untrained humans moving through it--that's an area where we're investing a lot, but it's going to be a big hill to climb and it'll take a little while before we're really comfortable in environments like that. Functional safety, certified person detectors, all that good stuff, that's a really juicy unsolved field.

Spot can now open push-bar doors, which seems like an easier problem than doors with handles, which Spot learned to open a while ago. Why'd you start with door handles first?

Push-bar doors is an easier problem! But being engineers, we did the harder problem first, because we wanted to get it done.
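Boston Dynamics hasn't published the internals of Spot's planner, so what follows is only a generic sketch, in Python, of the pattern Jackowski describes: follow a planned route through a map you already have, and when a step turns out to be blocked, mark the map and plan again from where you are. Every name in it (plan, follow_with_replanning, the grid representation) is invented for this illustration.

from collections import deque

def plan(grid, start, goal):
    """Breadth-first search over free cells; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

def follow_with_replanning(grid, start, goal, newly_blocked):
    """Walk the planned route; if the next cell is blocked, update the map and replan."""
    pos, path = start, plan(grid, start, goal)
    while path and pos != goal:
        nxt = path[path.index(pos) + 1]
        if nxt in newly_blocked:            # obstacle discovered en route
            grid[nxt[0]][nxt[1]] = 1        # record it in the existing map
            path = plan(grid, pos, goal)    # replan from the current position
            continue
        pos = nxt
    return pos == goal

if __name__ == "__main__":
    grid = [[0, 0, 0],
            [0, 0, 0],
            [0, 0, 0]]
    # A cell on the original route turns out to be blocked, forcing one replan.
    print(follow_with_replanning(grid, (0, 0), (2, 2), {(2, 1)}))

A real system layers a great deal on top of this, including deciding when a path really is blocked and keeping the robot's behavior predictable, but the core replan step is ordinary route planning over the updated map.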