Creating the 8080: The Processor That Started the PC Revolution

If the introduction of the Intel 8008 led to a variety of interesting devices that can be considered the first early PCs, it was its successor—the Intel 8080 microprocessor—that really became the foundation on which the early PC industry was based.

Compared with the Intel 4004 and 8008 that preceded it, the 8080 was a far more powerful chip. Where the 4004 had 2,300 transistors, the 8080 would end up with more than 4,500 and could run at up to 2MHz. More important, many of the functions that had required extra support chips around the 4004 and the 8008 were now integrated.

But perhaps the biggest difference is that while the 4004 and 8008 were designed as custom processors for a single company—the 4004 for Busicom's calculator and the 8008 for Datapoint's computer terminal—the 8080 was designed for a more general set of customers. In short, it was designed to be a building block for any company that wanted it—and this flexibility made it particularly suited for what would become the nascent PC industry.

Developing the 8080

The concepts for the 8080 go back to 1971, when Intel had finished the 4004 chip and was still working on the 8008, which would be formally launched in April 1972.

After the stories about the "CPU on a chip" came out, Intel was beginning to see interest in the microprocessor from all sorts of customers. According to Michael S. Malone's The Intel Trinity, "the entire electronics industry seemed to undergo an awakening."

"Suddenly, as if overnight, the engineers they visited understood the meaning of the microprocessors," Malone wrote. "They had read the articles, heard the speeches, talked to their peers, and as if as one, jumped aboard the silicon bandwagon."

In the late summer of 1971, Federico Faggin, who led the design of the 4004 and would become the primary architect of the 8080, was giving some technical seminars on the 4004 and the 8008 and was visiting customers. In those visits, he said, "I received a fair amount of criticism—some of it valid—about the architecture and performance of the microprocessors. The more computer-oriented the company I visited was, the nastier people's comments were."

"They were seeing many limitations in our microprocessors, and particularly the interrupt structure. It was highly criticized and rightly so, because the 8008 had a very primitive, barely functional interrupt structure." Customers were also complaining about the size of the package and that the company was multiplexing addresses and data. "And of course, they wanted much higher speed. The speed of the 8008 at 0.5 megahertz was not adequate."

Faggin says that by the time he returned home, "I had an idea of how to make a better 8-bit microprocessor than the 8008, incorporating many of the features that people wanted: most important, speed and ease of interfacing. I could have boosted both of these features if I had used a 40-pin package instead of the 8008's 18-pin package and integrated the functions of the support chips."

In other words, he was considering making what would be, by most accounts, the first real "computer-on-a-chip."

Around this point, Intel had developed "n-channel technology"—a more efficient method of manufacturing transistors—primarily for its 4K dynamic memory, and Faggin thought that would allow him to have more and faster transistors in the package. He also thought about integrating a stack pointer and additional instructions to improve performance, as well as the 40-pin package, which made it possible to have a 16-bit address and 8-bit data bus.
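The payoff of the 40-pin package is easy to see with a little arithmetic: dedicating 16 pins to addresses and 8 to data (no multiplexing, unlike the 8008) gives the full address space the 8080 became known for. A quick sketch:

```python
# 16 dedicated address pins give 2**16 distinct addresses; with an
# 8-bit (1-byte) data bus, that is a 64KB address space.
address_pins = 16
addressable_bytes = 2 ** address_pins
print(addressable_bytes)             # 65536
print(addressable_bytes // 1024)     # 64 (KB)
```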

In the spring of 1972, as the 8008 was wrapping up, Faggin sent a memo to his boss, Les Vadasz, asking to start work on the next project.

But surprisingly and frustratingly to Faggin, Intel didn't approve the project. Faggin says that Intel wanted to see how the market would react to the 4004 and 8008 first, while others noted the problems Intel was having getting its latest generation of memory chips out the door and wanted to focus on that.

As a result, Intel didn't approve the 8080 project until late September or early October of 1972, by which time Faggin (with Vadasz's approval) had already hired Masatoshi Shima, the former Busicom engineer who had worked closely with Faggin on the development of the 4004.

According to Ted Hoff, he and Stanley Mazor, who were behind the early concepts for the 4004 and were trying to sell the concept to customers, were getting a lot of requests for help from companies that "were looking at the 8008 and trying to push it beyond its capabilities." Mazor says Intel actually had a number of options for the follow-on to the 8008, including a completely new design, but ended up picking an "enhanced 8008" because it would take less time to design.

As a result, he said, they aimed for a chip that wouldn't be binary compatible with the 8008 but whose assembly language would be convertible, so that a program written for the 8008 could be translated to run on the 8080.
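Source-level convertibility of this kind can be sketched as a simple mnemonic-to-mnemonic translation. The snippet below is purely illustrative (it is not Intel's actual conversion tooling), and the opcode map covers only a few representative instructions:

```python
# Illustrative map from 8008-style mnemonics to 8080 equivalents.
# The real instruction sets are larger; this covers a handful of cases.
OPCODE_MAP = {
    "LAB": "MOV A,B",   # load A from B
    "LBA": "MOV B,A",   # load B from A
    "LAI": "MVI A",     # load A immediate
    "ADB": "ADD B",     # add B to accumulator
    "CAL": "CALL",      # subroutine call
}

def translate(line_8008: str) -> str:
    """Translate one line of 8008-style assembly to 8080 assembly."""
    parts = line_8008.split(maxsplit=1)
    op = parts[0]
    rest = parts[1] if len(parts) > 1 else ""
    new_op = OPCODE_MAP.get(op, op)  # unmapped mnemonics pass through
    if rest:
        sep = "," if "MVI" in new_op else " "
        return f"{new_op}{sep}{rest}"
    return new_op

program_8008 = ["LAI 5", "ADB", "LBA", "CAL 0200", "RET"]
print([translate(line) for line in program_8008])
# → ['MVI A,5', 'ADD B', 'MOV B,A', 'CALL 0200', 'RET']
```

Because the mapping works line by line at the source level, the translated program assembles to different machine code than the original, which is exactly why the 8080 could not run 8008 binaries directly.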

Work on the architecture occurred early in 1972, and Faggin credits Shima, Mazor, Hoff, and 8008 circuit designer Hal Feeney as contributing a lot in the early discussions and specification of the chip. When Shima joined Intel in the fall of 1972, he began working for Faggin on the circuit design for the chip.

While the 4004 and 8008 were manufactured using a 10-micron process, the 8080 would use a 6-micron process, allowing for much more miniaturization. (The process number nominally measures the size of features within the processor, such as the distance between transistors. Today's latest processors are produced at 14nm, with 10nm products being developed; those features are several hundred times smaller.) The 8008 had 3,500 transistors, but the 8080 would have 5,000. And it would run at 2MHz, a huge leap in performance.
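The scaling claim is straightforward arithmetic, since microns and nanometers differ by a factor of 1,000:

```python
# Nominal feature sizes in nanometers.
process_4004_nm = 10_000   # 10 micron (4004 and 8008)
process_8080_nm = 6_000    # 6 micron (8080)
process_modern_nm = 14     # 14nm, as cited in the text

# Ratio of old feature size to modern feature size:
print(round(process_4004_nm / process_modern_nm))  # roughly 714
print(round(process_8080_nm / process_modern_nm))  # roughly 429
```

So a 14nm feature is about 700 times smaller than the 4004's 10-micron features, and about 430 times smaller than the 8080's.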

As a result, the 8080 was the first microprocessor whose instruction set and memory-addressing capability approached those of the minicomputers of the day.

Selling the Microprocessor

The first production run of the chip came in December 1973, and after working out some typical last-minute issues, Intel introduced the product in March 1974.

The 8080 was initially priced at $360 per chip, a figure some have said was chosen to invite comparison with the IBM System/360. By that point, Intel knew there was a market for the chip. Intel's Hal Feeney said the company had provided more than 400 customers with the 8080 specification before the chip was even completed.

By that point, Intel had engaged in a big marketing effort, led by Ed Gelbach and Regis McKenna, who marketed it as the "first computer on a chip." As part of this, there was a greater emphasis on development systems, such as Intel's Intellec machines, and software for such systems, including the work by Gary Kildall on the PL/M language and what would become the basis for CP/M.

Intel saw software as a way to sell chips, not as a business on its own. According to Paul Freiberger and Michael Swaine's Fire in the Valley, "when [Kildall] asked Intel executives if they had any objections to his marketing it on his own, they shrugged and told him to go ahead. They weren't going to sell it themselves."

Around this time, Intel was becoming more worried about competitors in the microprocessor business. Rockwell had introduced its PPS-4, a 4-bit processor, in 1972, and Texas Instruments was working on a chip of its own. And, unknown to Intel, Motorola was working on its 6800 8-bit processor, which came out in the middle of 1974, just a few months after the 8080. In Faggin's estimation, the 6800 had the better architecture but used a process technology that made the chip large and slow relative to the 8080.

One question that comes up is why Intel didn't choose to get into the PC business itself.

In an interview I did with Gordon Moore in 1997, he described the Altair as "just a hobby device where the inputs were toggle switches and the outputs were LEDs. You could demonstrate the way a computer worked, but a tough way to do any practical computing."

"I even turned down the idea of a home computer in that time period sometime," Moore said. "One of our engineers came up with the idea that you could build a computer and you could put it in the home, and I kind of asked him what it was good for, and the only application I got back was that the housewife could put her recipes on it. I could imagine my wife sitting there with a computer by the stove...it didn't really look very practical.

"In fact, even when Steve Jobs came over and showed us what was going on at Apple, you know I viewed it as just...one more of the hundreds of applications that existed for microprocessors and didn't appreciate that it was a significant new direction."

Noyce had a similar view, saying "The whole consumer business was an area we just didn't see in the beginning. It just seemed impossible that this phenomenal level of electronic sophistication represented by the microprocessor could ever be reduced enough in cost so that simple consumer requirements could be met."

Not long after the introduction of the 8080, Faggin left Intel to found Zilog, taking Shima with him. Together, they created the Z-80 microprocessor, which was designed to have binary compatibility with the 8080, so it could run the same software. The Z-80 itself would go on to be used in many of the early personal computers in the late 1970s, mostly running CP/M.

Meanwhile, the 8080 would get used in the first of the machines that would really gain the attention of the hobbyists who built the personal computer business, starting with the Altair 8800.

I'm not sure that the 8080 was really "the most important single product of the twentieth century," as Michael Malone calls it. But it was surely a product that changed the world.

Next: The Altair 8800

For more information, see Andy Grove: The Life and Times of an American by Richard S. Tedlow (2006, Portfolio Hardcover), The Birth of the Microprocessor by Federico Faggin, The Chip by T.R. Reid (2001, Random House Trade Paperbacks), "The Evolution of a Revolution," Intel Corporation, Fire in the Valley by Paul Freiberger and Michael Swaine (1984, McGraw-Hill, Inc.), "From the Archives: Gordon Moore," PCMag, A History of Modern Computing by Paul E. Ceruzzi (2003, The MIT Press), Inside Intel by Tim Jackson (1997, Harper Collins), The Intel Trinity by Michael S. Malone (2014, HarperBusiness), The Man Behind the Microchip by Leslie Berlin (2006, Oxford University Press), Microchip by Jeffrey Zygmont (2002, Basic Books), The New Alchemists by Dirk Hanson (1983, The Book Service Ltd), "Oral History Panel on the Development and Promotion of the Intel 8080 Microprocessor," Computer History Museum.

Michael J. Miller's Forward Thinking Blog: forwardthinking.pcmag.com
Michael J. Miller is chief information officer at Ziff Brothers Investments, a private investment firm. From 1991 to 2005, Miller was editor-in-chief of PC Magazine, responsible for the editorial direction, quality and presentation of the world's largest computer publication.