How Apple’s PA Semi Acquisition Fits Into Its Chip History

April 28th, 2008

Daniel Eran Dilger

Apple’s acquisition of PA Semi does not signal an entirely new direction for the company. Throughout its history, Apple has designed sophisticated custom chips for use in its computers, in addition to codeveloping complete microprocessors. According to those in the know, after jettisoning its internal custom silicon efforts and delegating much of that work to Intel, Apple experienced some remorse and acquired PA Semi to get right back into the chip design business. Here’s a look at Apple’s history in chips, leading up to how the PA Semi acquisition relates to the beleaguered future of Microsoft’s Windows.

A Brief History of PC Chip Fabs in the Early 70s.
Intel and Texas Instruments independently developed the microprocessor almost simultaneously. In 1971 Intel released the 4004, what it calls the first commercial microprocessor, while TI introduced the first single-chip microcomputer the same year. The first obvious consumer applications were in calculators (TI’s specialty) and data terminals (the business Intel pursued).

At that time, the chip design process was integrated as part of chip manufacturing. Chip designers had to be large enough to build their own chip manufacturing plants, which created a significant barrier to entry in the business of custom designed chips. The expense of developing a new chip design and the limited demand for custom silicon in an era before ubiquitous personal computing and consumer electronics also helped to limit the playing field to a few major fabs.

MOS Technology got started manufacturing TI designed chips for use in calculators, and became a principal chip vendor for Commodore Business Machines. Shortly after the release of Intel’s 8-bit 8080 in 1974, Motorola introduced its own alternative, the 6800. Both chips cost around $150 to $300 each. This price point prevented the widespread adoption of microprocessors, as their functions could also be performed with individual components for less.

Rapid Innovation in Chip Development in the Late 70s.
Key chip developers at Motorola, including Chuck Peddle, wanted to develop a much cheaper version of the 6800 that could find wider adoption. After Motorola expressed no interest in the strategy, Peddle left with Bill Mensch and a half dozen other designers to build their own processor. They needed a chip manufacturer, so they went to work for MOS, where they developed the 6501, which was similar to the Motorola 6800 but much cheaper, faster, smaller, and more efficient to manufacture.

Motorola sued MOS, resulting in the release of the similarly cheap, but less infringing 6502. Commodore purchased MOS in 1976 after fearing that TI would destroy its calculator business if it did not “vertically integrate” by buying up the means to develop its own components. Owning its own chip fab enabled Commodore to rapidly develop the custom processors used in its computers much faster and cheaper than competitors like Apple and Atari, who had to develop their custom chip designs using outside partners.

Commodore originally hoped to sell the cheap 6502 microprocessors to manufacturers such as Ford, but the real demand would come from personal computers. Steve Wozniak designed his original Apple I computer using the $25 MOS 6502 processor. Many other home computers of the day also used the same chip family, including 8-bit systems from Acorn, Atari, and Commodore, the Apple II line, and the NES game console.

After Commodore purchased MOS, it signaled its intention to enter the gaming market. Mensch, one of the 6502’s lead engineers, left to set up the Western Design Center, where he licensed MOS’ 6502 to manufacture the improved 65C02 used by Apple in its later Apple II systems, and the 16-bit 65C816, used by the 1986 Apple IIGS, the Super Nintendo, and various embedded devices from automotive controllers to pacemakers, now numbering above five billion chips sold.

Meanwhile, Federico Faggin, a key developer of Intel’s 8080, left to start Zilog, which introduced the binary compatible, simpler, and cheaper Z80 processor. This allowed Z80 systems to run the same CP/M operating system developed for the Intel 8080 by Gary Kildall of Digital Research. Zilog also widely licensed its processor design to other chip manufacturers, creating a software standardization platform behind CP/M that made Z80 coprocessor cards popular even among home computers based on the 6502.

IBM Kills Innovation with the PC.
A wide variety of 6502 and Z80 based systems competed for attention throughout the late 70s as the home computer market exploded. In 1981, however, IBM entered the microcomputer market and changed everything.

IBM had been working on an advanced RISC design called the IBM 801, a predecessor to the IBM POWER architecture. However, IBM decided to instead select the relatively old hat Intel 8088 as the basis of its 1981 PC to prevent its new desktop system from eating into its higher end computing products. IBM also chose to license DOS from the then mostly unknown Microsoft; the software was a direct, unauthorized clone of Digital Research’s CP/M.

This resulted in putting IBM’s tremendous marketing power behind one of Intel’s least exceptional processors and some unremarkable copycat software. IBM quickly lost control over the DOS PC industry it invented, unintentionally ceding control to Microsoft and a variety of PC cloners, including Compaq, a company founded by TI employees.

IBM’s PC rapidly killed CP/M and Z80 computers, which had been the closest thing to a lingua franca for personal computers prior to the DOS PC. It eventually also devastated the market for most other alternative desktop platforms. Acorn, Atari, and Commodore increasingly struggled to compete against the PC throughout the 80s. By 1995, Apple was the only non-x86 personal computer maker with any real business left, and even it was looking less than viable.

Macintosh Reintroduces Innovation.
While IBM’s original PC simply perpetuated more of the same, Apple had begun work years earlier to dramatically advance the state of the art past the simple 8-bit text based systems of the late 70s, in parallel with its sales of the wildly profitable Apple II. The company invested $60 million into the development of an entirely graphical desktop computing system called the Lisa, along with a consumer version that became the Macintosh.

Xerox, whose Palo Alto Research Center pioneered graphical computing, invested in Apple to help bring those advanced technologies into the mainstream. In order to power such a system, Apple needed a far more advanced processor than the MOS 6502 family used in the Apple II line or the Intel x86 family used by IBM’s PC. Apple chose Motorola’s new 68000, an elegantly designed, forward looking, hybrid 16/32-bit processor.

Motorola’s 680x0 family powered the advanced graphics capabilities of the 1984 Macintosh and was also used by the Atari ST and Commodore’s Amiga, as well as the futuristic hardware developed by NeXT Computer in the late 80s and early 90s and workstations from HP, Sun, and Apollo. Intel had developed its own advanced processors, but was forced by the economies of scale related to the DOS PC clone market to build upon the weak x86 foundation of the processor selected by IBM.

Microsoft and x86 Slaughter Better Technologies.
Motorola’s elegant 680x0 in the Mac was pitted against Intel’s clumsy and complicated but widely used x86 processor line in a battle of intense competition throughout the 80s. The size of the DOS PC market made it increasingly difficult for any better processor architecture to compete.

Microsoft leveraged its position as an Apple partner to appropriate the Mac’s graphical user interface for use on the x86 PC; starting with Windows 3.0 in 1990, and particularly with Windows 95, Microsoft pushed its DOS user base to Windows, further entrenching the marginal x86 processor architecture.

The weak technology but strong marketing behind x86 systems running Microsoft’s software was also increasingly a problem for Sun’s SPARC, SGI’s MIPS, DEC’s Alpha, IBM’s POWER, and HP’s PA-RISC architectures. All of those architectures were superior to Intel’s x86 line, but the market for those alternatives was consistently eroded throughout the 90s by the high volume sales of the x86 PC.

In the mid 90s, Microsoft even made efforts to develop cross platform versions of Windows NT for Alpha, MIPS, and PowerPC in addition to x86 PCs, in order to get its software working on more substantial and sophisticated hardware. However, Microsoft was never able to deliver a workable cross platform architecture for NT 4.0; it could not even match the processor portability NeXT had perfected years earlier. Microsoft’s next major version, NT 5 (Windows 2000), withdrew support for any hardware other than x86 PCs.

At the same time, Microsoft also worked with Intel to replace the x86 architecture with a new 64-bit processor architecture called Itanium, using a design that originated at HP. Intel and HP partnered with SGI and Compaq in a spectacular boondoggle effort that directly sacrificed three of the five leading processor architectures to create what would ultimately end up being the fourth place loser behind x86, PowerPC and SPARC. More than anything else, Itanium helped promote the hegemony of the x86 by eliminating superior competition.

Apple’s Invisible Chip Business.
Along with Sun and IBM, Apple remained among the minority of computing companies that didn’t blindly follow the Microsoft/Intel juggernaut of marginal PC technology. As Motorola’s 680x0 began to run out of steam in the late 80s after a decade-long run, Apple decided to jump to a new processor architecture in order to keep pace with the fierce investment Intel could afford to pour into its x86 line.

As a vendor of both hardware and software, Apple was commonly compared to x86 PC clone manufacturers such as Compaq, or against Microsoft as a software platform vendor. However, Apple really went far beyond the PC vendors or Microsoft to develop highly integrated custom technology.

While PC vendors simply slapped commodity parts together and sold them with a license to Microsoft’s copycat DOS or Windows, Apple developed groundbreaking, high performance technology that made its desktop computers closer to the workstations sold by higher end vendors such as Sun and SGI. That required sophisticated custom chips.

To develop these, Apple began working with VLSI Technology, a chip maker cofounded by Doug Fairbairn of Xerox PARC. VLSI specialized in ASICs, or application specific integrated circuits. Apple developed a variety of ASICs to reduce part costs and to build components that did not yet exist on the market. As chip development grew increasingly specialized and affordable, it began to make more sense for hardware makers to develop their own custom designs rather than relying only on mass produced, general purpose components. For example:

Much of the Apple II had been designed by Wozniak using a brilliant combination of mostly off the shelf parts. In developing the Apple III, Wendell Sander created a custom LSI (large scale integration, referring to tens of thousands of transistors) chip that bundled the Apple II’s various disk controller components into a single chip, which he called the “Integrated Woz Machine.” This chip was later used in the Macintosh.

Burrell Smith, working on the prototype Macintosh, developed a design to pack similar components together in what he called the “Integrated Burrell Machine,” as a play on the name of Apple’s top hardware competitor at the time. The design was not used in production.

In the mid 80s, Apple engineers Dan Hillman and Jay Rickard integrated the entire Apple II logic board into a single VLSI (very large scale integration, referring to hundreds of thousands of transistors) chip called the Mega II, which was used in the Apple IIGS as well as the Apple IIe compatibility card for the Mac LC.

Throughout the 80s and 90s, Apple designed its own custom audio and graphics chips, SCSI controllers, and other components. In the late 80s it even developed FireWire as a high speed, isochronous serial replacement for the existing parallel SCSI bus.

Apple’s highly integrated Mac hardware had little similarity to the basic PC designs that commonly lacked any built in support for audio, networking, SCSI, or even decent graphics throughout the 80s and into the early 90s. However, Microsoft’s increasing market power kept the archaic x86 architecture used by DOS PCs firmly in place.

Apple Assumes RISC with ARM and POWER.
Because Apple was working on projects that few other companies were investing in, including the handheld Newton, it made sense for Apple to investigate developing its own microprocessors. After meeting with Acorn Computer in the late 80s, Apple, Acorn, and VLSI teamed up to develop a joint ARM architecture for low power processors suitable for use in the Newton and Acorn’s desktops.

Inspired by a project at Berkeley to develop a RISC processor, Acorn decided that if students could design a new chip architecture, it could too. Apple got involved after finding that Acorn’s design had most of what it wanted for the Newton’s processor. ARM needed the investment Apple could provide in order to develop additional requirements such as an integrated memory management unit. Along with VLSI as their fab partner, the two companies spun off ARM as an independent joint venture, which later licensed ARM processor designs to other companies.

Berkeley’s RISC processor itself served as the basis for Sun’s SPARC architecture. Stanford University’s parallel MIPS project was also spun out into an independent commercial company, later acquired by SGI in 1992, which, similar to Commodore before it, wanted to make sure it had continued access to the processors used in its systems. The success of both projects inspired additional investment in RISC design, including an awakening of IBM’s dormant 801 project (resurrected as POWER), HP’s PA-RISC, and DEC’s Alpha processor.

In 1986, Apple had set out on the Aquarius project to develop its own RISC processor as a successor to the 680x0 Macintosh architecture, out of concern that Motorola wouldn’t be able to ship successive generations of the 680x0 on time. To do so, it hired a team of 50 engineers and purchased a Cray supercomputer. Those efforts never materialized into a shipping design.

After Aquarius, Apple started work with Motorola to use its 88100 RISC processor in new Macs under the name Jaguar. IBM, which had just been jilted by Microsoft in the OS/2 fiasco, approached Apple with plans to collaborate on both a next generation operating system software project called Taligent and the use of IBM’s POWER server processor architecture in Apple’s desktop systems. Because of the work Apple had already invested in Motorola’s 88100, the three companies worked together to develop a hybrid chip called the 601, which used Motorola’s bus interface with IBM’s RISC core, resulting in the first member of the PowerPC family.

Apple’s ARM partnership resulted in the world’s most popular mobile processor family, and its AIM PowerPC partnership produced one of the most popular server and embedded processors. ARM was sheltered from competition with x86 due to the lack of suitability of either x86 or Microsoft’s software in the mobile space, but apart from the Mac, PowerPC was largely forced to find markets outside of the PC desktop it originally aimed at. Efforts to deliver Windows NT, OS/2, NeXTSTEP, and other operating systems to PowerPC were all derailed largely because of the market power behind the x86 PC.

Apple worked so closely with VLSI that it had its own dedicated division at the chip fab. In a press release from the mid 90s, Umesh Padval, vice president and general manager of VLSI’s Apple Products Division announced, “Integrating customized and standard functions on a single chip results in a number of distinct advantages for the customer. These include enhanced performance combined with decreases in the power, size, weight and price of the end-product. Our proprietary FSB cells have enabled VLSI to address Apple’s silicon needs quickly, thereby contributing to their innovative Power Macintosh 5200/75 LC family.”

As an increasingly beleaguered Apple worked to simplify its hardware operations in the late 90s to stem the flow of blood, it canceled the Newton, shed its ARM holdings, and worked to use more common, industry standard parts whenever possible to save money. It rapidly replaced ADB with USB, dropped its custom monitor adapters for the emerging DVI standard, and began using cheaper ATA drive controllers over custom designed SCSI ones.

Apple continued to recruit chip designers to develop custom silicon however. A 2004 Apple job posting for a Senior VLSI CAD Engineer said, “an ideal candidate will have extensive knowledge of the design flow required to build complex ASIC designs. This candidate is expected to define and implement process improvements for the Apple IC design and verification teams as well as set technical direction for CAD projects and infrastructure.”

Apple Delegates ASIC Development to Intel.
By 2005, the future of PowerPC processors looked increasingly like that of Motorola’s 680x0 a decade prior. Intel, motivated by tough competition from AMD, had turned the sow’s ear of the x86 architecture into a silk purse with its new Core processors. Apple hoped to be able to delegate nearly all of its custom chip development work to Intel and benefit from the same economies of scale that had enabled it to outsource its specialized graphics processor efforts to companies such as Nvidia and ATI.

John C Randolph writes, “Apple had a very good in-house VLSI design group that used to develop the ASICs for Apple’s PowerPC motherboards. With the Intel switch, Apple handed that responsibility over to Intel, and rather short-sightedly let most of those engineers go.”

“That really bit them on the ass when they were developing the iPhone, because not having their own chip design experts in-house made for very poor communication with Samsung, which is why the H1 processor in the iPhone wasn’t quite what they wanted, although it was exactly what they’d asked for; in other words, mostly Apple’s fault, not Samsung’s.”
The Future of Apple’s Chip Plans.
“Now, the most significant data point to me about that acquisition,” Randolph added, “is that PA Semi was founded by the man who ran the DEC Alpha project. Apple can afford to develop a whole new CPU architecture, and if Steve Jobs decides it’s worth the risk, then the results could be fantastic. Imagine a processor designed specifically to work well with the new generation of compilers from the LLVM project. Not to mention, it would render cloning just about impossible.”

“PA Semi isn’t a small acquisition, however much Apple’s trying to downplay it. The last company they spent hundreds of millions for was NeXT. At $275M, I don’t believe this is just about better parts for the phone. I think Steve’s got something bigger in mind, although we probably won’t see the results for three years or more.”

Apple appears to have no real use for the PWRficient processor line PA Semi has been developing, although the chip has already seen significant interest from a variety of companies, particularly in aerospace and defense. It is therefore somewhat ironic that Apple acquired a chip maker for its design talent at a time when so many chip makers have processor designs nobody seems to be interested in.

The next segment compares the contenders for the future of microprocessors and points out why the tables are turning for the historical market leverage Microsoft has enjoyed with the x86 architecture in the PC world.

I really like to hear from readers. Comment in the Forum or email me with your ideas.


Nice historical overview by the way. Randolph seems to have an excellent view of what’s been going on.

Biggest thing since NeXT … now there’s a monumental thought.

http://tech.shirl.com bshirley

I don’t think you meant to overstate this, because two sections later you mention the non-personal computer architectures.

“By 1995, Apple was the only non x86 [personal] computer maker with any real business left…”

tundraboy

Yes, Jobs is again skating to where the puck is going to be. Which I think he has decided will be mobile devices with capabilities we can’t even dream of right now. I’m just going to sit back and enjoy the show (and revel in my AAPL holdings as they go through the roof).

SequimRealEstate

Who is John C. Randolph? Could not find out through Google, Answers, Yahoo and reread the article twice.

hi.wreck

Umm.. The MIPS processor was developed at Stanford, across the bay from Berkeley, and not MIT, which is across the country. From Wikipedia: “In 1981, a team led by John L. Hennessy at Stanford University started work on what would become the first MIPS processor.”

gus2000

Remember, Apple is no longer “Apple Computer”. They resell music, build music players, sell phones, and have invaded the living room TV. Jobs is always working on integration, making the experience seamless, which is where I see this going.

A $1 bluetooth chip that is Bonjour-aware would go a long way toward making electronics truly “smart”, particularly if there was some kind of open-sourced control interface and data structure (and maybe an integrated web browser too). Call it “iWorld-compatible” and shame all the manufacturers of electronic components to include it.

I can’t wait to see what Apple comes up with. Where should I start queuing up?

http://johnsessays.blogspot.com John Muir

@ gus2000

Daniel already wrote about that … and you’re right, this PA Semi move actually fits right in with the idea.

Perhaps one reason Apple is not too excited about the appearance of Hackintoshes or Mac Clones is that they’re planning on using cheap, low-power chips from their new acquisition to instantiate Core Video, Core Audio, Core Animation, etc., in hardware, thus giving a real Mac performance that would completely blow any wannabes away. This would be a return to the historical situation of Macs being stuffed full of proprietary technology, but I think that’s the direction Apple really prefers. They can continue buying the expensive components essentially off-the-shelf, while maintaining a closed hardware architecture: except this time it will give them a stupefying performance advantage, instead of crippling overhead like before.

Of course, any manufacturer would be free to try the same thing with Windows’ APIs–they’ve been talking about it for 18 years. It’s just that Windows is such a moving target, it’s impossible to see it actually happening.

http://www.thecarbonlesspaper.com johnnyapple

When I first read about this I suspected it was iPhone/iPod touch related. I also wondered if Apple might develop its own graphics chip for portables or low end Macs to replace the so-so Intel GMA series. I also wonder if Apple might develop some kind of co-processor specific to OS X acceleration which could lock out attempts at cloning while maintaining hardware compatibility with Windows.

“In the mid 90s, Microsoft even made efforts to develop cross platform versions of Windows NT for Alpha, MIPS, PowerPC in addition to x86 PCs, in order to get its software working on more substantial and sophisticated hardware.”

I wasn’t there, but my understanding is that NT was originally targeted for i860, not x86.

http://www.cyclelogicpress.com Partners in Grime

Definitely would put the brakes on the cloners.

Berend Schotanus

What I am wondering about more and more: How is it possible that Intel, apparently overnight, changed its obsolete x86 chip line into the much admired Core series? Even with big sales and economies of scale, they must have done something terribly right in order to extend their success. Most of computer history is about small companies starting with a good idea, growing, then failing to keep up with ongoing evolution, giving the lead to new small companies. It would make perfect sense if Intel were going down with its aging x86 line just as Microsoft is going down with its aging Windows line. Why is this not the case?

http://johnsessays.blogspot.com John Muir

@ Berend

One word: AMD.

Microsoft engineered themselves an essential monopoly and have fought to maintain it, letting their actual wares stagnate and become self-serving and inward looking as these are the natural qualities their priorities promote.

Intel meanwhile are engineers who take pride in their work. More to the point: they never had a monopoly grip like Microsoft. No one can clone Windows and get away with it; but AMD (and others over the years) are perfectly free to make functionally equivalent processors for the PC.

I think even Intel became frustrated with the x86 long ago. Look at how many times they’ve launched alternatives on tangents. Those have all failed to date, except for the one in the majority of shipping systems now: the Core. It may be an x86 microprocessor, but it actually borrows a lot from RISC designs like the PPC and uses microcode (if I have the right name) to actually get the internal work done: “translating” x86 into something more compatible with modern design.

AMD are a vital reason behind the Core’s existence because their Athlon series – the Opteron and Athlon 64 in particular – came from behind and clobbered Intel’s troubled Pentium 4 some years ago and gave the company a challenge. Intel actually threw the P4 to the sideline and returned to the P3 designs their team in Israel had been perfecting for laptops, using these as the core of the Core. That was a giant move for a company the likes of Intel. As Apple fans we’re used to agile twists and turns, but it was quite a shock to the rest of the industry and has really slammed AMD against the ropes.

Incidentally: AMD are in such a poor position again now it’s possible that Intel may respond to the decreased pressure. Heads had to roll (and internal politics had to foment) to get from the P4 to the Core. They only really did it because their rival had them by the throat. Now that they’re enjoying the rewards of their hard decisions several years ago, they may naturally be inclined to make easier choices today which will impact on the future.

Mind, if there’s one hardware maker who are best positioned to ably side-step any such issue, I think we can guess who it is.

Great explanation! I wasn’t so much aware of the competition between Intel and AMD. Your story makes perfect sense.

kovacm

xexexe… Jack Tramiel (Commodore) even gave some MOS 6502 chips for free to Wozniak and Jobs (they were college boys) to “play” with them :)))

airmanchairman

@kovacm: That name sure is a blast from the past. Atari ST lovers will always remember the venerable TOS (Tramiel Operating System) which, like GEM, AmigaOS, DR-DOS and, amazingly, OS/2, totally misjudged the software threat that Microsoft posed to their futures with the PC/OEM juggernaut and sadly, sank without a trace…

dentaldoc

As a non-professional in the computer world I will bury my face if my comment is “silly”, but it seems to me that if Apple were to start introducing several custom hardware implementations to accelerate Core Audio, Core Video, Core Animation, etc., wouldn’t it tend to make the Apple “boxes” less competitive and less attractive to the virtualization market (i.e., Parallels to run Windows)?

It seems that as they move to custom hardware solutions for these functions, they will necessarily devote less talent toward evolving these technologies in software. Won’t it be much harder for companies like Parallels to hook into these custom chips to keep the virtualization experience snappy and transparent? It seemed like the whole reason why virtualization has been so successful is the fact that the current Apple hardware is, not identical to, but very similar to the generic Windows boxes.

tehawesome

@ dentaldoc
My thinking is that hardware which is designed to specifically accelerate OS X Core technologies wouldn’t make any difference to a virtualized alternate OS: it simply wouldn’t have any way to make use of the additional hardware. As long as the implementation doesn’t alter the hardware architecture so fundamentally as to render it unusable by that virtualized OS, all that really happens is that you get a genuine hardware advantage to running OS X on the machine rather than anything else. Which leads me to an additional thought: if this accelerated hardware were to be made available through virtualisation, suddenly a Mac becomes a much more attractive machine no matter what OS you run. Apple is, after all, a hardware company.

Eric

From a cost/margin angle (which Apple is very sharp about), Apple can only get enough economy of scale where it dominates, so the new chips will likely leverage the PMP market it already dominates (as compared to its small share of the PC market, where the benefit of economy of scale belongs to Intel), then scale up to the iPhone class. So: very likely chips for portable/mobile devices.

This is a battle between the “Custom/Need based” and “Open/Mass based” approaches.