I grew up on the Commodore 128. We got one for Christmas 1985 (an upgrade from a Commodore 64). It was a bit of a quirky machine, but I liked it.

On the retro computing forums, it might be the most controversial thing Commodore ever did. Which says something, seeing as some computer historians have summed up Commodore’s history in four words: Irving Gould’s stock scam. But that’s another story.

The cool thing about Commodore was that its engineers weren’t shy about talking about their projects. Bil Herd, Fred Bowen, and Dave Haynie have all weighed in over the years, talking about what they did and why and what they would have done differently.

CP/M vs. MS-DOS. Perhaps the biggest question is why they didn’t make the 128 IBM compatible, instead going with the past-its-prime CP/M.

Mainly it came down to cost. CP/M was cheap to implement; Bil Herd said it cost $1 to do. Presumably that was the cost of the software. The hardware wasn’t all that expensive either. Commodore had lots of Z-80 CPUs in inventory, presumably left over from the ill-fated CP/M cartridge for the C-64, which didn’t work properly most of the time, and didn’t work well when it was working properly. So it didn’t sell.

The rest of the hardware was free, since CP/M just used the 8502 and its support chips for I/O. CP/M let you get away with that kind of trickery. The result wasn’t as fast as a Kaypro, but it was faster than a 64 running CP/M.

The Z-80 wasn’t an expensive CPU anyway, and besides giving CP/M capability, it fixed other problems. Some C-64 cartridges didn’t behave well, and they caused the C-128 prototypes to crash. Bil Herd solved that problem by letting the Z-80 do the system initialization. Since the Z-80 starts executing at address $0000, while 6502-series CPUs (the 8502 was a souped-up 6502) fetch their reset vector from $FFFC, it could work around the weirdness. It worked.
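The key detail is the difference in reset behavior between the two CPU families: the Z-80 simply begins executing at address $0000, while a 6502-family chip reads a 16-bit little-endian reset vector from $FFFC/$FFFD and jumps there. A minimal Python sketch of the difference (the memory model here is invented purely for illustration):

```python
# Sketch of how the two CPU families pick their first instruction after
# a reset. Addresses are the real ones; everything else is illustrative.

def z80_reset_pc(memory: bytes) -> int:
    # The Z-80 hard-wires its program counter to 0x0000 on reset.
    return 0x0000

def m6502_reset_pc(memory: bytes) -> int:
    # 6502-family CPUs (including the 8502) fetch a 16-bit little-endian
    # reset vector from $FFFC/$FFFD and start executing at that address.
    return memory[0xFFFC] | (memory[0xFFFD] << 8)

# 64K of RAM with a 6502-style reset vector pointing at $FCE2,
# the reset entry point in the C-64 KERNAL ROM.
ram = bytearray(0x10000)
ram[0xFFFC] = 0xE2
ram[0xFFFD] = 0xFC

print(hex(z80_reset_pc(ram)))    # 0x0
print(hex(m6502_reset_pc(ram)))  # 0xfce2
```

A misbehaving cartridge could interfere with what the 8502 saw at $FFFC/$FFFD, but it had no say over the Z-80’s fixed starting point, which is why handing initialization to the Z-80 sidestepped the crashes.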

As for IBM compatibility, Commodore knew how to do things cheaply, but it couldn’t do that for the price of a Z-80 and a CP/M license. I don’t remember what the Amiga Bridgeboard, which added an 8088 and MS-DOS to an Amiga 2000, cost when it first came out, but it cost $300 when I bought one in 1991 or so. The 128 retailed for $349 when it came out in 1985. It couldn’t have sold in 1985 for $349 with any kind of MS-DOS capability on board.

It was too expensive. It’s interesting that people complain about its original retail price on one hand, and want IBM compatibility on the other. The 128 was more expensive than the 64, but it was intended to compete with the Apple IIc and IBM PCjr. And it did end up priced about halfway between those machines and the C-64.

Herd actually designed the surplus chip inventory into the machine. Short-term it helped costs. Maybe long term it didn’t, but it cleared out the warehouses, made the company $200 million in its first year, and it was only designed with a shelf life of 14 months. Per Herd, its reason for existing was to give them something to show and sell until the Amiga was ready.

Since the machine lasted four years on the market, well past its intended 14 months, and brought in a total of about a billion dollars in revenue, that’s pretty good.

Maybe they should have been more ambitious. By 1985 the 64 had already lasted longer than it was supposed to. Even the VIC-20 lasted longer than expected. But in 1985, everyone expected the 68000-based machines to be the future. Even Microsoft. The enduring popularity of the 6502 generation caught everyone by surprise. Nobody expected in 1985 to still be selling 6502-based machines in 1990.

Herd admitted in the links above that he didn’t expect to be able to anticipate what users would and wouldn’t use. So he designed what he could in five months, knowing users would use about 45% of the capability inside, but not knowing which 45%.

Sure, it would have sold more if it could have cost $149, but they still had the 64, which could (and did) meet that price point. And less. The 128, with twice the memory, two video chips, and an MMU to handle the memory, was always going to be more costly than the 64.

64 mode. Some people argue the 128 didn’t need to be 64-compatible. But the knock on the Plus/4 was that it wasn’t 64-compatible, and they weren’t going to make that mistake again. Apple tried something similar with the Apple III, walking away from its earlier machine, and the Apple III flopped about as badly as the Plus/4 did. Backward compatibility sold computers in the 1980s. To a large degree it still does now. Lack of 64 compatibility was one of the things that doomed the Commodore 256.

It would have been nice if they could have gone about it differently, building an enhanced 64, without doing the dual-mode thing. Add more video modes to the VIC-II, stuff like that. But that wasn’t an option. The engineers who designed the VIC-II and the SID were gone soon after the 64 hit the market. Perhaps someone else could have gone in and revised them–someone else did design a VIC-III for the never-released C-65–but they wouldn’t have been able to go from concept to finished machine in 5 months if they had.

The VDC. Herd never did have anything nice to say about the 8563 VDC chip, used for 80-column text mode and RGB graphics. He would have preferred to use a Motorola 6845 (or, presumably, Commodore’s own very similar MOS 6545) for that capability. Salvaging the VDC from the project it was designed for probably was appealing, but with the problems they had getting the 8563s to work right, going with the 6545 or even buying 6845s probably would have ended up being cheaper.

The VDC ended up having a lot of nice capabilities, but a lot of those didn’t come to light until the machine was near the end of its life.

The 6845 was more of a known quantity, having been used in other computers. Even if it had been a worse chip than the 8563, people knowing how to program it right away would have meant software that utilized it would have appeared more quickly.

Failure. It’s hard to call the 128 a failure, though people sometimes do anyway. It didn’t sell in the numbers the 64 did, but nothing else did either. There’s a lot of dispute over how many 128s actually sold. Info magazine reported Commodore had shipped 2 million units by 3Q 1988. Some estimates say 4 million total by the time the machine was discontinued in 1989, though some people dispute that estimate. But Apple sold 5-6 million Apple IIs from 1977 to 1993; Commodore sold roughly a third that many 128s in three years.

I don’t think anyone calls the VIC-20 a failure, but it sold 2.5 million units. The 128’s sales were comparable to the VIC-20, in spite of it costing three times as much as the VIC. Interestingly, both of those machines were designed with getting rid of excess chip inventory in mind.

The Amiga. And of course there’s the frequent complaint that Commodore didn’t care about the 128, they only cared about selling Amigas. I voiced that complaint in the 1980s just as much as anyone else did.

Commodore and its competitors Apple and Atari continued to sell 8-bit computers much longer than they expected. But they knew that generation would eventually wind down. None of them knew when. Apple and Commodore tried to predict it; I don’t think Atari cared. Apple and Commodore both got it wrong anyway.

And you don’t stay in business by standing still. Commodore bet on the Amiga being the next big thing and it wasn’t. But no 6502-based computer was going to keep PCs from becoming the next big thing.

Commodore mishandled the Amiga about as badly as they mishandled the 128. They sold about 3 million Amigas from 1985 to the bitter end in 1994. They couldn’t compete with IBM-compatible PCs on price, and eventually those PCs caught up in capability.

Intel just outmuscled Commodore and beat it at its own game, eventually building a Commodore-like vertical integration where it could make all the chips that went on its motherboards. The difference was that Intel sold those motherboards to all takers, rather than selling complete computers like Commodore did. Commodore didn’t modernize its manufacturing process as quickly as Intel did, so Intel got 25-50% more chips per wafer than Commodore was getting. And eventually, when it came to the last-generation Amigas, Commodore couldn’t make all of the custom chips anymore and had to turn to outside companies to make them. So much for vertical integration, and so much for price.

The difference between Intel and Commodore was that Intel’s leaders were engineers. Gordon Moore and Andy Grove came up the ranks as chip designers. So when their engineers floated proposals, they were capable of understanding them. Commodore’s leaders were financiers who stumbled onto the computer business. They didn’t really understand it, and they never tried to learn.

It took 17 years for Intel to beat Commodore at its own game. But in 1994, Commodore was facing the bankruptcy judge while Intel’s legal problems largely centered on being too successful.

Handling the 128 better and handling the Amiga better wouldn’t have turned Commodore into Intel. They needed an Andy Grove, but their environment never allowed any of their engineers to become that. And their management ran off the only guys they ever had who could have become a Steve Jobs, slamming the door on that route to sustained success too.

Interesting. I missed out on it because it was a few years ahead of my time. But interesting nonetheless, and it brings to mind some of the parallels facing us today with ARM and AMD and Intel’s purportedly better fabrication process. I don’t know enough about hardware engineering or the market to judge process nodes. But regarding ARM, I think the current obsession with all things ARM replacing the x86 ISA is definitely off-track. Our industry is fad-driven.

It’s almost the same story, where Intel is usually a generation ahead in process technology. AMD and IBM develop technology jointly in order to keep pace. But at least they’re trying. Commodore didn’t try, and that’s why they went from tech darling to bankruptcy in 9 short years. But when Intel can get 10, 15 percent more chips per silicon wafer AND can charge the higher price, it’s a double whammy. That’s why AMD never seems to gain ground.

As for ARM, it makes sense for some things. ARM won’t displace x86 for everything, but when power requirement is paramount, x86 isn’t going to keep pace with ARM. ARM’s rightful place remains to be seen of course, but it’s useful for more than phones. And I don’t think ARM in the datacenter is a bad idea at all, for purposes like domain controllers, DNS, and even file servers. I worked in a datacenter that had lots of empty floor space, but they couldn’t fill it because the power was at capacity.

I think the reason ARM gets so much hype right now is because nobody knew what it was, then all these ARM-based smartphones appeared out of nowhere, and there’s so much potential with them. Meanwhile PCs are mature technology and there’s not much new going on with them. Smartphones have people excited about technology again, so ARM gets attention.

And we’ve always had lots of architectures; using x86 for desktop PCs, servers, and Macintoshes is a very new phenomenon. Used to be there was x86 and Motorola 68000, a couple of different 8-bit CPUs, and a bunch of 32-bit RISC CPUs. One by one they all fell to x86. Now history is just repeating itself as we realize that x86 isn’t quite the one size fits all solution Intel wanted it to be.

Looking back, I also remember the time as personal computing’s wild west, and shortly thereafter phrases such as ‘nobody ever got fired for buying IBM’ safely lifted the decision-making process out of a sea of choices. For the record, I once considered a Northstar Advantage, but it was a VIC-20 and a tape cassette found under the Christmas tree, replaced six months later by a Commie-64 with a floppy drive.

The Commie was ridden hard beyond its sell-by date, with much strain to fingers and eyeballs, by the time the 128 came out. It was a slick package around an improved keyboard and the promise of 80-column text, but at work we were hacking TRS-80 Model IIIs, and it just didn’t seem to offer an improvement over that. Besides, the XT clones were coming. Eight-megahertz machines with 640K of memory, 20-meg hard drives, and Hercules graphics!

Yep, when cheap XT clones made in Japan and Korea hit the market, it changed everything. Once you could get a decent XT clone for under $1,000, it was the beginning of the end for everything else. It took time of course, but in the mid 1980s people started buying Tandy 1000s and Leading Edge Model Ds so they could take work home, then game companies started making games for PCs, which led to better peripherals and better capabilities, which led to better software, and the momentum kept building.

I’m one of those bitter Amiga owners though. It was sometime around Windows 2000 that a PC stopped feeling limiting to me, though the Amiga was much more efficient with its memory and CPU power. Amigas had full pre-emptive multitasking, good graphics and sound capability, and plug and play that just worked. In 1985. Running on an 8 MHz CPU with as little as 256K of RAM, though it was much happier with 512K or a meg. A 16 MHz 386 with a meg of RAM sure couldn’t keep up.