ALBUQUERQUE, N.M.--It's entirely possible that one of the worst things that ever happened to this city was the success of the first commercially viable PC, the Altair 8800.

Some might call that conclusion heresy, but here's my twisted theory: Albuquerque-based Micro Instrumentation and Telemetry Systems (MITS) created the Altair 8800 in 1975. This in turn led to Bill Gates and Paul Allen writing the Altair version of Basic, founding Microsoft in the process, and moving here to set up shop close to the Altair's manufacturer.

Sounds good, right? But here's the rub: the Altair was a hit, probably more so than MITS could handle. As production problems and other technical issues arose, MITS began losing money. It got bought up by an out-of-town corporation. And because Gates and Allen no longer had anything tying their now million-dollar-a-year business to Albuquerque, they packed their bags and took Microsoft back to where they'd grown up, the Seattle area.

The result? Redmond, not Albuquerque, is now home base for most of Microsoft's 65,000 employees. And what city wouldn't want an employer that creates that kind of tax base? Instead, Albuquerque is left with the notoriety of being only where Microsoft began.

These are some of my conclusions after stopping here during Road Trip 2007, my tour around the Southwest. I'm visiting "Startup: Albuquerque and the Personal Computer Revolution," an exhibit conceived of and largely funded by Allen that's now housed at the New Mexico Museum of Natural History. It's probably not the message Allen and the exhibit's curators wanted to convey, but there you go.

The real point of "Startup," if you can see past my perverted logic, and as its name implies, is that the personal computer revolution did in fact begin in Albuquerque--something that might surprise many people unfamiliar with the PC's history.

The exhibit, most likely the only one in the world specifically devoted to the history of the microcomputer, makes its point elegantly by first laying out the technology that led to the PC revolution, and then explicitly spelling out Albuquerque's role.

The exhibit starts by quickly taking us back to the earlier 20th century. One of the very first artifacts is a little book, Songs of the IBM, a 1931 volume filled with the fellowship songs Big Blue employees would chant at company meetings.

A little farther down is one of the exhibit's masterpieces: a beautiful Univac-1 console, part of the machine that in 1953 became perhaps the first commercial computer. Of course, as the exhibit points out, a contemporary little handheld computer has 45,000 times as much memory and works 450,000 times faster than a Univac-1, but who's counting?

Visitors are then presented with this factoid: in 1953, a computer cost $1 million to buy, or $35,000 a month to rent. You'd also need enough electricity for a small town, enough air conditioning for a three-bedroom house, the ability to speak machine language, and a staff of seven to operate it.

Next, we enter the '60s, and we're told that the computer revolution wouldn't have happened were it not for the military and space programs requiring compact computer processing power.

Luckily they did, and one of the first results was Spacewar!, what may be the world's first video game. It was a 1961 project of several MIT students, who took their oh-so-powerful DEC PDP-1 computer and used it to make a two-player game in which each player controls a spaceship and tries to shoot the other while maneuvering around the gravity well of a star.
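The gravity-well mechanic is simple enough to sketch. The following is an illustrative modern reconstruction, not the original PDP-1 code: the star sits at the origin and pulls each ship with an inverse-square force, which is what makes maneuvering in Spacewar! tricky.

```python
import math

def gravity_step(x, y, vx, vy, dt=0.01, g=1.0):
    """Advance a ship one time step under the star's inverse-square pull.

    The star is fixed at the origin; acceleration points from the ship
    toward the star, with magnitude g / r^2. (Names and constants here
    are illustrative, not taken from the original game.)
    """
    r2 = x * x + y * y
    r = math.sqrt(r2)
    ax = -g * x / (r2 * r)  # -g/r^2 times the unit vector toward the ship
    ay = -g * y / (r2 * r)
    vx += ax * dt
    vy += ay * dt
    return x + vx * dt, y + vy * dt, vx, vy
```

Run in a loop with a ship's initial sideways velocity, a step like this produces the orbits players had to fight against while lining up a shot.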

From there we move swiftly to the fact that Dartmouth College math professors John Kemeny and Thomas Kurtz invented Basic in 1964, and to a brief introduction to the evolution of computer processing power: from vacuum tubes to the transistor, invented in 1947, and then on to the integrated circuit, which Jack Kilby and Robert Noyce independently invented within months of each other, in 1958 and 1959.

There's also a brief discussion of the founding of Intel, which invented the microprocessor, even though it didn't intend to.

According to the exhibit, a client asked Gordon Moore and Noyce's young company to build a complex calculator that would have required a dozen chips. But Intel didn't have the manpower to do the job, so it came up with another idea: What if they could put all the computing functions on a single chip? Thus, computer history was made.

And many more. It was an exciting period that today's youth wouldn't appreciate as much as we did. Hand coding in machine code was the rule of the day, and if you had 4KB of static RAM in your system, you were on top. Add a surplus Teletype ASR-33 with paper tape punch and reader, and you were in nirvana. Today more power exists in a handheld calculator or most any other appliance. We have come a long way since then, but the excitement of building your own machine from components is long gone, except for the microcontroller builders out there, and even there most of the peripherals we used to have to bus together are incorporated into a single chip. Many of us "older" geeks appreciate the success of the likes of Bill Gates, Steve Jobs, and the "Woz," and thank them for their contributions, not to mention Intel and its 4004 chip, which started it all.

Why don't you give credit for creating BASIC where it's due? Not to Bill Gates, but to some long-forgotten professors at Dartmouth, who couldn't be bothered with copyrighting it. Gates has never invented squat; his only talents are buying the creations of others at bargain prices and hiring excellent lawyers.
