A Tale of Three Operating Systems

Back in the Olden Dayes, before there was telephony competition, Ma
Bell (AT&T) was a regulated common carrier,
meaning it was a monopoly under government control to keep them from gouging
the public. The government doesn't do very many things well, and regulation
is among the things it does not do well, but the theory is that without
government regulation, the common carrier monopolies like telephone companies
would make excessive profits at the expense of the public. That part is
true. One of the restrictions imposed on Ma Bell was that they could not
commingle regulated and unregulated activities. They were allowed a fixed
percentage profit after expenses, and if they had ways to increase their
expenses that made an unregulated subsidiary more profitable, they probably
would do that. So it was prohibited.

Why is this important?

A reasonable expense for any ongoing business is research and development
(R&D), so AT&T
poured a lot of their gross revenues into R&D,
a far bigger percentage than most unregulated businesses would, which made
the American telephone system one of the best in the world. After deregulation,
that percentage went down to something closer to what most businesses do,
and the rest of the world (which still runs government monopolies with
no profit motive) caught up and passed the US in telecommunications quality.
But that's another story.

Anyway, Bell Labs became one of the foremost research groups in the
world. They invented the transistor. Their microwave research discovered
the background microwave radiation that supposedly proves the Big Bang
cosmology. And they did a lot with computers. The Unix operating system
came out of Bell Labs. Because of their vast funding, Unix quickly became
one of the best operating systems of its time. It still is one of the best
operating systems of its time (the 1970s). But I'm getting ahead of myself.

AT&T could not sell Unix to the public,
because that violated the restriction on commingling regulated with unregulated
businesses. But they could give it to non-profit organizations (read: universities),
who themselves were prohibited by the nature of being non-profit from selling
it to the public. The universities, using government grants, did research
on operating systems (read: Unix, which was the only system available to
them in source code), and they taught Unix to their computer science students.
So all the geeks coming out of the university CS departments knew Unix
only. That's still true for the same reason 30 years later, but the economics
are slightly different.

Out in the real world, businesses who had to compete for their markets
and profits could not invest that kind of R&D
into making better operating systems. Well, IBM could, they had runaway
profits from the first scientific mainframe, the 704. They anticipated
selling maybe ten of them worldwide, so they amortized their development
costs over the first ten systems. They wound up selling something closer
to 160 of them, so they had huge profits on the other 150. They gave them
to the universities at very low prices (still at a profit), and all the
students coming through the universities learned on IBM systems. They went
out into the world and when they had to buy a computer, guess which brand
they bought. But the software source code was proprietary, not available
for study, so the geeks still learned Unix. The geeks did not make business
decisions.

Then Intel invented the microprocessor, which made computers available
to hobbyists. Hobbyists are geeks. Many of them did not study computer
science in college, but some of them did. They only knew Unix, so the first
Disk Operating System(s) for microcomputers looked and ran more
or less like Unix. There are a lot of fascinating side stories about how
things played out, but today I want to focus on that operating system.
IBM was the de facto monopoly in mainframes, but microcomputers were so
cheap that hobbyists could do amazing things in their spare time.

One of those amazing things was the spreadsheet. Nobody had thought
of it before, because computers to run it were not available. But VisiCalc
ran on Apple computers, and it made business calculations so much easier,
business people bought Apple computers just to run VisiCalc. IBM execs
saw these things showing up on their customers' desks and realized where
business was going, so they commissioned a crash project to build an IBM
desktop computer. Their own operating system wouldn't run on such
a tiny computer, so they sent a team out to buy software: VisiCalc, Basic,
and the operating system to run it all on. But the story is that the guy
who had made the OS was not selling, and Bill Gates, who already had his
foot in the door with Basic, hunted around to buy up a clone, which he
offered to IBM as MS-DOS. As a result, Microsoft became
another monopoly company making huge profits, like AT&T
and IBM.

Bill Gates is a consummate businessman, and maybe he figured out that
he could get better R&D at lower cost by
buying up hobbyists who did their thing in a garage. He also paid his own
programmers very well (stock options), but rich people do not produce quality
products. So MS-DOS grew with the business and the
size of the computers, but did not get much better. It didn't need to,
because there was no competition.

Xerox was another of those monopoly companies with huge profits -- all
perfectly legal: they had a patent on xerography -- and they also had a
big R&D site in Palo Alto (Palo Alto Research
Center, or PARC), which drew on the academic community
at Stanford. Apple co-founder Steve Jobs and some of his team toured the
Xerox PARC and saw what they were doing with What
You See Is What You Get (WYSIWYG) graphical user interfaces.
They went back home and created the Macintosh, which did the same thing
only better. Bill Gates saw what a fantastic improvement that was over
the Unix user interface of MS-DOS, so he went back
home and created Windows, which did the same thing only not as well. But
Microsoft had more money than Apple, and Gates was a better businessman
than Jobs, so Windows eventually became more usable, and took over the
monopoly market they already had.

Somewhere around that time, the government decided it was in the public
interest to break up the world's best telephone company and make second-rate
citizens of the pieces. As a result, AT&T
stopped giving away Unix and tried to make a profitable product out of it.
AT&T had been so many years in the protective shell of regulation that they
couldn't make an honest profit, and there was a lot of reshuffling while
they jostled to get back into the monopoly telecommunications business
they know so well. The universities still had their Unix license, and they
could not give it away. So Linus Torvalds decided to make his own Unix
clone, which he named Linux after himself. He was just a hobbyist, so he
gave it away. All the geeks at the universities loved it, and it eventually
replaced the AT&T Unix in academia. It
was just as hard to use as Unix has always been, but the geeks tend to
like that. Eventually Linux grew big enough with all that free labor to
qualify for the Unix brand label.

Toward the end of the century, we had three operating systems in general
use. Mainframes were long gone. The minicomputers Unix was designed for
had been replaced by microprocessors -- now more powerful than the previous
mainframes had been. The tiny minority of geeks still favored Unix, now
Linux (which was the same thing). The small number of creative people,
who wanted to get their job done rather than fighting the system, used
Macintosh. Everybody else was already using Microsoft's product, or else
bought what their friends had or the local computer expert recommended.
The Microsoft operating system -- especially MS-DOS
-- was so hard to use that a typically American entrepreneurial business
sprang up around training users and fixing the many mistakes. It was these self-taught
"experts", not the college-trained geeks, and not the reclusive creative
people, who were the ones people asked what to buy, and of course they
recommended what they knew would continue to bring in revenue. They recommended
Windows.

So people bought Windows-based computers. The geeks continued to use
Unix, but all of them already had computers; they bought replacements when
computers got faster, but otherwise their numbers stayed the same -- and
their market share went down. Similarly, the creative people (Mac users)
already had all the computers they needed, so they did not buy any more.
Worse: the Mac worked so well, they never had any reason to upgrade. Upgrades
are the lifeblood of the computer industry. So the software vendors began
abandoning the Mac for the greener pastures of Windows -- and took some
of their customer base with them. Apple's market share plummeted.

Although not the business wizard that Bill Gates is, Steve Jobs is no
fool. When he got pushed out of Apple, he started his own "NeXT" computer
company. Operating systems were already too big to build from scratch,
and Unix by then was commercially available, so he built his system on
Unix. But he did not forget his Xerox PARC trip, so
he built a WYSIWYG front end on Unix. Even with a
WYSIWYG front end on it, Unix is far too hard for
ordinary people to use, so NeXT was never the commercial success Jobs was
hoping for. But the geeks liked it.

Meanwhile, the wizards who invented the Mac went off to do other things,
and Apple hired college fresh-outs to replace them. These geeks did not
understand the MacOS, they only understood Unix. The resulting blunders
were deemed irrecoverable, so Apple discarded their marvelous invention,
bought NeXT, and renamed it "Mac OS X". The Unix-trained
geeks continued to love it. Most of the creative people, when their computers
finally wore out, stayed with Apple and tried to grumble as little as possible.
OS X is still Unix and harder to use than Windows, but Apple's industrial design
people are pretty good at painting a pretty face on it.

A few years ago, back when there was still a MacOS for sale at your
local computer store, MacWeek ran an article in which they quoted some
expert saying "Everybody knows Unix software crashes all the time." That's
still true (see my "Three Systems"
blog post), so people constantly need to upgrade to newer versions that
crash in different ways. It's a marketer's dream come true, so the vendors
love it. It's also true to a lesser degree of Windows, which is only an
imitation of Unix, not the real thing, so it doesn't crash as often. But most people
already have their computer, and when it wears out, they buy another of
the same kind. The few left to buy a computer for the first time still
ask their friends or the local computer "expert" for a recommendation,
and the majority system (Windows) is what they recommend.

Bill Gates is no longer at the helm of Microsoft, and without his business
sense, the bean counters and the lawyers will not be long in taking it from
the top of the industry to mediocrity. Steve Jobs is also getting too old to
run Apple, with similar expected results. So what the dominant operating
system will be at the end of the decade is anybody's guess.

But it probably won't be anything like the Macintosh. I could wish for
it, but the software vendors don't want anything so easy to use that nobody
upgrades, and the geeks who write software and operating systems only understand
Unix. Whether a businessman, with enough cash to force his will on the
programmers and enough vision to know what to force on them, will come
along and do what is needed seems at this time unlikely. But you never
know.