Integration's what you need

Here's another paradox: every year, the computer industry offers more and more power, and more and more capabilities, but does it with fewer and fewer parts. The ultimate goal could be for a whole PC to reside on a single chip - and for everything to be done by a single program.

The magic that makes this happen is integration, and it is the real driver in the IT industry. Integration lets you take the computer power that used to fill a large hall, when it was implemented in vacuum tubes (valves) or transistors, and reduce it to a credit card-sized pocket calculator, so that yesterday's 30-tonne monster is today's single chip.

This process is fundamental to the whole IT industry, from the smallest chip to the most complex piece of software. Indeed, the microchip is the prime example. It's called an "integrated circuit" precisely because it integrates a huge raft of separate components - transistors, capacitors, resistors, wiring etc - onto one sliver of silicon. Silicon is actually a poor material for making, say, capacitors, but that does not matter. The benefits of the integrated circuit are more important than any such drawbacks. The whole is more than the sum of its parts.

If you had to hand-wire the tens of millions of tiny components in a Pentium 4, it would be impossible to make one that worked. You certainly could not knock out millions of them at a few bucks apiece.

Integration happens at the chip level, too. When Intel launched the 80386, its first 32-bit PC processor, it wanted PC makers to use its 82385 cache controller and 80387 maths co-processor chips as well. But as the manufacturing process improved, all three were integrated onto one chip.

This was bad news for Weitek and other companies that were competing to make maths co-processors. It was great news for users, who got processors that did a lot more and cost less.

Almost everything in today's computer business used to be available separately. At one point, for example, if you wanted to assemble a mainframe disk farm, you could buy frames, platters, spindles and controllers separately. (The spindle included the motor, drive shaft, and read/write heads.) Today, you buy an IDE hard drive - IDE stands for Integrated Drive Electronics. The whole thing comes in one small, sealed package, and provides vastly more storage for an unbelievably low price.

The same process of integration has driven the development of software. Operating systems used to be primitive, if they existed at all. Over the years, they have integrated more and more features to support all the different things users want to do.

The first general-purpose electronic computer, ENIAC (Electronic Numerical Integrator and Computer), didn't have any software: it was programmed by plugging in cables, like an old telephone switchboard. And it only calculated numbers: specifically, it calculated shell trajectory tables for the US Army.

Over the next 50 years, computers learned to handle more sophisticated types of information, such as the letters of the alphabet, rows of pixels (graphics), sounds, photographic images, and moving video. All became integrated in software. Typefaces, for example, went from being non-existent to being an optional extra to being built in as standard. The same thing happened to networking, graphical user interfaces, email, web browsing, and so on.

The Apple Macintosh is a case in point. When it was launched, it barely had an operating system at all. But Apple added more and more functions over the years, including the ability to handle Ethernet networking, CD-ROM drives, and movies (with QuickTime). More recently, it has added extra software to handle photographs and music, to edit movies, to burn CDs and DVDs, and to do many other things.

Some people think this is unfair: it is hard for third-party software houses to compete with the utilities that a manufacturer bundles for free.

But it isn't fundamentally different from the problem that faced the manufacturers of vacuum tubes and capacitors, or drive platters and spindles, or maths co-processors, or typefaces.

Every mainstream operating system has grown in much the same way as Mac OS, from the cheapest GNU/Linux bundle to the most expensive IBM mainframe OS. The only real difference is the ignorance of the public.

People are more aware of software integration because it happens in front of their eyes, while they remain blissfully ignorant of the continuous integration of chips, hard drives and other components.

You could turn the clock back 20 years and ship personal computers with just enough software to run a disk drive and put a dot on a screen. Users could spend weeks finding a couple of dozen drivers, selecting their favourite graphical user interface, installing utilities and picking their favourite 57 typefaces from the thousands on offer. It would be logical.

It would superficially increase "competition". It would also be insane. In fact, attempts to stop the natural process of integration are not just hugely expensive and harmful to consumers, they reduce the potential for useful innovation.

Let's take the browser issue, since it has been widely discussed. At one time, I think I had five browsers on my PC. I had the latest Netscape Navigator and Internet Explorer, because I downloaded them. I also had online access software for CompuServe and AOL, both of which sported their own browser. Finally, I had a separate version of Netscape, which came as part of the Encyclopaedia Britannica on CD-ROM.

The problem was that software houses and online services could not assume I had a browser installed, so they supplied one. This obliged them to go to the expense of writing or buying a browser, and making it work with their application. It put me to the trouble of installing extra browsers I neither wanted nor needed. The extra cost and development time increased prices and also discouraged software and service providers from adding online elements to their applications. This hampered innovation.

Bundling a browser with the operating system - as Apple, Microsoft, BeOS, IBM and others did - meant application providers and online services could take a browser for granted, and therefore they could exploit it.

Even if they didn't use the browser, they could use its HTML "engine" in other applications: for example, to display web pages in Microsoft Word.

Again, there's nothing unusual about this process of integration. At one time, for example, many MS-DOS applications came with dozens of typefaces and dozens of printer drivers because no software house could assume you had either. Every application programmer had to come up with his or her own user interface (and yes, they were all different). It cost a fortune. Once these facilities were incorporated into the operating system, ready on call, programmers could devote their efforts to creating better applications instead.

In computing, what's "given" - what you can assume almost all users have - is called "the stack". That's the foundation on which everyone can build. It is a lot easier and more productive to innovate on top of the stack than to start with bare circuit boards, or bare wires, and build everything from scratch - though you can do that, if you want. But contributing to the stack is how you reach the biggest possible audience at the lowest possible cost.

"The stack" has some interesting but overlooked effects. One is that computing isn't really driven by invention; it is driven by consolidation.

There's no shortage of good ideas: they're 10 a penny. Progress is driven by integrating those ideas - and the hardware and software that embody them - into the stack and thus delivering them at an affordable price to millions of people. This is what Apple, Intel, Microsoft and others do. It's what GNU/Linux programmers are doing, too.

It's a pretty egalitarian form of progress. Personal computers are not like cars or houses or hot dinners, where there are enormous differences between rich and poor: the rich get palaces and the poor get hovels.

But in the PC market, there is no significant difference between the PCs used by burger flippers and pump jockeys and the PCs used by George Bush, Britney Spears, and Bill Gates. In fact, the burger flipper with a cheap PC hot-rodded for playing Quake III Arena probably has a better system than most of the world's billionaires. And the pump jockey who really knows how to search Google could well be better informed.