Babbage to Bill and way beyond

Antiquity in electronic computing is relatively recent. The earliest electronic computing machines of the 1940s and early 1950s are still within living memory. Serious historians have been properly cautious in avoiding premature commitment as we struggle to absorb the explosive phenomenon of the computer into our assumptive worlds. General historiography of modern computing has been commandeered by the oracles of the immediate - journalists - who regale us with tales of Californian misfits who catapulted themselves from garages to multinationals in less than a decade. The sensational rise of Apple and Microsoft, of Wozniak, Jobs and Gates has become a parable of modern entrepreneurial success.

The prehistory of modern computing highlights visionary individuals: Charles Babbage and Herman Hollerith in the 19th century; Alan Turing, Presper Eckert, John Mauchly, John von Neumann and many others in this century. The genre is that of heroic narrative, with a focus that borders on fixation with invention and hardware. Memoirs of pioneers and "gee whizz" accounts of the computer's "remarkable story" have been the non-specialist staple.

Computer marks an important transition from this frenzied scramble to a more reflective history. A clue to the treatment is in the subtitle. The machine is not treated as a jumped-up number-cruncher capable of phenomenal feats of arithmetical wizardry in a stunningly short time. Nor is it treated as an "electronic brain" vying for intellectual supremacy with its human makers. In fact, artificial intelligence does not feature at all. We have instead a history of the computer as information machine. What distinguishes Computer is its shift from heroes and wondrous machines to the cultural and economic context that predisposed one or another venture to succeed. We are not asked to marvel but to reflect.

Computer is written by two leading historians of computing, one English, one American. Both belong to the transitional generation. Neither is a first-generation computer pioneer. Nor are they kids for whom general-purpose computers with sophisticated graphical interfaces are commonplace objects rather than mystifying devices that threaten or perplex. Without wishing to do violence to notable exceptions, American historians of technology traditionally favour contextual history, the Brits a more internalist cut that tends to focus on the workings and development of devices and systems. Campbell-Kelly, reader in computer science at Warwick University, is himself an exception to this generalisation, having straddled the two traditions with his ICL: A Business and Technical History, published in 1989.

Despite the transatlantic balance in the nationality of the authors, the book is weighted in favour of the United States. The first automatic electronic calculator, the ENIAC, developed at the Moore School, Pennsylvania, for military use during the Second World War, Project Whirlwind, the Sage air defence project, the Sabre airline reservation system and others feature prominently, as do the origins and rise of IBM and other US corporations involved in the scramble for computer markets. The British projects of local legend, the Colossus at Bletchley Park, used for wartime code-breaking, and the Manchester team, which ran the first stored-program computer in June 1948, receive short shrift. Even Britain's Lyons Electronic Office (LEO), developed specifically for commercial use, gains nary a mention. This may be a conscious tailoring of content for the US market (all prices are in US dollars). Alternatively, it may be that in the "new history", which seeks to account for the dominance by the US of technology and markets, Britain, despite its early inventive lead, became an economic bit-player in the world picture.

The first half of the book recounts the history and prehistory of computing and office data processing, starting in the 19th century, when the word "computer" denoted a person, and ending with the rise of IBM and its fall from grace in the mid-1970s. The well-rehearsed chestnuts of the field, the disputes over US versus British technological precedence, the personal vendettas, the undignified scramble for historical credit among some of the pioneers of the ENIAC, and the merits or otherwise of the patent lawsuit that ensued, are recounted with wholesome neutrality and the welcome voice of historical distance.

The second half of the book is groundbreaking. It attempts to frame the developments of the past two decades including the personal computer "revolution" and the advent of the Internet and the World Wide Web. More significantly, it attempts, uniquely in a modern non-specialist account, a history of software and the role and development of programming languages. The watchword throughout is not sensationalism but considered sobriety. This is an historically brave and highly accomplished work which achieves what every book proposal routinely promises: authority and accessibility.