Someone could have built a working computer long before 1935; Babbage
almost did. The components (relays, vacuum tubes, teletypes, etc.) had all
been
around for quite a while. But the burst of activity after 1935 shows that
only then
were social and economic conditions really favorable for the computer's
invention. The demands of business, government, and science had built up to a
point
where it is not surprising that the idea of an automatic computer occurred
simultaneously to a number of inventors at that time. The Second World War was
the
catalyst that brought those demands together, while providing the necessary
technical and (at least in America) financial support. There would have been
computers had the war never taken place, but those computers would not look
the
way they do today, nor would their impact likely be as revolutionary as it has
been. From the writings of the computer pioneers in
the
late 1930s one sees a modest vision: a few computers in the universities,
some in
government, modest-sized computers for industrial use, but not much more.
They hardly
foresaw the giant scale of the first computers, nor did they see the impact
of their creation
on everyday life. In America today, large and complex computer systems are
an important
part of industrial production, payroll and accounting, taxation, telephone
and television
service, transportation, and military defense.

Reckoners, Rev.?, page 0150

Yet the computer's impact, like that of any pervasive technology, cannot be
easily judged.
Certainly for the roomsful of clerks who labored over adding machines life
has changed, as it
has for the scientist who no longer has to struggle with doing matrix
calculations by hand.
That is really just a small segment of society that has felt the impact of
the computer. One
easily forgets that the pioneers in computing did not envision their
machines as doing much
more.

Nowhere is that better illustrated than by the frequent statements made in
the early days
that only four or five computers of the speed and power of the ENIAC would
be needed to
satisfy all the computing requirements of the United States. (It was felt
that three would
satisfy England's requirements, with perhaps one more for the Scots!) It is
hard to pin down
just who made those estimates, but such a vision of the future was common
in the early
days.1 Obviously they were wrong: they looked at the computer only in terms
of the human
beings it was replacing, as the automobile was first called the "horseless
carriage," or the
radio the "wireless." In terms of what the computer replaced they were
correct: five
computers like the ENIAC could do as much work as 400,000 human beings
equipped with
desk calculators, considering their respective multiplication speeds and
the fact that a
computer can work around the clock.
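That equivalence can be checked with a back-of-envelope sketch. The speeds assumed here (roughly 357 multiplications per second for the ENIAC, about one per minute for a clerk with a desk calculator, including time to record results) are illustrative assumptions, not figures given in the text:

```python
# Back-of-envelope check of the "five ENIACs vs. 400,000 clerks" claim.
# All numeric inputs are illustrative assumptions.

ENIAC_MULTS_PER_SEC = 357    # assumed machine speed (ten-digit multiplications)
HUMAN_SECS_PER_MULT = 60     # assumed human speed, incl. writing down results
MACHINE_HOURS_PER_DAY = 24   # a computer can work around the clock
HUMAN_HOURS_PER_DAY = 8      # a clerk works a normal shift

eniac_per_day = ENIAC_MULTS_PER_SEC * MACHINE_HOURS_PER_DAY * 3600
human_per_day = HUMAN_HOURS_PER_DAY * 3600 / HUMAN_SECS_PER_MULT

humans_per_eniac = eniac_per_day / human_per_day
print(round(humans_per_eniac))        # one ENIAC: tens of thousands of clerks
print(round(5 * humans_per_eniac))    # five ENIACs: hundreds of thousands
```

With these assumptions, five ENIACs come out in the same order of magnitude as the 400,000 figure quoted above; the exact ratio depends entirely on the speeds one assumes.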

What they did not foresee was that the computer would soon be doing work of
a very
different kind, work that no human beings could do. With the adoption of
the stored
program principle, it became possible to program computers to sort and
otherwise keep
track of vast quantities of information, a task that human
beings simply cannot do above a small threshold, and for which manual
aids like
file cards are likewise inadequate. The other unforeseen development
was the
adoption of high-level languages in lieu of arcane binary codes to get a
computer
to do its work. That innovation, also a consequence of the stored program
principle, meant that a trained mathematician does not have to accompany
every
computer to its every job. And finally there has been the revolution in
electronics
technology that reduced the size and electrical power requirements of
computers. Who could have foreseen that?

Some of the skewed perception of the computer's potential was due to the
military's
influence. The American armed forces supported computing projects because
they faced
problems so complex (for instance, firing tables) that existing means for
their solution were
felt to be totally inadequate. The immediate onset of the Cold War in 1945,
especially with
its focus on nuclear weapons, continued that atmosphere. As late as 1950
only one large-
scale computing machine (the IBM SSEC) was not under military control in
America.


Among other things that meant that computers would be geared toward number
processing, and they would tend to be as large and fast as technically
feasible. If anyone had
an idea of building a modest-sized computer for small businesses, for
libraries, or even for
game-playing, he or she would not have found much support. But today the
most exciting
part of the computer business is at the small end: in computers for small
businesses and the
home. (Konrad Zuse was alone among the computer pioneers in wanting to sell
small
computers at first, slowly graduating to bigger machines as one gained more
experience. It is
no coincidence that he was working in postwar Germany, where he did not
have the
financial support of the military that John von Neumann had in America.2)

But has there really been a computer "revolution" after all? Some have
argued that its
invention has not fundamentally altered social or economic patterns of
American life; if
anything, the computer has made it easier for the patterns existing before
1945 to persist.3
But "revolution" is an appropriate word even apart from the question of the
computer's impact on
society. Its invention has triggered a revolution in thinking, in our
understanding of our place
in the universe, much as Copernicus's On the Revolution of the Heavenly
Orbs did after
A.D. 1600.

For one thing, it changed the whole concept of a "machine." By the classical
definition, a machine is an assemblage of stiff elements, so arranged as to
perform one specific motion.
Any other motions are defined as "play," of which a good machine has
little.4 By that
definition a computer is hardly a machine, since it is certainly not
restricted to one specific
"motion." A computer is a universal machine; it can do "anything," that is,
whatever we can
whatever we can
program it to do. And we continue to be startled at the ingenuity of human
beings in
dreaming up new applications for the computer. Once programmed it begins to
look more
like a classical machine, doing one specific thing, but whatever that thing
is can neither be
specified nor foreseen by the computer manufacturer as it is sent out the
factory door. That
quality is a natural consequence of the fact that even when programmed, it
is not possible to
determine the actions of a computer in advance.

Computer pioneers used a biological analogy in sketching out their thoughts
on building
on building
their machinery; for example, they called the storage unit a "memory," the
programming code
programming code
a "language." Von Neumann even based his description of the EDVAC on a
neural notation,
referring to the computer's various "organs." Those terms have probably
caused more
confusion than anything else. But they have won out over more prosaic terms
like store or
programming plan.

But the analogy works the other way, too: the structure and function of the
computer provides human beings with a powerful model to aid our
understanding
of human thought. Mechanical analogies are certainly nothing new (Thomas
Hobbes's introduction to his Leviathan being a famous example), but the
analogy
analogy
of the computer to the brain is much more powerful precisely because it is
not a
simple machine in the classical sense. The rich and complex behavior of a
computer, set in motion by its executing an internally stored program of
modest
length, is an appropriate model for biological systems, in which an
organism's
characteristics represent an "expression" of information coded in its DNA.
Modern
computers contain read-only memories (ROMs) which, like an organism's
DNA,
cannot be altered. They also contain random-access memories (RAMS) whose
contents
can be changed at will; these information stores also have their biological
counterparts in
those parts of the brain that can remember experiences and alter their
future behavior
based on what they have learned.


All that reasoning by analogy has its limits, of course: the brain is not a
computer, nor
is human or animal behavior the expression of "programs." But the two have
enough in common in that both represent complex systems. Mathematical
theories of such
systems have
proven useful to our understanding of both.5

Some of those ideas were on the minds of the builders of the first
computing machines
as they struggled with their arrangements of levers, pins, relays, tubes,
wires, and paper
tapes. The gulf between "reckoning" and "thinking" could at times appear
narrow enough
to suggest that one day an arithmetic machine might become a thinking
machine. But on the
whole their story has been one of perseverance, attention to details, and
just plain hard
work. The pioneers had a vision of creating something new and perhaps
revolutionary.
Precisely because they were facing unknown areas of inquiry and experience,
what they
said and did in response can be useful to us today, as it was remarkably
free from a
narrowness of vision that today's "mature" science of computing suffers
from. Yes, we
should understand computers and how they affect our lives today. If for no
other reason, it
might be a good way to help us understand ourselves.

GLOSSARY A

Translations and Equivalents
of German Terms

The terminology used throughout computer science today comes from the
English
language. Those terms gradually assumed their meanings after 1950, when
commercial
installations of computers began. Inasmuch as Konrad Zuse was working
independently
and without much contact with Anglo-American computing activities before
and after the
Second World War, he developed an independent terminology, which is
nonetheless
equivalent to the more familiar words in use today. The following table
lists first Zuse's
term plus any related German terms, followed by a literal English
translation, and finally
the modern English term I feel comes closest to describing what Zuse
intended. I caution
the reader, however, that the modern terms are not exact equivalents, but
are intended only
to suggest a similar concept.