History of computing hardware

The history of computing hardware is the record of the constant
drive to make computer hardware faster, cheaper, and capable of
storing more data.

Before the development of the general-purpose computer, most
calculations were done by humans. Tools to help humans calculate
are generally called calculators.
Calculators continue to develop, but computers add the critical
element of conditional response, allowing automation of both
numerical calculation and, more generally, of many
symbol-manipulation tasks. Computer technology has undergone
profound changes every decade since the 1940s.

Computing hardware has become a platform for uses other than
computation, such as automation, communication, control,
entertainment, and education. Each field in turn has imposed its
own requirements on the hardware, which has evolved in response to
those requirements.

Aside from written numerals, the first aids to computation were
purely mechanical devices that required the operator to set up the
initial values of an elementary arithmetic operation, then propel
the device through manual manipulations to obtain the result. An
example would be a slide rule where
numbers are represented by points on a logarithmic scale and
computation is performed by setting a cursor and aligning sliding
scales. Numbers could be represented in a continuous "analog" form,
where a length or other physical property was proportional to the
number. Or, numbers could be represented in the form of digits,
automatically manipulated by a mechanism. Although this approach
required more complex mechanisms, it made for greater precision of
results.

Both analog and digital mechanical techniques continued to be
developed, producing many practical computing machines. Electrical
methods rapidly improved the speed and precision of calculating
machines, at first by providing motive power for mechanical
calculating devices, and later directly as the medium for
representation of numbers. Numbers could be represented by voltages
or currents and manipulated by linear electronic amplifiers. Or,
numbers could be represented as discrete binary or decimal digits,
and electrically-controlled switches and combinatorial circuits
could perform mathematical operations.

The invention of electronic amplifiers made calculating machines
much faster than mechanical or electromechanical predecessors.
Vacuum tube amplifiers gave way to
discrete transistors, and then rapidly to
monolithic integrated circuits.
By overcoming the "tyranny of numbers", integrated circuits made
high-speed, low-cost digital computers a widespread commodity.

This article covers major developments in the history of computing
hardware, and attempts to put them in context. For a detailed
timeline of events, see the computing timeline article. The
history of computing article
treats methods intended for pen and paper, with or without the aid
of tables. Since all computers rely on digital storage, and tend to
be limited by the size and speed of memory, the history of computer data storage is tied to the
development of computers.

Before computer hardware

The first use of the word "computer" was recorded in 1613,
referring to a person who carried out calculations, or
computations, and the word continued to be used in that sense until
the middle of the 20th century. From the end of the 19th century
onwards though, the word began to take on its more familiar
meaning, describing a machine that carries out computations.

Earliest hardware

Devices have been used to aid computation for thousands of years,
using one-to-one
correspondence with our fingers.
The earliest counting device was probably a form of tally stick. Later record keeping aids
throughout the Fertile Crescent
included calculi (clay spheres, cones, etc.) which represented
counts of items, probably livestock or grains, sealed in
containers. Counting rods are one
example.

The abacus was used for arithmetic tasks. The
Roman abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many
other forms of reckoning boards or tables have been invented. In a
medieval counting house, a checkered
cloth would be placed on a table, and markers moved around on it
according to certain rules, as an aid to calculating sums of
money.

Scottish mathematician and physicist John
Napier noted multiplication and division of numbers could be
performed by addition and subtraction, respectively, of logarithms
of those numbers. While producing the first logarithmic tables,
Napier needed to perform many multiplications, and it was at this
point that he designed Napier's
bones, an abacus-like device used for multiplication and
division. Since real numbers can be
represented as distances or intervals on a line, the slide rule was invented in the 1620s to allow
multiplication and division operations to be carried out
significantly faster than was previously possible. Slide rules were
used by generations of engineers and other mathematically inclined
professional workers, until the invention of the pocket calculator.
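
The identity behind Napier's bones and the slide rule is that logarithms turn multiplication into addition: log(ab) = log a + log b. A minimal sketch in Python (illustrative only; the numbers are arbitrary):

    import math

    # Napier's insight: to multiply, add the logarithms, then take the antilog.
    a, b = 37.0, 52.0
    product_via_logs = math.exp(math.log(a) + math.log(b))

    print(product_via_logs)  # ~1924.0, within floating-point rounding
    print(a * b)             # 1924.0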

German polymath Wilhelm Schickard
built the first digital mechanical calculator in 1623, and thus
became the father of the computing era. Since his calculator used
techniques such as cogs and gears first developed for clocks, it
was also called a 'calculating clock'. It was put to practical use
by his friend Johannes Kepler, who
revolutionized astronomy when he condensed decades of astronomical
observations into algebraic expressions. An original calculator
by Blaise Pascal (1640) is preserved
in the Zwinger
Museum. Machines by Pascal (the Pascaline, 1642) and Gottfried Wilhelm von Leibniz (the
Stepped Reckoner, c. 1672)
followed. Leibniz once said "It is unworthy of excellent men to
lose hours like slaves in the labour of calculation which could
safely be relegated to anyone else if machines were used."

Around 1820, Charles Xavier
Thomas created the first successful, mass-produced mechanical
calculator, the Thomas Arithmometer,
that could add, subtract, multiply, and divide. It was mainly based
on Leibniz' work. Mechanical calculators, like the base-ten
Addiator, the Comptometer, the Monroe, the Curta, and
the Addo-X remained in use until the 1970s.

Leibniz also described the binary numeral system, a
central ingredient of all modern computers. However, up to the
1940s, many subsequent designs (including Charles Babbage's machines of the 1800s and
even ENIAC of 1945)
were based on the decimal system; ENIAC's ring counters emulated
the operation of the digit wheels of a mechanical adding
machine.

In Japan, Ryoichi Yazu patented a
mechanical calculator called the Yazu Arithmometer in 1903. It
consisted of a single cylinder and 22 gears, and employed the mixed
base-2 and base-5 number system familiar to users of the soroban (Japanese abacus). Carry and end of
calculation were determined automatically. More than 200 units were
sold, mainly to government agencies such as the Ministry of War and
agricultural experiment stations.
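
For illustration, the soroban's mixed-radix system splits each decimal digit into a "five" component and a "unit" component. A small Python sketch of this bi-quinary coding (an illustration of the number system itself, not of the Yazu machine's actual mechanism):

    # Each decimal digit d (0-9) is represented as 5*fives + units,
    # mirroring the soroban's one heaven bead and four earth beads per rod.
    def to_biquinary(digit):
        assert 0 <= digit <= 9
        return divmod(digit, 5)  # (heaven beads used, earth beads used)

    for d in range(10):
        fives, units = to_biquinary(d)
        print(f"{d} = 5*{fives} + {units}")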

In 1833, Charles Babbage moved on
from developing his difference
engine to developing a more complete design, the analytical
engine, which would draw directly on Jacquard's punched cards for
its programming. In 1835, Babbage described his analytical engine. It was the plan of a
general-purpose programmable computer, employing punch cards for
input and a steam engine for power, using the positions of gears
and shafts to represent numbers. His initial idea was to use
punch-cards to control a machine that could calculate and print
logarithmic tables with huge precision (a specific purpose
machine). Babbage's idea soon developed into a general-purpose
programmable computer, his analytical engine. While his design was
sound and the plans were probably correct, or at least debuggable, the project was slowed by various
problems. Babbage was a difficult man to work with and argued with
anyone who didn't respect his ideas. All the parts for his machine
had to be made by hand. Small errors in each part could sum to
large discrepancies in a machine with thousands of parts, so the
parts had to be made to far tighter tolerances than were usual at
the time. The project dissolved in disputes with the artisan who
built the parts and ended with the depletion of government
funding. Ada Lovelace,
Lord Byron's
daughter, translated and added notes to
the "Sketch of the Analytical Engine" by Federico Luigi, Conte
Menabrea.

A
reconstruction of the Difference
Engine II, an earlier, more limited design, has been
operational since 1991 at the London Science Museum. With a few trivial changes, it works as
Babbage designed it and shows that Babbage was right in theory. The
museum used computer-operated machine tools to construct the
necessary parts, following tolerances which a machinist of the
period would have been able to achieve. The failure of Babbage to
complete the engine can be chiefly attributed to difficulties not
only related to politics and financing, but also to his desire to
develop an increasingly sophisticated computer.

Following in the footsteps of Babbage, although unaware of his
earlier work, was Percy Ludgate, an
accountant from Dublin, Ireland. He independently designed a
programmable mechanical computer, which he described in a work that
was published in 1909.

In the late 1880s, the American Herman
Hollerith invented the recording of data on a medium that could
then be read by a machine. Prior uses of machine-readable media had
been for control (automatons such as
piano rolls or looms), not data. "After some initial trials
with paper tape, he settled on punched
cards…" Hollerith came to use punched cards after observing how
railroad conductors encoded
personal characteristics of each passenger with punches on their
tickets. To process these punched cards he invented the tabulator and the keypunch machine. These three inventions were
the foundation of the modern information processing industry. His
machines used mechanical relays (and solenoids) to increment mechanical counters. Hollerith's method
was used in the 1890 United
States Census and the completed results were "... finished
months ahead of schedule and far under budget". Hollerith's company
eventually became the core of IBM. IBM developed punch
card technology into a powerful tool for business data-processing
and produced an extensive line of unit record equipment. By 1950, the
IBM card had become ubiquitous in industry and government. The
warning printed on most cards intended for circulation as documents
(checks, for example), "Do not fold, spindle or mutilate," became a motto
for the post-World War II era.

Computer
programming in the punch card era revolved around the computer
center. The computer users, for example, science and engineering
students at universities, would submit their programming
assignments to their local computer center in the form of a stack
of cards, one card per program line. They then had to wait for the
program to be queued for processing, compiled, and executed. In due
course a printout of any results, marked with the submitter's
identification, would be placed in an output tray outside the
computer center. In many cases these results would comprise solely
a printout of error messages, necessitating another edit-compile-run cycle. Punched cards are still
used and manufactured to this day, and their distinctive dimensions
(and 80-column capacity) can still be recognized in forms, records,
and programs around the world.

Desktop calculators

By the 1900s, earlier mechanical calculators, cash registers,
accounting machines, and so on were redesigned to use electric
motors, with gear position as the representation for the state of a
variable. The word "computer" was a job title assigned to people
who used these calculators to perform mathematical calculations. By
the 1920s Lewis Fry
Richardson's interest in weather prediction led him to propose
human computers and numerical analysis to model the weather;
to this day, the most powerful computers on Earth are needed to adequately model its weather using
the Navier-Stokes
equations.

In 1948, the Curta was introduced.
This was a small, portable, mechanical calculator that was about
the size of a pepper grinder. Over
time, during the 1950s and 1960s a variety of different brands of
mechanical calculators appeared on the market. The first
all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube display and 177 subminiature thyratron tubes. In June 1963, Friden introduced
the four-function EC-130. It had an all-transistor design, 13-digit
capacity on a CRT, and introduced
Reverse Polish notation
(RPN) to the calculator market at a price of $2200. The EC-132
model added square root and reciprocal functions. In 1965, Wang Laboratories produced the LOCI-2, a
10-digit transistorized desktop calculator that used a Nixie tube
display and could compute logarithms.
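
Reverse Polish notation dispenses with parentheses by keeping operands on a stack. A minimal Python sketch of that evaluation discipline (a toy illustration, not the Friden EC-130's actual logic):

    # A minimal Reverse Polish notation evaluator.
    def eval_rpn(tokens):
        ops = {
            "+": lambda x, y: x + y,
            "-": lambda x, y: x - y,
            "*": lambda x, y: x * y,
            "/": lambda x, y: x / y,
        }
        stack = []
        for tok in tokens:
            if tok in ops:
                y = stack.pop()  # operands come off the stack in reverse
                x = stack.pop()
                stack.append(ops[tok](x, y))
            else:
                stack.append(float(tok))
        return stack.pop()

    # (3 + 4) * 2 is entered as "3 4 + 2 *": no parentheses needed.
    print(eval_rpn("3 4 + 2 *".split()))  # 14.0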

Advanced analog computers

Cambridge differential analyzer,
1938

Before World War II, mechanical and
electrical analog computers were
considered the "state of the art", and many thought they were the
future of computing. Analog computers take advantage of the strong
similarities between the mathematics of small-scale properties—the
position and motion of wheels or the voltage and current of
electronic components—and the mathematics of other physical
phenomena, for example, ballistic trajectories, inertia, resonance,
energy transfer, momentum, and so forth. They model physical
phenomena with electrical voltages and
currents as the analog
quantities.

Centrally, these analog systems work by creating electrical
analogs of other systems, allowing
users to predict behavior of the systems of interest by observing
the electrical analogs. The most useful of the analogies was the
way the small-scale behavior could be represented with integral and
differential equations, and could thus be used to solve those
equations. An ingenious example of such a machine, using water as the analog quantity, was the water integrator built in 1928; an
electrical example is the Mallock
machine built in 1941. A planimeter
is a device which does integrals, using distance as the analog quantity. Unlike modern
digital computers, analog computers are not very flexible, and need
to be rewired manually to switch them from working on one problem
to another. Analog computers had an advantage over early digital
computers in that they could be used to solve complex problems
using behavioral analogues while the earliest attempts at digital
computers were quite limited.
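
The integrator analogy can be illustrated digitally. The Python sketch below chains two integration steps to solve the resonance equation x'' = -x, much as an analog machine would wire one integrator's output into the next; the step size is an arbitrary assumption:

    import math

    # Two chained "integrators" solve x'' = -x (simple harmonic motion).
    dt = 0.001
    x, v = 1.0, 0.0          # initial position and velocity
    for step in range(int(2 * math.pi / dt)):  # integrate over one period
        a = -x               # "feedback" wiring: acceleration = -position
        v += a * dt          # first integrator: velocity
        x += v * dt          # second integrator: position

    print(round(x, 2))       # ~1.0: back near the start after one period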

The art of analog computing reached its zenith with the differential analyzer, invented in
1876 by James Thomson and
built by H. W. Nieman and Vannevar
Bush at MIT starting in 1927. Fewer than a dozen of
these devices were ever built; the most powerful was constructed at
the University of
Pennsylvania's Moore School of
Electrical Engineering, where the ENIAC was
built. Digital electronic computers like the ENIAC spelled
the end for most analog computing machines, but hybrid analog
computers, controlled by digital electronics, remained in
substantial use into the 1950s and 1960s, and later in some
specialized applications. But like all digital devices, a digital
computer is limited by its decimal precision, whereas an analog
device is limited by its accuracy. As electronics progressed during
the twentieth century, the problems of operating at low voltages
while maintaining high signal-to-noise
ratios were steadily addressed; a digital circuit is, after all, a
specialized form of analog circuit, intended to operate at
standardized settings (in the same vein, logic gates can be
realized as forms of digital circuits). As digital computers became
faster and gained larger memories (for example, RAM or internal
storage), they almost entirely displaced analog computers. Computer
programming, or coding, arose as another human profession.

Digital computation

The era of modern computing began with a flurry of development
before and during World War II, as
electronic circuit elements
replaced mechanical equivalents, and digital calculations replaced
analog calculations. Machines such as the Z3, the Atanasoff–Berry Computer, the Colossus
computers, and the ENIAC were built by hand using circuits
containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main
(non-volatile) storage medium. Defining a single point in the
series as the "first computer" misses many subtleties (see the
table "Defining characteristics of some early digital computers of
the 1940s" below).

Alan Turing's 1936 paper proved
enormously influential in computing and computer science in two ways. Its main
purpose was to prove that there were problems (namely the halting problem) that could not be solved by
any sequential process. In doing so, Turing provided a definition
of a universal computer which executes a program stored on tape.
This construct came to be called a Turing
machine. Except for the limitations imposed by their finite
memory stores, modern computers are said to be Turing-complete, which is to say, they have
algorithm execution capability equivalent
to a universal Turing machine.
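
A Turing machine is essentially a state table driving a read/write head over a tape. The toy Python interpreter below sketches the construct (a simplified illustration, not Turing's 1936 formalism verbatim): this particular machine flips 0s and 1s until it reaches a blank, then halts.

    # A toy Turing machine interpreter.
    def run(tape, rules, state="start", pos=0):
        while state != "halt":
            symbol = tape[pos]
            write, move, state = rules[(state, symbol)]
            tape[pos] = write
            pos += move
            if pos == len(tape):   # extend the (conceptually infinite) tape
                tape.append("0")
        return tape

    rules = {  # (state, read) -> (write, head move, next state)
        ("start", "0"): ("1", +1, "start"),
        ("start", "1"): ("0", +1, "start"),
        ("start", "_"): ("_", 0, "halt"),
    }
    print(run(list("0110_"), rules))  # ['1', '0', '0', '1', '_']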

For a computing machine to be a practical general-purpose computer,
there must be some convenient read-write mechanism, punched tape,
for example. With a knowledge of Alan Turing's theoretical
'universal computing machine' John von
Neumann defined an architecture which uses the same memory both to store programs and data:
virtually all contemporary computers use this architecture (or some
variant). While it is theoretically possible to implement a full
computer entirely mechanically (as Babbage's design showed),
electronics made possible the speed and later the miniaturization
that characterize modern computers.

There were three parallel streams of computer development in the
World War II era; the first stream was largely ignored, and the
second was deliberately kept secret. The first was the German work
of Konrad Zuse. The second was the secret development of the
Colossus computers in the UK. Neither of these had much influence on the
various computing projects in the United States. The third stream of computer development,
Eckert and Mauchly's ENIAC and EDVAC, was widely publicized.

George Stibitz is internationally
recognized as one of the fathers of the modern digital computer.
While working at Bell Labs in November 1937, Stibitz invented and
built a relay-based calculator that he dubbed the "Model K" (for
"kitchen table", on which he had assembled it), which was the first
to calculate using binary form.

Zuse

A reproduction of Zuse's Z1
computer

Working in
isolation in Germany, Konrad Zuse started construction in 1936 of his
first Z-series calculators featuring memory and (initially limited)
programmability. Zuse's purely mechanical, but already
binary Z1, finished in 1938, never
worked reliably due to problems with the precision of parts.

Zuse's later machine, the Z3, was
finished in 1941. It was based on telephone relays and did work
satisfactorily. The Z3 thus became the first functional
program-controlled, all-purpose, digital computer. In many ways it
was quite similar to modern machines, pioneering numerous advances,
such as floating-point numbers.
Replacement of the hard-to-implement decimal system (used in
Charles Babbage's earlier design) by
the simpler binary system
meant that Zuse's machines were easier to build and potentially
more reliable, given the technologies available at that time.

Programs were fed into Z3 on punched
films. Conditional jumps were missing, but it was proved in 1998
that the Z3 was still, in theory, a universal computer (ignoring its physical
storage size limitations). In two 1936 patent
applications, Konrad Zuse also
anticipated that machine instructions could be stored in the same
storage used for data—the key insight of what became known as the
von Neumann architecture,
first implemented in the British SSEM of 1948.
Zuse also
claimed to have designed the first high-level programming language, Plankalkül, in 1945 (published in 1948),
although it was implemented for the first time in 2000 by a team
around Raúl Rojas at the Free
University of Berlin—five years after Zuse died.

Zuse suffered setbacks during World War
II when some of his machines were destroyed in the course of
Allied bombing campaigns.
Apparently his work remained largely unknown to engineers in the UK
and US until much later, although at least IBM was aware of it as
it financed his post-war startup company in 1946 in return for an
option on Zuse's patents.

Colossus

During
World War II, the British at Bletchley
Park (40 miles north of London) achieved a number of
successes at breaking encrypted German military
communications. The German encryption machine, Enigma, was attacked with the help of
electro-mechanical machines called bombes. The bombe, designed by Alan Turing and Gordon Welchman, after the Polish
cryptographic bomba by
Marian Rejewski (1938), came into
use in 1941. They ruled out possible Enigma settings by performing
chains of logical deductions implemented electrically. Most
possibilities led to a contradiction, and the few remaining could
be tested by hand.

The Germans also developed a series of teleprinter encryption
systems, quite different from Enigma. The Lorenz SZ 40/42 machine was used for
high-level Army communications, termed "Tunny"
by the British. The first intercepts of Lorenz messages began in
1941. As part of an attack on Tunny, Professor Max Newman and his colleagues helped specify the
Colossus. The Mk I Colossus was
built between March and December 1943 by Tommy Flowers and his colleagues at the
Post Office
Research Station at Dollis
Hill in London and then shipped to Bletchley
Park in January 1944.

Colossus was the first totally
electronic computing device. The Colossus used a large
number of valves (vacuum tubes). It had paper-tape input and was
capable of being configured to perform a variety of boolean logical operations on its data, but it
was not Turing-complete. Nine Mk II
Colossi were built (the Mk I was converted to a Mk II, making ten
machines in total). Details of their existence, design, and use
were kept secret well into the 1970s. Winston Churchill personally issued an
order for their destruction into pieces no larger than a man's
hand. Due to this secrecy the Colossi were not included in many
histories of computing. A reconstructed copy of one of the Colossus
machines is now on display at Bletchley Park.

In 1939, John Vincent Atanasoff and Clifford E. Berry of Iowa State
University developed the Atanasoff–Berry Computer
(ABC), the world's first
electronic digital computer. The design used over 300 vacuum tubes
and employed capacitors fixed in a mechanically rotating drum for
memory. Though the ABC machine was not programmable, it was the
first to use electronic tubes in an adder. ENIAC co-inventor John
Mauchly examined the ABC in June 1941, and its influence on the
design of the later ENIAC machine is a matter of contention among
computer historians. The ABC was largely forgotten until it became
the focus of the lawsuit Honeywell v. Sperry Rand, the ruling of
which invalidated the ENIAC patent (and several others) as, among
many reasons, having been anticipated by Atanasoff's work.

In 1939, development began at IBM's Endicott laboratories on the
Harvard Mark I. Known officially as
the Automatic Sequence Controlled Calculator, the Mark I was a
general purpose electro-mechanical computer built with IBM
financing and with assistance from IBM personnel, under the
direction of Harvard mathematician Howard Aiken. Its design was
influenced by Babbage's Analytical Engine, using decimal arithmetic
and storage wheels and rotary switches in addition to
electromagnetic relays. It was programmable via punched paper tape,
and contained several calculation units working in parallel. Later
versions contained several paper tape readers and the machine could
switch between readers based on a condition. Nevertheless, the
machine was not quite Turing-complete. The Mark I was moved
to Harvard
University and began operation in May 1944.

ENIAC

The US-built ENIAC (Electronic Numerical Integrator and Computer)
was the first electronic general-purpose computer. It combined, for
the first time, the high speed of electronics with the ability to
be programmed for many complex problems. It could add or subtract
5000 times a second, a thousand times faster than any other
machine. (Colossus couldn't add). It also had modules to multiply,
divide, and square root. High speed memory was limited to 20 words
(about 80 bytes). Built under the direction of John Mauchly and J. Presper
Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from
1943 to full operation at the end of 1945. The machine was
huge, weighing 30 tons, and contained over 18,000 valves. One of
the major engineering feats was to minimize valve burnout, which
was a common problem at that time. The machine was in almost
constant use for the next ten years.

ENIAC was unambiguously a Turing-complete device. It could compute
any problem (that would fit in memory). A "program" on the ENIAC,
however, was defined by the states of its patch cables and
switches, a far cry from the stored
program electronic machines that evolved from it. Once a
program was written, it had to be mechanically set into the
machine. Six women did most of the
programming of ENIAC. (Improvements completed in 1948 made it
possible to execute stored programs set in function table memory,
which made programming less a "one-off" effort, and more
systematic).

First-generation machines

Even before the ENIAC was finished, Eckert and Mauchly recognized
its limitations and started the design of a stored-program
computer, EDVAC. John von Neumann
was credited with a widely circulated
report describing the EDVAC design in
which both the programs and working data were stored in a single,
unified store. This basic design, denoted the von Neumann architecture, would
serve as the foundation for the worldwide development of ENIAC's
successors.

In this generation of equipment, temporary or working
storage was provided by acoustic
delay lines, which used the propagation time of sound through a
medium such as liquid mercury (or
through a wire) to briefly store data. A series of acoustic pulses is sent along a tube;
as each pulse reaches the end of the tube, the circuitry
detects whether it represents a 1 or a 0 and causes the
oscillator to re-send it. Others used Williams tubes, which use the ability of a
television picture tube to store and retrieve data. By 1954,
magnetic core memory was
rapidly displacing most other forms of temporary storage, and
dominated the field through the mid-1970s.
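
Functionally, a delay line is a circulating shift register: bits travel down the line and are re-inserted at the input unless new data is written over them. A Python sketch (the 8-bit line length is an arbitrary assumption):

    from collections import deque

    line = deque([0] * 8)        # pulses in transit through the medium

    def tick(write_bit=None):
        # One pulse period: the bit leaving the far end is re-sent at the
        # input, unless the controller overwrites it with new data.
        out = line.popleft()     # pulse arrives at the detector
        line.append(out if write_bit is None else write_bit)
        return out

    for bit in (1, 0, 1, 1):     # store the pattern 1011 ...
        tick(write_bit=bit)
    for _ in range(4):           # pad the rest of the loop with 0s
        tick(write_bit=0)
    print([tick() for _ in range(8)])  # ... and it circulates back: [1, 0, 1, 1, 0, 0, 0, 0]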

EDVAC was the first stored-program computer designed; however, it
was not the first to run. Eckert and Mauchly left the project and
its construction floundered. The first working von Neumann machine
was the Manchester "Baby" or Small-Scale Experimental
Machine, developed by Frederic C. Williams and Tom Kilburn at the University
of Manchester in 1948; it was followed in 1949 by the Manchester Mark 1 computer, a complete
system, using Williams tube and magnetic
drum memory, and introducing index
registers.The other contender for the title "first
digital stored program computer" had been EDSAC, designed and constructed at the University
of Cambridge. Operational less than one year after the
Manchester "Baby", it was also capable of tackling real problems.
EDSAC was actually inspired by plans for EDVAC (Electronic Discrete
Variable Automatic Computer), the successor to ENIAC; these plans
were already in place by the time ENIAC was successfully
operational. Unlike ENIAC, which used parallel processing, EDVAC
used a single processing unit. This design was simpler and was the
first to be implemented in each succeeding wave of miniaturization,
and increased reliability. Some view Manchester Mark 1 / EDSAC /
EDVAC as the "Eves" from which nearly all current computers derive
their architecture. Manchester University's machine became the
prototype for the Ferranti Mark 1.
The first Ferranti Mark 1 machine was delivered to the University
in February, 1951 and at least nine others were sold between 1951
and 1957.

The first
universal programmable computer in the Soviet Union was created by
a team of scientists under the direction of Sergei Alekseyevich Lebedev from
the Kiev Institute of
Electrotechnology in the Soviet Union (now in Ukraine). The computer MESM
(МЭСМ, Small Electronic Calculating Machine)
became operational in 1950. It had about 6,000 vacuum tubes and
consumed 25 kW of power. It could perform approximately 3,000
operations per second. Another early machine was CSIRAC, an Australian design that ran its first test
program in 1949. CSIRAC is the oldest computer still in existence
and the first to have been used to play digital music.

Commercial computers

In October 1947, the directors of J.Lyons &
Company, a British catering company famous for its teashops but
with strong interests in new office management techniques, decided
to take an active role in promoting the commercial development of
computers. By 1951 the LEO I computer
was operational and ran the world's first regular routine office
computer job. On 17 November 1951, the
J. Lyons company began weekly operation of a bakery valuations job
on the LEO (Lyons Electronic Office). This was the first business
application to go
live on a stored program computer.

In June 1951, the UNIVAC I (Universal
Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand
eventually sold 46 machines at more than $1 million each.
UNIVAC was the first "mass produced" computer; all predecessors
had been "one-off" units. It used 5,200 vacuum tubes and consumed
125 kW of power. It used a mercury delay line capable of storing
1,000 words of 11 decimal digits plus sign (72-bit words) for
memory. A key feature of the UNIVAC system was a newly invented
type of metal magnetic tape, and a high-speed tape unit, for
non-volatile storage. Magnetic media is still used in almost all
computers.

In 1952, IBM publicly announced the IBM 701
Electronic Data Processing Machine, the first in its successful
700/7000 series and its first
IBM mainframe computer. The IBM 704, introduced in 1954, used magnetic core
memory, which became the standard for large machines. The first
implemented high-level general purpose programming language, Fortran, was also being developed at IBM for the 704
during 1955 and 1956 and released in early 1957. (Konrad Zuse's
1945 design of the high-level language Plankalkül was not implemented at that
time.) A volunteer user group, which
exists to this day, was founded in 1955 to share their software and experiences with
the IBM 701.

IBM 650 front panel

IBM introduced a smaller, more affordable computer in 1954 that
proved very popular. The IBM 650 weighed
over 900 kg, the attached power supply weighed around
1350 kg and both were held in separate cabinets of roughly 1.5
meters by 0.9 meters by 1.8 meters. It cost $500,000 or
could be leased for $3,500 a month. Its drum memory was
originally 2,000 ten-digit words, later expanded to 4,000 words.
Memory limitations such as this were to dominate programming for
decades afterward. Efficient execution using drum memory was
provided by a combination of hardware architecture: the instruction
format included the address of the next instruction; and software:
the Symbolic Optimal Assembly Program, SOAP, assigned instructions
to optimal addresses (to the extent possible by static analysis of
the source program). Thus many instructions were, when needed,
already located in the next row of the drum to be read, and no
additional wait time for drum rotation was required.
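
The rotational-latency reasoning can be made concrete. In the Python sketch below, the figures (50 words per track, the chosen addresses) are illustrative assumptions rather than IBM 650 specifications; the point is that placing the next instruction just downstream of the read head avoids waiting nearly a full revolution.

    WORDS_PER_TRACK = 50

    def rotational_wait(current_pos, next_addr):
        # Words the drum must rotate past before next_addr reaches the head.
        return (next_addr - current_pos) % WORDS_PER_TRACK

    # Naive sequential placement: by the time word 10's instruction finishes
    # (head now at word 12), word 11 has already passed under the head.
    print(rotational_wait(current_pos=12, next_addr=11))  # 49 words of waiting

    # SOAP-style placement: put the next instruction a few words downstream.
    print(rotational_wait(current_pos=12, next_addr=15))  # 3 words of waiting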

IBM introduced its first magnetic
disk system, RAMAC (Random Access Method
of Accounting and Control) in 1956. Using fifty metal disks, with
100 tracks per side, it was able to store 5 megabytes of data at a cost of $10,000 per megabyte.

Second generation: transistors

The bipolar transistor was invented in
1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to
the "second generation" of computers. Initially the only devices
available were germanium point-contact transistors, which
although less reliable than the vacuum tubes they replaced had the
advantage of consuming far less power. The first transistorised computer was built at the
University
of Manchester and was operational by 1953; a second version was
completed there in April 1955. The later machine used
200 transistors and 1,300 solid-state diodes and had a power consumption of 150 watts.
However, it still required valves to generate the clock waveforms
at 125 kHz and to read and write on the magnetic drum memory, whereas the Harwell CADET operated without any valves by
using a lower clock frequency of 58 kHz when it became operational
in February 1955. Problems with the reliability of early batches of
point contact and alloyed junction transistors meant that the
machine's mean time between
failures was about 90 minutes, but this improved once the
more reliable bipolar
junction transistors became available.

Compared to vacuum tubes, transistors have many advantages: they
are smaller, and require less power than vacuum tubes, so give off
less heat. Silicon junction transistors were much more reliable
than vacuum tubes and had longer, indefinite, service life.
Transistorized computers could contain tens of thousands of binary
logic circuits in a relatively compact space. Transistors greatly
reduced computers' size, initial cost, and operating cost.

Typically, second-generation
computers were composed of large numbers of printed circuit boards such as the
IBM Standard Modular
System each carrying one to four logic
gates or flip-flops.

A second generation computer, the IBM 1401, captured about one
third of the world market. IBM installed more than ten
thousand 1401s between 1960 and 1964.

Transistorized electronics improved not only the CPU (Central Processing Unit), but
also the peripheral devices. The IBM 350 RAMAC was introduced in 1956 and was the
world's first disk drive. The second generation disk data storage units were able to store tens
of millions of letters and digits. Next to the fixed disk storage units, connected to the
CPU via high-speed data transmission, were removable disk data
storage units. A removable disk stack could be easily exchanged with
another stack in a few seconds. Even though the removable disks'
capacity was smaller than that of fixed disks, their
interchangeability guaranteed a nearly unlimited quantity of data
close at hand. Magnetic tape provided
archival capability for this data, at a lower cost than disk.

Many second generation CPUs delegated peripheral device
communications to a secondary processor. For example, while the
communication processor controlled card reading and punching, the main
CPU executed calculations and binary branch instructions. One databus would bear data between the main CPU and
core memory at the CPU's fetch-execute cycle rate, and other
databusses would typically serve the peripheral devices. On the
PDP-1, the core memory's cycle time was 5
microseconds; consequently most arithmetic instructions took 10
microseconds (100,000 operations per second) because most
operations took at least two memory cycles; one for the
instruction, one for the operand data
fetch.

During the second generation remote terminal units (often in the
form of teletype machines like a
Friden Flexowriter) saw greatly
increased use. Telephone connections provided sufficient speed for
early remote terminals and allowed hundreds of kilometers
separation between remote-terminals and the computing center.
Eventually these stand-alone computer networks would be generalized
into an interconnected network of networks—the
Internet.

Post-1960: third generation and beyond

The
explosion in the use of computers began with "third-generation"
computers, making use of Jack St. Clair
Kilby's and Robert Noyce's
independent invention of the integrated circuit (or microchip),
which later led to the invention of the microprocessor, by Ted Hoff, Federico
Faggin, and Stanley Mazor at Intel. The
Intel 4004 (1971) die was 12 mm², composed of 2,300 transistors; by
comparison, the Pentium Pro die was 306 mm², composed of 5.5 million
transistors.

An example of a highly integrated circuit is the
Intel 8742, an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and
I/O in the same chip.

During the 1960s there was considerable overlap between second and
third generation technologies. IBM implemented its IBM Solid Logic Technology
modules in hybrid circuits for the
IBM System/360 in 1964. As late as 1975, Sperry Univac continued
the manufacture of second-generation machines such as the UNIVAC
494. The Burroughs large
systems such as the B5000 were stack
machines, which allowed for simpler programming. These pushdown automata were also implemented
in minicomputers and microprocessors later, which influenced
programming language design. Minicomputers served as low-cost
computer centers for industry, business and universities. It became
possible to simulate analog circuits with the simulation
program with integrated circuit emphasis, or SPICE (1971) on minicomputers, one of the programs for
electronic design automation (EDA).

The
microprocessor led to the development of the microcomputer, small, low-cost computers that
could be owned by individuals and small businesses. Microcomputers,
the first of which appeared in the 1970s, became ubiquitous in the
1980s and beyond. Steve Wozniak,
co-founder of Apple
Computer, is
sometimes erroneously credited with developing the first
mass-market home computers.
However, his first computer, the Apple I,
came out some time after the MOS Technology
KIM-1 and Altair 8800, and the first
Apple computer with graphic and sound capabilities came out well
after the Commodore PET. Computing has
evolved with microcomputer architectures, with features added from
their larger brethren; microcomputers are now dominant in most market segments.

Systems as complicated as computers require very high reliability.
ENIAC remained in continuous operation from 1947 to 1955, eight
years, before being shut down. Although a vacuum tube might fail, it would
be replaced without bringing down the system. By the simple
strategy of never shutting down ENIAC, the failures were
dramatically reduced. Hot-pluggable
hard disks, like the hot-pluggable vacuum tubes of yesteryear,
continue the tradition of repair during continuous operation.
Semiconductor memories routinely have no errors when they operate,
although operating systems like Unix have employed memory tests on
start-up to detect failing hardware. Today, the requirement of
reliable performance is made even more stringent when server farms are the delivery platform. Google has managed this by using fault-tolerant
software to recover from hardware failures, and is even working on
the concept of replacing entire server farms on-the-fly, during a
service event.

In the twenty-first century, multi-core
CPUs became commercially available. Content-addressable memory (CAM)
has become inexpensive enough to be used in networking, although no
computer system has yet implemented hardware CAMs for use in
programming languages. Currently, CAMs (or associative arrays) in
software are programming-language-specific. Semiconductor memory
cell arrays are very regular structures, and manufacturers prove
their processes on them; this allows price reductions on memory
products. When CMOS field-effect transistor-based logic gates supplanted bipolar transistors,
computer power consumption decreased dramatically (a CMOS
field-effect transistor draws significant current only during the
transition between logic states, unlike the substantially higher
and continuous bias current draw of a BJT). This has allowed computing to become a commodity which is now ubiquitous, embedded in
many forms, from greeting cards and telephones to satellites. Computing
hardware and its software have even become a metaphor for the
operation of the universe. Although DNA-based computing and quantum
qubit computing are years or decades in the future, the
infrastructure is being laid today, for example, with DNA origami on photolithography.

An indication of the rapidity of development of this field can be
inferred from the history of the seminal article. By the time that
anyone had time to write anything down, it was obsolete. After
1945, others read John von Neumann's First Draft of a Report on
the EDVAC, and immediately started implementing their own
systems. To this day, the pace of development has continued,
worldwide.