The Computer

As we continue to barrel through the information age, it is hard
to imagine conducting business without computers. Each day, millions of
people working in offices and homes around the world depend on computer
technology to do their jobs efficiently and economically. Truly understanding
the computer's history involves a daunting journey through mathematics,
physics, and electrical engineering; through binary code, Boolean logic,
real time, magnetic core memories, floating-point numerical notation, transistors,
semiconductors, integrated circuits, and much, much more.

Luckily, most office workers do not need to understand
this complex history to use computers and the dizzying array of software
programs they execute. When computers were first developed nearly fifty
years ago, the people who programmed them considered the task quite maddening.
Fortunately, learning to use a personal computer today is often as simple
as spending a few hours reading an instruction manual or following a hands-on tutorial.

In recent years, computer technology has been incorporated
into a wide range of consumer and industrial products. Computers are routinely
used in word processing, e-mail, video games, and other applications that
automate repetitive tasks.

What Is a Computer?

Emerging technologies are continually advancing the computer's
capacity and usefulness, making "the computer" a difficult term
to define. In the broadest sense, a computer is an information processing
machine. It can store data as numbers, letters, pictures, or symbols and
manipulate those data at great speeds by following instructions that have
been stored in the machine as programs.
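The idea that a machine can hold letters and symbols as numbers is easy to see in a few lines of Python (a modern illustration, not a description of any machine discussed here):

```python
# A computer stores every kind of data as numbers. Text, for example, is
# held as numeric character codes that a program can manipulate at speed.
message = "OFFICE"
codes = [ord(ch) for ch in message]        # letters stored as numbers
print(codes)                               # [79, 70, 70, 73, 67, 69]
restored = "".join(chr(n) for n in codes)  # numbers turned back into letters
print(restored)                            # OFFICE
```

The same principle extends to pictures and sounds, which are likewise reduced to numbers before a program manipulates them.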

Why Computers?

The first computers were not computers as we define them
today. They were calculators--machines designed to solve complex mathematical
problems--and they dramatically reduced the time it took people to solve
those problems by hand. One of the greatest mathematical nightmares of the precomputer
age was analyzing the U.S. population data collected by the Census Bureau.
The headcount itself took only a few months, but data analysis took years--and
by then, the information was outdated.

Various inventors built machines to speed up mathematical
computation. By 1941 a German engineer who hated engineering's mathematical
drudge work had developed fast but limited relay calculating machines used
in the German war effort.

In fact, military needs have played a major role in the
development of the computer. When the United States entered World War II,
the Ballistic Research Laboratory at Aberdeen Proving Ground had human "computers"--one
hundred (mostly female) college graduates who calculated the ballistic firing
tables that were used for accurate weapons aiming. It took about three days
to calculate a single trajectory, and two thousand to four thousand trajectories
were needed for each weapon.
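The arithmetic behind that workload is easy to check; a small sketch using only the figures quoted above:

```python
# Back-of-the-envelope check of the workload described in the text:
# roughly 3 days per trajectory, 2,000 to 4,000 trajectories per weapon.
days_per_trajectory = 3
trajectories_low, trajectories_high = 2000, 4000

low = days_per_trajectory * trajectories_low    # 6,000 person-days per weapon
high = days_per_trajectory * trajectories_high  # 12,000 person-days per weapon

# Even divided among 100 human "computers" working in parallel, a single
# weapon's firing tables could occupy the whole staff for months.
print(low // 100, high // 100)  # 60 to 120 working days per weapon
```

Figures like these make clear why the Army was willing to fund an enormous and unproven machine.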

The Army soon realized that its human "computers" could
not perform these calculations quickly enough. In 1945, the Army received
financial support to develop a huge machine called ENIAC (Electronic Numerical
Integrator and Computer), which weighed thirty tons, took up
1,800 square feet of floor space, and required six full-time technicians
just to keep it running. Thousands of times faster than any of its predecessors,
ENIAC demonstrated the unmistakable advantage of machine computing.

The UNIVAC, the first commercial computer system in America,
followed in the 1950s. Office workers became accustomed to the separate
areas--sometimes entire office floors--that housed the new machines and
the programmers and technicians who knew how to use them. Data processing
departments soon became commonplace.

Working with Computers

As their technical capacities increased from handling only
mathematical computations to manipulating words and other data, computers
began to change the way many businesses did their work. Crews of mostly
female keypunch operators, who put data into machine-usable form, became
a new class of low-skilled labor. Despite their increased role in the workplace,
computers were long considered strange and noisy machines housed in cold
rooms down the hall.

Technological advances did help make computers smaller,
faster, and extremely capable information handlers, but no more "friendly"
to most office workers. By the 1970s, integrated circuit technology made
producing a small and relatively inexpensive personal computer possible.
Yet even with this available technology, many computer companies chose not
to develop a personal computer. They could not imagine why anyone would
want a computer when typewriters and calculators were sufficient.

The Personal Computer

The first personal computer--developed by Digital Equipment
Corporation and Massachusetts Institute of Technology's Lincoln Laboratory
in 1962--was intended for a single researcher and cost $43,000. Later personal
computers were developed not by big corporations but by electronics buffs
who typically read about computers, sent away for instructions and materials,
and built them in their basements.

In 1976, two young college dropouts, Steve Wozniak and Steve
Jobs, founded the Apple Computer Company, which made affordable computers designed
for easy use. Eight years later, they introduced the Macintosh--a microcomputer
with an intuitive user interface including familiar icons and a mouse. Meanwhile,
Paul Allen and Bill Gates were busy with their new company, Microsoft. Microsoft's
DOS (introduced in 1981) and Windows (introduced in 1985) programs would
soon operate the majority of personal computers on the market.

Knowing how to program a computer is, for most users,
unnecessary because thousands of inexpensive programs--collectively known
as software--are available to perform almost
any imaginable task. Using built-in rules and procedures, these programs
offer fast and efficient ways to conduct business. Routine office tasks
once performed by hand--such as data storage, correspondence, research,
and report preparation--are now computer-driven to such an extent that office
typewriters, filing cabinets, and calculators are tools of the past.

Computer Networks

In most offices of the 1990s, personal computers are linked to
one another through internal--and often external--networks. This "networking"
allows employees to gather information from a vast array of outside sources
(particularly the World Wide Web) and to share it quickly with their colleagues,
outside business partners, and customers.

The network to which a personal computer is linked now
defines what it can do. Many different types of machines can be connected
to a single internal or external network, including mainframes to handle
large quantities of data and supercomputers designed for complex scientific
work. All of these computers are invisible to the networked user, who can
tap in and retrieve or process data that a personal computer by itself could
not handle.

External information networks are accessed through
modems (modulators/demodulators). Modems translate the digital language
of the computer code into analog signals, which can be sent across telephone
lines. The analog signals are then translated back into digital code for
use by the receiving computer.
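One simple modulation scheme, frequency-shift keying, represents each bit as a tone of a particular pitch. The sketch below illustrates the digital-to-analog-to-digital round trip in Python; the frequencies and rates are invented for the example and do not match any real modem standard:

```python
import math

# Illustrative frequency-shift keying: a 1 bit becomes one tone, a 0 bit
# another. Values below are chosen for the sketch, not from a modem spec.
MARK, SPACE = 1200.0, 2200.0   # Hz: tone for a 1 bit, tone for a 0 bit
SAMPLE_RATE = 8000             # analog samples per second on the "line"
SAMPLES_PER_BIT = 40           # duration of each bit's tone

def modulate(bits):
    """Digital bits -> a list of analog amplitude samples (the 'modem' half)."""
    samples = []
    for bit in bits:
        freq = MARK if bit else SPACE
        for n in range(SAMPLES_PER_BIT):
            samples.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
    return samples

def demodulate(samples):
    """Analog samples -> bits, by testing which tone fits each chunk best."""
    bits = []
    for i in range(0, len(samples), SAMPLES_PER_BIT):
        chunk = samples[i:i + SAMPLES_PER_BIT]
        scores = {}
        for bit, freq in ((1, MARK), (0, SPACE)):
            # Correlate the chunk against a reference copy of each tone.
            ref = [math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
                   for n in range(len(chunk))]
            scores[bit] = abs(sum(s * r for s, r in zip(chunk, ref)))
        bits.append(max(scores, key=scores.get))
    return bits

bits = [1, 0, 1, 1, 0]
assert demodulate(modulate(bits)) == bits  # round trip recovers the data
```

Real modems add framing, error correction, and far denser encodings, but the principle is the same: digital data rides the telephone line as sound and is recovered at the far end.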

Modems provide access to the most widely used external
information network--the Internet--which, in the late 1990s, reaches more
than twenty-five million computer users (an increase from 213 registered
computers in 1981). This represents considerable growth from the days of
ARPANET, the "Mother of the Internet," which began as a U.S. government
experiment linking researchers with remote computer centers to allow them
to share hardware and software resources.

As new technologies are developed, personal computers will
likely become even smaller in the future. They may also incorporate a greater
number of data input and output methods (e.g., voice commands), efficiently
interacting with one another because of greater software compatibility.
In addition, computer information networks in public places--which began
with the introduction of automated teller machines in the early 1970s--will
likely become quite commonplace as more and more daily business is conducted
electronically.