Personal Computing

Current personal computers are very different from the contraptions of World War II. By the 1970s, people could purchase unassembled “microcomputers” and program them, but the machines were still a hobbyist’s novelty. Today, personal computers, sophisticated software, and games are available everywhere. At home and at work, we rely on PCs for nearly everything we do.

Early computers were enormous, expensive, and required teams of operators to run properly. The famous Electronic Numerical Integrator and Computer (ENIAC), built during World War II, cost $500,000, weighed 30 tons, and covered nearly 2,000 square feet.

One significant invention that ushered in the PC revolution was the microprocessor. Developed in 1971 by Intel engineer Ted Hoff, the 1/16-by-1/8-inch chip, called the 4004, packed roughly the same computing power as the room-sized ENIAC.

This innovation made computers far smaller and cheaper to manufacture, and the “microcomputer,” the predecessor of the “personal computer,” was born.

In 1975, MITS hired Bill Gates, then a Harvard student, and his friend Paul G. Allen to adapt the BASIC programming language for the Altair. The software made the computer far easier to use. In April 1975, the two programmers took the money they made from “Altair BASIC” and formed Microsoft.

In 1976, two engineers, Steve Jobs and Stephen Wozniak, built a homemade computer that would change the game. The Apple I was more sophisticated than the Altair, with more memory, an inexpensive microprocessor, and a monitor with a screen. In April 1977, Jobs and Wozniak introduced the Apple II, which added a keyboard and a color screen. The company’s programmers created “applications” that made the Apple practical for ordinary people and businesses.

Soon companies like Xerox, Tandy, Commodore, and IBM entered the market, and computers became ubiquitous in society. Innovations like the graphical user interface allowed users to select icons on the computer screen instead of typing commands. Today, laptops, smartphones, and tablet computers let us carry a PC everywhere we go.