What device did Douglas Engelbart invent?

In 1963, there was no Internet, computers were anything but personal, and BASIC still had that new-programming-language smell [source: CHM]. Yet, two decades before Apple rolled out the Macintosh, Douglas Engelbart, a researcher at the Stanford Research Institute (SRI) in Menlo Park, Calif., dreamed up a wired box that would point and click its way into computer history. In a testament to why electrical engineers need marketing departments, he dubbed his invention the "X-Y Position Indicator for a Display System." We know it as the computer mouse [source: MIT].

At the time, Engelbart's team was engaged in an ambitious effort to raise society's "collective IQ" through computers, and he needed quicker, finer control over what we now, tellingly, call the mouse pointer. Arrow keys were too slow and cumbersome; he needed something hand-sized, with perpendicular wheels to track fine movements. Engelbart discussed his idea with co-designer William English, who stuffed the guts of the prototype into a roughly 3 x 4 x 3-inch (7 x 10 x 8-centimeter) block of wood [sources: Alexander; Biersdorfer; CHM; DEI; Markoff].

The block rode atop two knife-thin wheels, one running left-right and the other tracking up-down. It sported a single button, not because Steve Jobs went back in time and said it should, but because only one microswitch would fit (Jobs didn't build his time machine until much later). Someone in his research group said it resembled a mouse, and the name stuck. A later, commercially produced version added two more buttons, rounded the case to a more familiar shape and moved the "tail" to the top to keep it out of the way [sources: Alexander; Biersdorfer; CHM; DEI; Markoff].

Using the first mouse was a bit like riding one of Dr. Dolittle's pushmi-pullyus, but NASA saw its potential. So did Xerox's Palo Alto Research Center, which in 1973 paired a three-button, ball-driven mouse with the Alto, the first small computer with a graphical user interface (GUI). Jobs visited the center in 1979, and both Apple and Microsoft would later pluck some of Xerox's Palo Alto researchers [sources: Alexander; Biersdorfer; Markoff].

Although the Alto never caught on commercially, the pattern had been set. With Apple's 1984 launch of the Macintosh and Microsoft's 1985 debut of Windows 1.0, the GUI had arrived, and the revolutionary transition from organizational mainframes to personal computers was in full swing. Graphics-driven machines and software helped propel the mouse -- now equipped with a grime-collecting rubber ball instead of two wheels -- to its present ubiquitous status [sources: Alexander; Biersdorfer].

But before the mouse roared, there was a man with a vision, and that vision extended far beyond a brick with a button. In 1950, Douglas Engelbart envisioned a wired world very like our own; because he couldn't see how to get there, he set about helping to invent it [source: Markoff].

Alternate view of Engelbart and English's computer mouse prototype. Check out those wheels.

PC Polymath

Although he was best known for the mouse, Engelbart pioneered a slew of personal computing and Internet technologies. More than that, he articulated a vision of an information society that we are only beginning to realize today [source: Markoff].

While the politicians of his childhood might have touted a chicken in every pot, Engelbart envisioned a computer terminal in every office, connected to a central computer through which workers could share data, files and ideas. This augury of the office network came to him in 1950, in an era of room-sized computers, vacuum tubes and punched-tape programming [sources: DEI; Markoff; MIT].

His stint as a radar technician in World War II had convinced him of the potential uses of screen displays, but how to get from massive corporate mainframes to a network of desktop terminals remained unclear -- until the integrated circuit debuted in 1959 [sources: CHM; Markoff; MIT].

Engelbart saw great potential in integrated circuits. He believed that the same principles of scaling that he had witnessed while working in aerospace research could be applied, in reverse, to scale down integrated circuits. He laid out his arguments in a 1959 paper, "Microelectronics and the Art of Similitude." Some argue that Gordon E. Moore was influenced by Engelbart's work in formulating his famous law, which states that the number of transistors on integrated circuits doubles approximately every two years [source: Brock; Markoff].

But it was Engelbart's belief that computers could improve our daily experience, add value to our work and boost our brainpower -- a phenomenon he called "bootstrapping" -- that truly set this electrical-engineer-turned-computer-scientist apart [sources: Flynn; Markoff].

The extent of Engelbart's vision and accomplishments became clear in his Dec. 9, 1968, demonstration at the Fall Joint Computer Conference held in San Francisco -- the famous "mother of all demos" in which he unveiled the computer mouse. The demo drew on the work of Engelbart's Augmentation Research Center, which was funded by the U.S. Department of Defense's Advanced Research Projects Agency (later known as DARPA) and which, the following year, became the second node on ARPANet, the packet-switching papa of the Internet [sources: DEI; UC Berkeley].

As he walked the audience through the work he and 17 researchers at SRI's Augmented Human Intellect Research Center had accomplished, he also lifted the curtain on early examples of videoconferencing, word processing, hypertext and networking -- the building blocks of his vision for boosting intelligence and productivity through computers [sources: DEI; Markoff; Stanford; UC Berkeley].

The Mouse That Roared

Today, it's hard to imagine that the mouse's future was ever uncertain. We sometimes forget that it once competed for dominance with the light pen, the trackball and stranger gadgets, such as knee-guides, foot pedals or helmet-mounted devices.

While light pens, trackballs and styluses persist and have resurged on tablets and smartphones, the mouse has multiplied and evolved. Optical and laser mice removed the troublesome ball, wireless mice untangled us and gyroscopic mice unchained us from our desks. Now, we stand on the cusp of a new generation of input devices that work off gestures, voice and, perhaps one day soon, thoughts.

The ABCs of Bootstrapping

Engelbart's vision for improving collective IQ involves raising awareness about challenges and then conquering them using a highly networked, collaborative approach to working, living and sharing information.

From AI to IA

Engelbart founded the precursor to the Augmentation Research Center at SRI the same year that mathematician John McCarthy established the Stanford Artificial Intelligence Laboratory. While McCarthy began the long climb toward artificial intelligence (he's the man who actually coined the term), Engelbart sought to use computers and computer networks to improve human productivity and expand access to information, a process known as intelligence augmentation (IA) [sources: Caruso; Markoff].

Science fiction authors have long explored lowering the barriers between human and machine in examples ranging from plugs that "jack" directly into virtual worlds of data, to "cyberbrains" -- surgically implanted computer cores linked to the worldwide network of information. But we need not go so far to imagine the ways that networked computers can change our lives. Search engines, especially Google, already provide instant access to global information (however dubious its source). Smartphones let users mine advice and opinions from the online community on the fly and will soon provide augmented reality.

All of this nibbles somewhat at the edges of the full potential of IA to, as Engelbart put it, "raise our collective IQ" and create "high-performance organizations." The problem, according to Engelbart and his colleagues, is that we lack a true study of the coevolution of people, computers and networks [source: Caruso].

Even as the Internet has evolved to insinuate itself into seemingly every aspect of our daily lives, it has failed to live up to Engelbart's vision of coevolution, of a system in which people could work together in a shared information space that enabled them to improve their work, as well as the process of improvement itself. Groupware -- group collaboration improved through software -- which Engelbart is credited with inventing, represents only the first faltering step in this direction [source: Caruso].

Through crowdsourcing, we begin to see programs that unite human capacities to recognize patterns or solve problems: Foldit uses a rules-based game that lets visitors help solve protein-folding problems; EteRNA lets users design synthetic RNA with potential applications in biology and nanotechnology; and Galaxy Zoo relies on users to classify a million-plus galaxies found by telescopes such as the Sloan Digital Sky Survey. Yet we still have not achieved Engelbart's vision, which he now pursues with his daughter through his nonprofit research organization, the Doug Engelbart Institute [sources: Caruso; DEI].

He'll have a tough row to hoe. Consortiums of businesses, especially ones dependent upon sharing trade secrets, tend to collapse under their own self-interest -- even when sharing would be to their mutual benefit. Still, if there's one thing the Internet Age has taught us, it's that you can't keep a powerful idea down [source: Caruso].

One of the initial ways he envisioned this process involved a coevolution between organizations and their tools that occurred on three levels, which he termed "A, B and C." "A" refers to the work a company does; "B" encompasses efforts to improve how A gets done, and "C" entails improving on B.

Engelbart believed this final level, C, which does not involve trade secrets per se, would be sharable among organizations and improve productivity across the board, but industry remains lukewarm on the idea [sources: Caruso; DEI].

Author's Note

Our data-soaked, gadgetized world instills in us a sensation of forward movement, of evolution. But, really, that feeling derives largely from marketing: The next big chipset, the next graphics card, the next smartphone seem to represent advancement, but mostly they are refinements of a constricted, highly lucrative road map.

It's useful to remind ourselves every once in a while that we might have arrived here, near here or someplace better via other paths -- trails blazed by philosophy, or marked by scientific study of human-computer interaction. Such approaches have played a role, certainly, and an important one, but so, too, have floundering and flailing, throwing this or that against the wall and seeing if it sticks.

Engelbart and pioneers like him remind us of the transformative power of vision. Is his vision achievable? Years ago, I might have said no. But, if the apps, shareware and freeware available on the Web, or the open source, crowdsource and copyleft movements have taught me anything, it's that if there's a good idea, someone out there will try it, even at the cost of potential profits. The cream, or something like it, will rise to the top.