While many are happily seduced by the wonders and innovations of our contemporary high-tech life, others see danger lurking in our ever-growing reliance on digital technology. Nicholas Carr has a solid piece in a recent issue of The Atlantic about the hazards of progressive automation. One major development he explores is the unintended consequences of airplane autopilot systems. Carr discusses two recent fatal crashes: a Continental Connection flight from Newark to Buffalo that killed all 49 passengers and crew, and an Air France flight from Rio de Janeiro to Paris that crashed into the Atlantic, killing all 228 on board. In both cases the autopilot disconnected, forcing the pilots to take manual control. And in both cases the pilots reacted with the wrong action, causing their planes to lose lift, stall, and crash. So it seems that while autopilot systems have contributed to greater air safety over time, they have also contributed to pilot errors and new types of accidents.

Studies show that pilots, and others whose work has been largely automated, become complacent. Workers develop a kind of blind confidence that computers will operate perfectly, an attitude that fails to acknowledge the danger that increasingly complex computer systems, as they interact with one another, may malfunction. Workers, in effect, become computer monitors, Carr argues. They become less aware of the processes they oversee and often less attentive to the tasks they actually have to do. Automation can also make workers just plain rusty at performing ordinary tasks, so that when the computer system malfunctions or fails, workers make mistakes. Skills decline when they go unpracticed, and workers can actually forget how jobs are supposed to be done. “Knowing,” Carr reminds us, “requires doing.” When workers are separated from the work, ends are achieved without anyone grappling with the means. “Computer automation severs the ends from the means,” Carr explains. And he claims “it’s the work itself—the means—that make us who we are.”

Automation, in effect, changes who we are. We become passive, unengaged “creatures of the screen.” I recall the overwhelming public embrace of Chesley “Sully” Sullenberger after his spectacular and highly skilled landing of a US Airways plane on the Hudson, which saved the lives of all 155 people on board. He was proclaimed a “hero” and showered with honors. But I do not think it was simply “The Miracle on the Hudson” that drew people’s attention to him and made him into a popular hero. Rather it was the back story: how he had been a strong advocate for safety all his life, how he maintained his own skills and practiced alertness, how he understood the limitations of the automated systems he used, and, above all, how he worked hard to live with the integrity, humility, and value system that defined his life and his work. The reviewer of Sully’s autobiography in The Washington Post summed up public perception well: “Sullenberger’s all-American life story is so compelling that it screams to be required reading for all young people, or anybody else who needs confirmation that courage, dignity and extraordinary competence can still be found in this land.... [A] remarkable life story.”

Carr’s question in the end is the right one: “Does our essence still lie in what we know, or are we now content to be defined by what we want?” Are we to become “creatures of the screen,” or are we to maintain our full humanity, each of us heroes in our own way, by continuing to know and to learn by doing rather than letting our machines work on the assumption that the human being is probably the weakest link in any given system?