lection

command and control

4 april 2014

Half of Eric Schlosser's book Command and Control is a white-knuckle narrative of an accident at an Arkansas nuclear-missile base in 1980. One serviceman died in the explosion of a Titan II missile; several others were badly hurt; but the state of Arkansas is still there, as I can attest. And that result is rather amazing.

The "Damascus accident," as the 1980 Titan explosion is known, happened because a maintenance worker dropped a wrench socket onto the missile. A fuel leak ensued, and eventually the fuel met up with the supply of oxidizer that was supposed to help propel it toward the Soviet Union in case of war. The explosion threw the missile silo's immense concrete roof hundreds of yards away, propelling one of the most powerful thermonuclear warheads ever built out into the open. The weapon just sat there while all hell broke loose around it, and was eventually disarmed and carted away.

The military blamed the event on the technician who dropped the wrench, but Schlosser argues vigorously that such blaming is nonsensical. People drop wrenches all the time.

[Sociologist Charles B.] Perrow concluded that human error wasn't responsible for these accidents. The real problem lay deep within the technological systems, and it was impossible to solve. What appeared to be the rare exception, an anomaly, a one-in-a-million accident, was actually to be expected. It was normal. (460)

If a weapon that can kill millions is vulnerable to a dropped wrench, the answer cannot be "dropping wrenches is unacceptable." Dropping things is natural to humanity. If your car exploded every time you dropped your keys, the solution wouldn't be to get extra damn careful with your keys: it would be "redesign the car."

A theme in Command and Control, however, is that despite decades of accidents, some of them surreally severe, the U.S. strategic-arms community resisted redesigning its weaponry. Cost was a factor; so was the idea that the Soviets might annihilate us if we made it harder for nuclear weapons to detonate. (The Soviets themselves, equally vulnerable to accident, apparently took more precautions, though they may not have been working with systems as reliable as ours; their records remain more secret than ours, so nobody really knows.)

Schlosser's book is hardly neutral; no history of military mistakes can be. It's tempting to say that he's selective and has an axe to grind. But the harrowing stories of accidents and near-accidents that accumulate throughout Command and Control are well-documented (many of them via the Freedom of Information Act). Most of us are probably still under the impression that near-misses of nuclear war are the stuff of thrillers, even of satire. Schlosser's opinion is that Dr. Strangelove is "far more authentic" than most fictions and even than quite a few official press releases (297). Much official doctrine from the Cold War, including both superpowers' war plans and the "Perimeter" system that the Soviets really built and didn't tell us about (along the lines of Strangelove's Doomsday Machine, 467-68), is more absurd than any possible parody.

And of course the implication is that we don't know much about the absurdities that persist in a post-Cold-War world, where nuclear weapons still exist in large numbers, if not quite on such hair-trigger alert. We still have world-destroying weapons at the command of errant humans in mundane settings where much can go wrong. And we don't seem to be able to do much about the danger.

Schlosser, Eric. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. New York: Penguin, 2013.