Eric Schlosser, Bard of Folly

It took decades after the invention of nuclear weapons for today’s taboos against them to take hold. Some witnesses to the first nuclear explosions apprehended their horror immediately. Some planners, civilian and military, fell in love. In the 1950s and 1960s, the U.S. built nuclear reactors in Iran, Pakistan, and dozens of other countries; in the 1960s and 1970s, the Atomic Energy Commission made plans to use nuclear explosions to dig a canal in Nicaragua and carve a pass-through in the California mountains for Interstate 40. Influential strategists like Herman Kahn were enthralled by the potential of nuclear weapons to reshape the world. On Thermonuclear War, Kahn’s best-known book, contains scenarios for how nuclear weapons would figure not only in World War III but also in World Wars IV, V, VI, and VII.

All too often, the history of nuclear weapons has been told as a history of those schemes, a history of plans for wars that never took place. The genesis of nuclear weapons has also been a locus of fascination, for good reason. But these two threads leave out a crucial history: that of the tens of thousands of actual weapons, which, like any other technological artifacts, differed in reality from their idealization. Eric Schlosser’s Command and Control fills that gap, telling the story of the weapons in their silos and aboard bombers. (Schlosser says little about submarines, the third part of the so-called nuclear triad.)

Schlosser does not dwell on the nuclear weapons the United States has blown up in 1,054 tests since 1945, but instead focuses on the bombs that almost went off but didn’t. “The need for a nuclear weapon to be safe and the need for it to be reliable were often in conflict. A safety mechanism that made a bomb less likely to explode during an accident could also, during wartime, render it more likely to be a dud,” he explains. That conflict is the central one in Command and Control. As Schlosser acknowledges, no American bomb ever exploded accidentally. But he tells us just how close we came. The repeated narrow escapes that Schlosser enumerates occurred because military planners took reckless chances for decades.

On May 21, 1946, Louis Slotin, one of the designers of the first atomic bomb, was lowering a beryllium shell—which reflected neutrons and so brought plutonium closer to a chain reaction—over a sphere of plutonium. The screwdriver he was using to hold the beryllium slipped, “the core went supercritical, and a blue flash filled the room.” Slotin’s parents were flown in to say goodbye, and his death throes were filmed, a mournful lesson in what radiation can do to a man. As Schlosser notes, thousands in Hiroshima and Nagasaki died in similar agony.

Schlosser’s story extends from the mid-1940s to the end of the Cold War. He documents just how slow and contingent—and perhaps avoidable—the buildup of nuclear weapons was. He describes an April 1947 meeting in the White House in which Harry Truman learns the top-secret size of America’s available nuclear arsenal: one. The weapons still under development were haphazard devices:

The Mark 3 bomb had a number of inherent shortcomings. It was a handmade, complicated, delicate thing with a brief shelf life. The electrical system was powered by a car battery, which had to be charged for three days before being put into the bomb. The battery could be recharged twice inside the Mark 3, but had to be replaced within a week—and to change the battery, you had to take apart the whole weapon. The plutonium cores radiated so much heat that they’d melt the explosive lenses if left in a bomb for too long. And the polonium initiators inside the cores had to be replaced every few months.

Early nuclear weapons were particularly vulnerable to accidental detonation. To mitigate that risk, nuclear cores—the plutonium or uranium that would undergo nuclear fission, releasing tremendous amounts of energy—were generally kept separate from the explosives that would detonate them until the last possible moment.

Nuclear cores are no longer stored separately in any of the 2,150 or so nuclear warheads the United States currently has deployed. But today’s weapons are safer than those of the 1950s. The technology to make them safer—better electronic detonators and more stable explosives, less vulnerable to accidental detonation—existed for many years before weapons designers at the Sandia National Laboratory succeeded in integrating it into weapons. Military planners used risky techniques for decades, willing to chance an accidental detonation rather than make it harder for a bomb to go off when it was supposed to.

Schlosser notes that one government study found that “at least 1,200 nuclear weapons had been involved in ‘significant’ incidents and accidents between 1950 and March 1968.” Some of the accidents were minor. Schlosser takes the reader on a well-paced tour of the more important ones: B-52 crashes in Kentucky in 1959, California in 1961, and near Palomares, Spain, in 1966. Another crash blanketed the ice outside Thule Air Force Base in Greenland with radioactivity in 1968. In an aerial refueling that went wrong in 1961, near Goldsboro, North Carolina, as the B-52 spun around, “centrifugal forces pulled a lanyard in the cockpit,” which triggered the arming mechanism for the bombs:

The barometric switches closed. The timer ran out, activating the high-voltage thermal batteries. The bomb hit the ground, and the piezoelectric crystals inside the nose crushed. They sent a firing signal. But the weapon didn’t detonate. Every safety mechanism had failed, except one: the ready/safe switch in the cockpit.

But for that one switch, a large hydrogen bomb would have exploded in North Carolina, blanketing Washington, D.C., and much of the Eastern Seaboard with radioactive fallout. Schlosser’s eye for the surreal and for detail keeps the large number of nuclear accidents he covers from feeling repetitive. He uses one accident in particular to discuss the ongoing technical and cultural questions surrounding nuclear weapons and their launch systems. In the early morning hours of September 19, 1980, a Titan II missile exploded in its silo outside Damascus, Arkansas, after a dropped socket pierced the missile’s fuel tank and caused a leak. As early as 1967, the Pentagon had announced that the Titan II missile in question “was no longer needed and would be decommissioned.” Bureaucratic infighting, not necessity, was what had kept the temperamental missile in service for years. Schlosser chronicles the infighting with sympathy and brio, admirably restraining any stridency, which, though merited, would undercut his argument.

While the turn toward nuclear weapons might appear to be a break with Schlosser’s past work on fast food, agriculture, and illegal drugs, a thematic unity runs through his writing. In all his books, vivid individuals in a particular time and place create a new technology. As those technologies grow to industrial scale, they become forces unto themselves—with a streak of cruelty. Meatpacking plants ruin the bodies of workers on their assembly lines. Toxic chemicals leaking from missiles fill repairmen’s lungs with fluid.

Jeff Kennedy, one of the technicians on the refueling team for the Titan II, is a hero of Schlosser’s story. Kennedy knows the Titan II inside out. But prior to the explosion, as dangerous pressure builds inside it, he’s forced to watch while distant supervisors dither; he ends up blown 150 feet into the Arkansas air. As Schlosser’s examples illustrate, the questions of when to trust the man in the field and when to trust centralized control are not simple. How, Schlosser asks, did a blast that powerful not set off the nuclear warhead atop the rocket? The explosion was a technological and human failure. The warhead’s durability was a technological and human success.

Another Schlosser hero, Bob Peurifoy, rises through the ranks at the Sandia National Laboratory, eventually becoming a vice president of the lab, which by then was part of the Department of Energy. Throughout his career, Peurifoy agitates for greater safety measures. He lobbies the military to put systems into bombs that would require a unique sequence of electrical pulses to explode the bomb, instead of a constant current that could be set off by accident. Until Peurifoy succeeds, “the lock had been placed on the bomber, not inside the bombs—and a stolen weapon could still be detonated with a simple DC signal.”

But a technological fix is not in itself sufficient. Schlosser recounts an anecdote from the 1970s: “[When] a coded switch was finally placed in the control center of every SAC ballistic missile,” the Air Force, in an act of defiance, chose the same combination for every missile: 00000000. As Schlosser notes, “Denying the safety problems only made them worse.” To this day, the Navy has resisted replacing the volatile explosives in the warheads of some of its submarine-based missiles with more stable alternatives.

The lies and obfuscations of the Air Force about nuclear accidents over the years reveal a military leadership smugly confident in its ability to determine, in secret, what is best for the nation. They also reveal alarming gaps in the leadership’s strategic approach. Schlosser quotes General William Odom describing his reaction when, as a military assistant on the National Security Council staff, he first learned details of the SIOP, or “Single Integrated Operational Plan”—the master plan for how America’s nuclear arsenal would be used: “It was just a huge mechanical war plan aimed at creating maximum damage without regard to the political context. I concluded that the United States had surrendered political control over nuclear weapons to a deterministic theory of war.”

The great flaw of Command and Control is that it more or less ends in the early 1980s, shortly after Peurifoy boards a plane to Arkansas, joining the rest of Schlosser’s sweeping history with the close narrative of the Damascus accident. Schlosser makes brief mention of the Soviet Union’s Perimeter doomsday machine (built between 1974 and 1985 and chronicled at length by David Hoffman in The Dead Hand), which was meant to automatically launch nuclear weapons in response to an American strike but whose deterrent value was (as in Stanley Kubrick’s Dr. Strangelove) wasted because it was kept secret.

Schlosser touches on the 2007 scandal in which six nuclear-armed cruise missiles were accidentally flown from North Dakota to Louisiana on a B-52 (the scandal wasn’t so much the flight as the fact that the weapons went missing for a day and a half before anybody noticed) and makes passing mention of a 2010 incident in which one-tenth of America’s land-based nuclear weapons went off-line for almost an hour. The Air Force says that no hacking occurred; the incident was the result of a “circuit card improperly installed in one of the computers during routine maintenance.” This could be true. But Schlosser’s documentation of dishonesty on the Air Force’s part, during the Damascus accident and many others, raises a question he does not address: if the missiles had been hacked, would the Air Force admit it?

Schlosser glances at the $180 billion the United States plans to spend over the coming decade on its existing stockpile. But he fails to break that figure down in any meaningful way. For instance, the Air Force is planning to spend a billion dollars on a tail kit that would make the B61 bomb more accurate, a pursuit at odds with high-level Pentagon policy that nuclear weapons should be made less central to “U.S. national security strategy.”

Of course, times have changed. The military realized there was no point in destroying everything and gave up planning to do so. Schlosser’s story of how nuclear safety was gradually improved by gadflies like Peurifoy is reassuring. It shows that insiders committed to the civic good can bring the national-security establishment back from the brink, even after many billions of dollars are wasted.

But Schlosser also reminds us that secrecy can distort design tradeoffs in a technology’s nascent phases, and then entrench those distortions. Moreover, the assumption that key weapons are under civilian control—to be given to the military only in the event of an attack—can become a legal fiction, not so unlike the legal fictions surrounding National Security Agency surveillance that we are learning about today. Zeal in the pursuit of tactical advantage remains dangerous. At the height of the Cold War, the Army purchased 2,100 nuclear bazookas, each likely to kill any soldier who used one even if it worked as designed. Are the backdoor vulnerabilities the NSA has reportedly inserted into Internet security the modern-day analogue of those bazookas, unable to discriminate between friend and foe?

The last American nuclear test took place early in the morning of September 23, 1992, about 100 miles northwest of Las Vegas. That final test, called “Divider,” was about the same size as Little Boy, the bomb that killed hundreds of thousands of civilians in Hiroshima. Since 1992, the U.S. has followed a self-imposed moratorium on nuclear testing; it has signed, but not ratified, a treaty banning nuclear tests. Despite the decades without testing, the U.S. maintains, at great cost, around 5,000 nuclear warheads (of which slightly under half are deployed). Of these, a few hundred are at bases in Belgium, Italy, the Netherlands, Germany, and Turkey, according to the Federation of American Scientists. Five hundred sit atop missiles in underground silos scattered across Montana, North Dakota, and Wyoming, and 1,152 are in submarines based in Georgia and Washington state (about 20 miles from Seattle). About 300 warheads are assigned to B-2 and B-52 bombers based in North Dakota and Missouri. The bulk of the remainder—the largest single concentrations of nuclear weapons in the country—are stockpiled underground at Kirtland Air Force Base in New Mexico and Nellis Air Force Base in Nevada. In 2010, the Air Force reprimanded the squadron in charge of more than 2,000 nuclear weapons at Kirtland for reasons it has yet to make public.

The Department of Energy, which bears official responsibility for the maintenance of nuclear-weapons technology, has built several of the world’s most powerful supercomputers in order to simulate nuclear explosions and obviate the need for testing. Robert Meisner, a Department of Energy official in charge of these simulations, says that computer simulation is joining experiment and theory as a “third critical leg of science.”

The Energy Department’s computers are indeed more powerful than they were a generation ago. They are better able to simulate the dynamics of a nuclear explosion, when a bomb’s conventional explosives are set off, squeezing its plutonium core until its density provokes a chain reaction. But more powerful parallel processors cannot model the unknown shortcomings of an Air Force squadron at Kirtland.

“The fallibility of human beings guarantees that no technological system will ever be infallible,” Schlosser writes. Throwing increased computing power at that fallibility does not eliminate it. Without testing, we can’t be certain that computer programs accurately reflect the physical world.

The strategic coherence of nuclear deterrence was oversold even at the Cold War’s height. Its virtues in the present are even less apparent. The U.S. has no great need for weapons it never intends to use; any enemy who cannot be deterred by America’s overwhelming conventional military might is simply not deterrable. Nonetheless, if America is to have nuclear weapons as anything other than totemic bulwarks against a narrative of national decline, they must inevitably one day be tested again. In this, hawks like Senator Mike Lee of Utah who wish to retain America’s arsenal and also the right to test it are at least consistent.

In a geopolitical climate in which America is seeking to persuade Iran not to develop nuclear weapons and to contain North Korea’s small nuclear program, testing a nuclear weapon after such a long hiatus would be provocative and destabilizing. In a June speech in Berlin, President Barack Obama reaffirmed his goal of reducing America’s nuclear arsenal to zero. But that goal seems unlikely to be met. The de facto compromise we are left with—a large, aging arsenal that hasn’t been tested for decades and likely won’t be for many more—is deeply flawed. The claims of some in the Department of Energy to be able to assure reliability through simulation alone are simply not borne out by the experience of any other complex technological artifact. America’s nuclear weapons are now principally a threat, not to Russia or China, not to Iran or North Korea, but to ourselves.

About the Author

Konstantin Kakaes, author of The Pioneer Detectives: Did a Distant Spacecraft Prove Einstein and Newton Wrong?, is a Future Tense fellow at the New America Foundation writing about science and technology, and is the former Mexico City bureau chief for The Economist. His work has been published in The Wall Street Journal, Foreign Policy, and The Washington Post and appears frequently in Slate. He was a 2010 Knight Science Journalism Fellow at the Massachusetts Institute of Technology. Before becoming a journalist, he studied physics at Harvard University. He lives in Washington, D.C.