The Conundrum of Close Calls: Lessons Learned for Securing Nuclear Weapons

A paper prepared for the Nonproliferation Policy Education Center conference on “Securing Nuclear Arsenals for the Next Half Century: What Does History Recommend?”

The papers prepared for this conference are valuable contributions to the literature on nuclear security, as they bring to light new evidence of instances when nuclear test sites, weapons in transit, and deployed weapons were threatened during times of political instability. The authors did not, of course, discover instances in which nuclear weapons were actually stolen or used by rogue officers, revolutionary mobs, or terrorists. So there is a significant puzzle about how best to interpret the “close call” incidents highlighted in these papers.

The most obvious learning problem with near-histories [is] the necessary ambiguity of interpretation…Every time a pilot avoids a collision, the event provides evidence both for the threat and for its irrelevance. It is not clear whether the learning should emphasize how close the organization came to disaster, thus the reality of danger in the guise of safety, or the fact that disaster was avoided, the reality of safety in the guise of danger.[1]

A “systems safety” approach to this conundrum, however, focuses not on the “inherent ambiguity” of nuclear close calls, but rather on three related details about the incident in question and the organization’s reaction to it afterward. First, how “close” was the “close call”? Can one assess the probability that the incident under investigation would have led, under somewhat different but plausible conditions, to the theft or use of a nuclear weapon? For example, if a nuclear power plant has five redundant safety devices and four fail, that is a closer call to an accident than if only two fail. Second, what “saved the day” and prevented unauthorized individuals from gaining control of a nuclear weapon? To the degree that the events were anticipated and appropriate safety mechanisms were therefore built into the system, the incident should be placed on the safety, not the danger, side of the ledger. To the degree that the events were not anticipated, however, and good fortune, not good design, saved the day, a more pessimistic assessment is warranted. Third, what was learned from the incident? If the organization appropriately adjusted its policies and procedures after a close call, one should predict that the likelihood of a recurrence has decreased. If not, then the likelihood of a second security incident with similar risks has not been reduced.[2]
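The intuition behind the first question can be made concrete with a minimal sketch, under the purely illustrative assumptions that safeguards fail independently and share a common failure probability (neither assumption holds for real nuclear security systems, where failures are often correlated):

```python
# Hypothetical sketch of "how close was the close call?" for a system of
# redundant, independent safeguards, each with failure probability p.
# All numbers are illustrative assumptions, not real-world estimates.

def residual_failure_prob(p: float, failed: int, total: int) -> float:
    """Conditional probability that the remaining safeguards also fail,
    given that `failed` of `total` independent safeguards have failed."""
    remaining = total - failed
    return p ** remaining

# The paper's example: a plant with five redundant safety devices.
p = 0.1  # assumed per-device failure probability (illustrative only)

four_failed = residual_failure_prob(p, failed=4, total=5)  # one barrier left
two_failed = residual_failure_prob(p, failed=2, total=5)   # three barriers left

# With four of five devices failed, only one barrier remains, so the
# conditional probability of disaster is orders of magnitude higher
# than with two failed (here, p**1 versus p**3).
assert four_failed > two_failed
```

The point of the sketch is only that "closeness" is a conditional probability: each failed safeguard removes a factor of protection, so an incident that exhausts four of five barriers is quantitatively, not just rhetorically, closer to disaster than one that exhausts two.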

This paper will address each regional study through this lens of “normal accident theory” and organizational learning and will draw broader lessons for the security of nuclear weapons and material in each case. Unfortunately, the project authors do not always focus on these three dimensions of the problem in their case studies. We have therefore provided our own assessment where evidence exists, or simply pointed out where more research is needed to provide an appropriate assessment.

Russia

Nikolai Sokov provides accounts of procedural dilemmas and some cases of near-losses of control over nuclear weapons. The stories are chilling, but we lack crucial details. Did the tactical weapons on the aircraft in Baku in 1990 have PALs (Permissive Action Links) on them, and if so, how effective were the devices? (Even the best PALs are not a panacea, for they only delay the ability to use the weapon or the material inside it.) As far as we can tell from the Sokov paper, the Soviet government had not pre-planned for the possibility of a large-scale insurrection on Azerbaijani territory, and the emergency withdrawal of the tactical weapons was an improvised security procedure. The Azerbaijan case also raises an important question about how to assess the probability of a system failure when there is deep uncertainty about what would have happened if the crowd had attacked the transport aircraft. A senior DOE official expressed his sense of the risks involved in such dangerous situations, and the need to avoid them, when he argued that “once the firefight starts, it is a crapshoot.”[3] The 1990 incident also reveals the paradox of how efforts to protect weapons – transferring them to safer locations – can create vulnerabilities. Finally, Sokov could trace whether there was “trial and error” learning from the incident and whether subsequent operations to remove nuclear weapons from former Soviet republics were conducted under more effective operational security.

Sokov’s research into the August 1991 coup attempt highlights the degree to which even sophisticated command and control systems are vulnerable to failure in times of political instability. His account implies that there was a combined technical and political “checks and balances” control arrangement under the Soviet Kazbek system: the General Secretary of the Communist Party held the authority and capability to launch nuclear weapons, on his own, if, but only if, the warning system had indicated that a U.S. attack was underway. If true, this description suggests: a) that the Soviet leader could not use nuclear weapons first; and b) that the system was highly vulnerable to the risk of coup leaders seizing and disabling Gorbachev’s Cheget, as occurred in 1991. The first point appears unlikely, however: according to David Hoffman, the General Secretary did have the authority to order a first strike.[4] The “check” in this case was that the General Secretary’s “permission order” would have to be transformed into a “direct command” by the General Staff.[5] There is still ambiguity in the 1991 case about who had the authority versus the capability to launch nuclear weapons. But if Hoffman is right, it seems plausible that the coup leaders could have launched nuclear weapons on their own during the three days in August. Sokov’s account demonstrates that the coup leaders were able to place elements of the Soviet nuclear arsenal in a heightened state of alert, but also suggests that personal intervention by some anti-coup officers (Varennikov) could countermand their orders to specific units. The 1991 case is thus an illustration of the precarious “always/never” balance, whereby the loss of checks and balances in a command and control system increases the likelihood of unauthorized use.

Pakistan

Feroz Khan identifies Pakistan’s Chief of the Army Staff (COAS) as a key stabilizing factor in the Pakistani nuclear weapons security system. Khan argues that the COAS played a key role in ensuring the undisrupted security of nuclear forces regardless of political posturing and changes of leadership. We accept the notion that nuclear security procedures can be bolstered by decreasing their dependence on what could be rapidly shifting and unpredictable political conditions. However, many other questions about nuclear security in Pakistan are not sufficiently addressed in the paper. Pakistan has reportedly received nuclear security assistance from the United States, but faces extreme challenges: the “vulnerability/invulnerability paradox” and insider threats.[6] We encourage Khan to address the following issues in subsequent revisions.

First, is there an enduring risk of an Islamist coup? In 1995, the Pakistani Army arrested 40 officers implicated in a coup plot led by Maj. Gen. Zahirul Islam Abbasi, who had alleged links to Islamic fundamentalist groups.[7] Another example came to light in June 2011, when Brig. Gen. Ali Khan was arrested in Pakistan on charges of suspected ties to Islamic fundamentalists.[8] If all authority and capability rest with the COAS, and that officer is replaced by a military leadership with radical fundamentalist beliefs, wouldn’t the lack of checks and balances actually make nuclear security more problematic in Pakistan?

Second, how effective is Pakistan’s Personnel Reliability Program (PRP) at addressing the insider threat? We lack evidence about the record inside the Pakistani Strategic Plans Division, but certainly the record inside Pakistani personal security organizations does not engender confidence. In January 2011, Punjab Governor Salman Taseer was assassinated by one of his own bodyguards, who later told police that he had murdered the governor for his opposition to Pakistan’s blasphemy law.[9] Two assassination attempts in 2003 against President Pervez Musharraf also involved insiders, though in these cases the security guards were not “lone wolves” but were tied to jihadist terror organizations.[10]

Third, how does Pakistan address the vulnerability/invulnerability paradox: when nuclear weapons are de-mated from delivery vehicles and locked inside a guarded facility, they are more secure from theft or seizure, but more vulnerable to an enemy strike; when nuclear weapons are taken out of the base, mated with road-mobile missiles, and dispersed into the country, they are less vulnerable to a first strike, but more vulnerable to theft.[11] The insider threat problem and this vulnerability/invulnerability paradox could be mutually reinforcing in a dangerous way: as jihadist groups become more active in Pakistan, the likelihood of an ISI-supported group attacking India might increase. That event, however, would then increase the likelihood that Pakistani weapons would be taken out of more secure locations on military bases and deployed into the field.[12]

Finally, how will the likely deployment of tactical nuclear weapons and short-range missiles in Pakistan influence nuclear security? This trend puts pressure on commanders to delegate authority to launch down the chain of command, or worse, to pre-delegate authority to launch. This is a situation that increases the risk of accidents and of unauthorized use.

China

We received this case study late and therefore can provide only a short assessment of the three factors discussed above. Mark Stokes brings to light examples of the fracturing of command and control systems during the Cultural Revolution, most notably the conducting of a risky nuclear missile test in October 1966. Stokes asserts that China learned from its experience during the Cultural Revolution and has prioritized security and safety over operational readiness, a situation which “could result in self-imposed constraints on the size of its arsenal.” This is a key finding, but it is unclear whether Chinese nuclear doctrine or security concerns are the driving force behind the limited size of the arsenal.

France

Bruno Tertrais presents new evidence in his case study of Algeria in 1961 and is careful not to overstate the vulnerability of the French nuclear device. This is laudable, but was this a case of danger in the guise of safety? An assessment should address whether redundant safety mechanisms were in place and whether the key ones were designed into the system or were unanticipated or improvised. The French case demonstrates, like others in this workshop, how the security of nuclear arsenals can depend on the personal loyalties of individuals in times of political instability. Thierry’s decision not to side with the coup leaders was critical to the protection and eventual destruction of the nuclear device. The lesson from the Algerian case is not that there was no danger of a nuclear weapon falling into the wrong hands, but rather that the choice of whether or not the nuclear device would fall into the wrong hands came down to Thierry, who actively considered the request of the rebellious generals.

The Algerian case brings us to another important insight: the nuclear device to be tested was just that, a device, not a bomb. Tertrais makes clear that the device could not have been detonated (at least not promptly) without the automatic arming mechanisms that were located at the testing tower. It is worth determining whether this setup was designed into the testing program to enhance nuclear security and safety, or whether it was done for reasons of technical convenience. It would also be helpful to know how often such redundancies were applied beyond the Algerian case.

Just the Beginning

These papers focus on the record of close calls from the past. But this safety record may be a less reliable guide for the future if new nuclear weapons states have inherently higher risk characteristics. Unfortunately, national risk indicators – World Bank governance indicators and Polity IV scores from 2009, and the 2012 NTI Nuclear Materials Security Index – suggest that dangers are growing. The chart below visually compares current nuclear weapons states and Iran on World Bank indicators of control of corruption and political stability, as well as Polity IV scores. Taken as a whole, the nuclear powers and Iran fail to inspire confidence in every category. But the data clearly suggest that newer nuclear states pose greater challenges on such risk factors.

The Nuclear Threat Initiative published its Nuclear Materials Security Index (below) in January 2012 and rightly weighted problems of political stability and institutional corruption heavily in its “societal factors” category.[13] One glance at the rankings reveals that there is nothing special about nations that actually possess nuclear weapons when it comes to nuclear materials security. Six of the nine nuclear powers rank in the bottom third of states with weapons-usable nuclear materials, and none ranks in the top ten. Of further note to this conference is that the overall score of every nuclear weapons state except North Korea is brought down by its score in the societal factors category, which includes political stability and corruption indices. And potential future proliferants (e.g., Iran) exhibit similar patterns.

Other Cases

This project has usefully encouraged research into events that have until now been shrouded in secrecy. There are, however, a number of other cases of nuclear “close calls” worth examining. First, an attack on the 42nd Field Artillery Brigade at the U.S. Army base in Giessen, West Germany on January 4, 1977 was reportedly carried out by the Revolutionary Cells (RZ), although the extent to which the stored nuclear weapons were targeted is unknown.[14] Second, the Japanese cult Aum Shinrikyo is known to have sought nuclear and biological weapons before settling for sarin gas. The extent of Aum’s penetration of the Russian military, and its efforts to acquire nuclear materials through that pathway, remains understudied.[15] Finally, in 1981 four members of the Red Brigades kidnapped Brigadier General James Dozier, NATO Deputy Chief of Staff at Southern European land forces. They held him for 42 days, during which they interrogated him about the location of and security measures for nuclear weapons. The details of their plans remain largely unexamined.[16]

[3] Scott D. Sagan, “New Ideas for Strengthening the Design Basis Threat (DBT) Process,” a report of the American Academy of Arts and Sciences’ workshop on strengthening physical protection of nuclear weapons and materials, January 12, 2009.

[6] Pakistan has received approximately $100 million worth of technical assistance since 2001, according to David Sanger of The New York Times. See David E. Sanger and William J. Broad, “US Secretly Aids Pakistan in Guarding Nuclear Arms,” New York Times, November 18, 2007.

[12] There are two extremes in public discussions of Pakistani nuclear security. The extreme western view claims that Pakistani forces are exceedingly vulnerable to seizure, exemplified by the National Journal article “Nuclear Negligence” by Jeffrey Goldberg and Marc Ambinder, who reported that nuclear materials in Pakistan were transported in unmarked civilian vans with negligible security in an attempt to hide their movements from the United States. (Jeffrey Goldberg and Marc Ambinder, “Nuclear Negligence,” National Journal, November 9, 2011. <http://www.nationaljournal.com/magazine/the-pentagon-s-secret-plans-to-secure-pakistan-s-nuclear-arsenal-20111104>.) At the other extreme, some Pakistani officials make absurdly optimistic claims about security. For example, the chairman of Pakistan’s National Engineering and Scientific Commission (NESCOM), Samar Mubarakmand, claimed: “I will put a nuclear weapon on the road, you can keep it there for 10 months and I guarantee you that no one can use it or detonate it or cause any destruction from it.” (Samar Mubarakmand, “Capital Talk Special,” Geo TV, March 5, 2012. <http://www.pakdef.info/forum/showthread.php?9214-Dr.-Samar-Mubarakmand-s-Interview-with-Geo-TV>, [Date Accessed: February 20, 2012].) Furthermore, in June 2011, Interior Minister Rehman Malik claimed that “[Pakistan’s] nuclear weapons are 200 percent safe.” (“Rehman Malik assures Pakistan nukes are 200 percent safe,” Jagran Post, June 5, 2011. <http://post.jagran.com/Rehman-Malik-assures-Pakistan-nukes-are-200-percent-safe-1307287729>, [Date Accessed: February 23, 2012].)

[15] For work on Aum’s biological and chemical weapons programs see Richard Danzig, et al., “Aum Shinrikyo: Insights into How Terrorists Develop Biological and Chemical Weapons,” Center for a New American Security Report, July 20, 2011.

The Nonproliferation Policy Education Center (NPEC) is a 501(c)(3) nonpartisan, nonprofit educational organization founded in 1994 to promote a better understanding of strategic weapons proliferation issues. NPEC educates policymakers, journalists, and university professors about proliferation threats and possible new policies and measures to meet them.