03/14/2011

The Japanese Nuclear Power Plant Crisis and Evaluating High Cost/Low Risk Events

The problems at Japanese nuclear power plants in the wake of the tsunami are likely to put a serious dent in the nascent political efforts to restart the nuclear power industry. The accident doesn't change the science. But it does change the politics. It does so because people make systematic errors in assessing what Richard Zeckhauser and Cass Sunstein aptly called "fearsome risks."

Such risks, which usually involve high consequences, tend to have extremely low probabilities, since life today is no longer nasty, brutish and short. We aim to show here that in the face of a fearsome risk, people often exaggerate the benefits of preventive, risk-reducing, or ameliorative measures. In both personal life and politics, the result is damaging overreactions to risks.
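The expected-value logic behind "fearsome risks" can be made concrete with a toy calculation. The probabilities and harm figures below are illustrative assumptions of mine, not numbers from Zeckhauser and Sunstein:

```python
# Toy illustration of why low-probability/high-consequence ("fearsome")
# risks can carry less expected harm than mundane ones.
# All numbers here are made up for illustration.

def expected_harm(probability: float, consequence: float) -> float:
    """Expected value of a risk: probability times magnitude of harm."""
    return probability * consequence

# A fearsome risk: catastrophic consequence, vanishingly rare.
fearsome = expected_harm(probability=1e-6, consequence=10_000)

# A mundane risk: modest consequence, far more common.
mundane = expected_harm(probability=1e-2, consequence=10)

# The mundane risk is ten times worse in expectation, yet the fearsome
# one is the one that dominates headlines and policy.
print(f"fearsome: {fearsome:.3f}, mundane: {mundane:.3f}")
```

The point of the sketch is only that magnitude and probability must be multiplied, which is precisely the step the "dumb, panicky" public skips.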

This happens because of a systematic error most uninformed members of the public make when it comes to such risks. To quote Agent Kay, "A person is smart. People are dumb, panicky dangerous animals and you know it."

The predictable result of the Japanese accidents thus is that people will freak out again, just as they did after Three Mile Island, overestimating the risks and ignoring the benefits. Zeckhauser and Sunstein opine:

Overreaction to risk is frequently found in the environmental realm. A dramatic example is provided by the Three Mile Island accident in 1979. Significant amounts of reactor coolant were leaked to the environment, including releases of cancer-inducing agents. The Kemeny Commission Report, created under Presidential order, concluded that in expectation less than one case of cancer would be created. Yet the accident affected public and political opinion sufficiently to terminate the construction of any new nuclear plants in the United States for 30 years. The coal- and oil-fueled plants built in their stead surely caused many more health problems, looking only at the air pollution they produced. (Today, nuclear power is poised to make a comeback given concerns about global warming due significantly to CO2 emissions from conventional power plants. [Ed.: Or it was, until this episode.]) The impact of Three Mile Island was reinforced due to the release of the movie The China Syndrome the same month as the accident. The movie made a catastrophic accident – narrowly avoided due to courageous action of the movie’s heroes -- “available” in the public mind. When an example of an event can readily be brought to mind, it is judged to be much more likely (Tversky and Kahneman, 1973 and 1974).

Now the Japanese reactor accidents will have precisely the same effect. People will panic and ignore the very real benefits of nuclear power. For example, you want to reduce your carbon footprint by buying a plug-in hybrid? What about the carbon emitted by the oil- or gas-powered electric plants that produced the electricity that flows through your plug? Nuclear power offers an alternative to carbon-based fuels that could provide, at the very least, a transition power source as we search for the Holy Grail of cost-effective renewables.

The politicians know--or at least should know--better. Hell, Cass Sunstein is a top Obama adviser. But Obama hasn't led on Libya, health care reform (left it to Congress), fiscal reform (ditto), or practically anything since he took office. Why should we expect anything different on nuclear power? Which is not to say that the rest of the political class--Republican or Democratic--is likely to do better. They simply don't have the stones to fight the kind of massive, systemic error in risk evaluation that's about to unfold in front of them.

Zeckhauser and Sunstein put the political problem less polemically, but make basically the same point:

If probability neglect characterizes individual judgment under certain circumstances, government and law are likely to be neglecting probability under those same circumstances. If people show unusually strong reactions to low-probability catastrophes, a democratic government is likely to act accordingly, either because it is responding to the public, or because its officials suffer the same proclivities.

More likely the former in this case.

What to do? Back to Zeckhauser and Sunstein:

The government should not swiftly capitulate if the public is demonstrating action bias and showing an excessive response to a risk whose expected value is quite modest. A critical component of government response should be information and education. But if public fear remains high, the government should determine which measures can reduce it most cost effectively, almost in the spirit of looking for a placebo that may do little for risk but do a lot to reduce fear. Valued attributes for such measures will be high visibility, low cost, and perceived effectiveness. Reducing fear offers two major benefits: (1) Fear itself imposes significant costs. (2) Both private and public responses in the face of fearsome risks are likely to be far from rational.

It'll take leadership. Unfortunately, as I noted above, leadership is not an attribute I find to be very common amongst our modern governing class.

When our emotions overtake our reasoning we worry about sensational events which are statistically unlikely to harm us — such as airline disasters, shark attacks, or terrorism — rather than everyday dangers that kill thousands. John Graham, who spent four years as administrator of the federal Office of Information and Regulatory Affairs, says news of SUV tire failures left him besieged with demands for tire pressure warning systems even though government reports listed 41 car-crash deaths per year due to under-inflated tires, versus 9,800 deaths from side-impact crashes. "People's capacity to visualize a risk is an important part of the attention they give to it," says Graham. "If you're within six months of a Three Mile Island, a Love Canal, or a 9/11, the policymakers and the public don't have the patience for the kind of cerebral risk analysis we need."
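Graham's numbers make the mismatch easy to quantify. A quick sketch using only the figures quoted above:

```python
# Ratio of the two annual death tolls Graham cites: 41 deaths/year from
# under-inflated tires vs. 9,800 deaths/year from side-impact crashes.
tire_deaths_per_year = 41
side_impact_deaths_per_year = 9_800

ratio = side_impact_deaths_per_year / tire_deaths_per_year
# Side-impact crashes kill roughly 240 times as many people per year,
# yet it was tire failures that drew the regulatory demands.
print(f"side-impact deaths are ~{ratio:.0f}x tire-related deaths")
```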

That falls in line with what Princeton professor Daniel Kahneman coined "the availability heuristic": the concept that if people can think of an incident in which a risk has come to fruition, they will exaggerate its likelihood. "Somehow the probability of an accident increases [in one's mind] after you see a car turned over on the side of the road," says Kahneman, who won a 2002 Nobel prize for his work. "That's what availability does to you: it plants an image that comes readily to mind, and that image is associated with an emotion: fear."

Update: I'm having a flame war on Twitter with a tweeter who claims that we're dealing here with one of Nassim Taleb's black swans. But so what? As has been observed in defense of the Bataan reactor in the Philippines:

A nuclear accident, such as those mentioned above, is a “black swan” or an outlier, an event that lies beyond the realm of normal expectations, as Nassim Taleb puts it in his book of the same name. Key to reducing, if not eliminating, the fear of nuclear power is understanding the risk associated with it. Why are we willing to risk our health, the environment, and our very existence with the use of fossil fuels over an improbable catastrophic nuclear meltdown of a reactor?

Taleb, in his book, says the focus of the investigation should not be on how to avoid any specific black swan, for we don't know where the next one is coming from. The focus should be on what general lessons can be learned from them.

And indeed we have learned. It is estimated that the probability of a plant having a serious flaw decreased from 0.1 to 0.01 during the developmental phase of the nuclear industry. Over the same period, the frequency of accidents decreased from 0.04 per reactor-year to 0.0004 per reactor-year, according to a 1984 study by Jussi Vaurio.
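Taking the Vaurio figures quoted above at face value, the improvement is easy to sanity-check. The fleet size and time horizon below are hypothetical choices of mine, purely for scale:

```python
# Accident-frequency figures as quoted in the text (Vaurio, 1984).
early_rate = 0.04     # accidents per reactor-year, early in the industry
mature_rate = 0.0004  # accidents per reactor-year, after design learning

improvement = early_rate / mature_rate
print(f"roughly {improvement:.0f}-fold reduction in accident frequency")

# Expected accidents for a hypothetical fleet of 100 reactors over 20 years:
reactor_years = 100 * 20
expected_accidents = mature_rate * reactor_years
print(f"~{expected_accidents:.1f} expected accidents over {reactor_years} reactor-years")
```

A hundredfold drop in frequency means that even a large fleet running for decades accumulates less than one expected accident, which is exactly the kind of number the post-accident panic never engages with.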

Another positive development comes from a player in the nuclear industry: Thorium Power, based in Russia, is currently qualifying its proprietary thorium fuel designs (thorium is a silvery metal thought to be three to four times more abundant than uranium) for use in existing and future commercial nuclear reactors. These designs have three major benefits: no production of nuclear weapons-usable materials in spent fuel, reduced nuclear waste, and improved industry operating economics. The technology is projected to be commercially available in 2013.

For 40 years, Japan's nuclear plants have quietly done their work. Three days ago, they were hit almost simultaneously by Japan's worst earthquake and one of its worst tsunamis. Not one reactor container has failed. The only employee who has died at a Japanese nuclear facility since the quake was killed by a crane. Despite this, voices are rising in Europe and the United States to abandon nuclear power. Industry analysts predict that the Japan scare, like Chernobyl, will freeze plant construction. ...

If Japan, the United States, or Europe retreats from nuclear power in the face of the current panic, the most likely alternative energy source is fossil fuel. And by any measure, fossil fuel is more dangerous. The sole fatal nuclear power accident of the last 40 years, Chernobyl, directly killed 31 people. By comparison, Switzerland's Paul Scherrer Institute calculates that from 1969 to 2000, more than 20,000 people died in severe accidents in the oil supply chain. More than 15,000 people died in severe accidents in the coal supply chain—11,000 in China alone. The rate of direct fatalities per unit of energy production is 18 times worse for oil than it is for nuclear power.

Even if you count all the deaths plausibly related to Chernobyl—9,000 to 33,000 over a 70-year period—that number is dwarfed by the death rate from burning fossil fuels. The OECD's 2008 Environmental Outlook calculates that fine-particle outdoor air pollution caused nearly 1 million premature deaths in the year 2000, and 30 percent of this was energy-related. You'd need 500 Chernobyls to match that level of annual carnage. But outside Chernobyl, we've had zero fatal nuclear power accidents.

That doesn't mean we can ignore what has happened in Japan. Precisely because nuclear accidents are so rare, we have to study them intensely. Each one tells us what to fix in the next generation of power plants. The most obvious mistake in Japan was parking the diesel generators in an area low enough to be flooded by a quake-driven tsunami. The batteries that backed up the generators weren't adequate, either. They lasted only eight hours, and power outage fallback plans at U.S. reactors are even shorter. Moreover, this is the second time an advanced nuclear facility has had to vent radioactive vapor (Three Mile Island was the first). Maybe it's time to require filtration systems that scrub the vapor before it's released.

Sen. Joe Lieberman of Connecticut says we should "put the brakes" on nuclear power plant construction until we figure out what went wrong in Japan. Rep. Ed Markey of Massachusetts wants a moratorium on new reactors in "seismically active areas" while we study the problem. That's fine. But let's not block construction indefinitely while we go on mindlessly pumping oil. Because nuclear energy, for all its risks, is safer.
