Friday, June 25, 2010

Risky Business

Bucky Fuller was one of the first to observe and document the different "gestation periods" that industries exhibit when it comes to adopting innovations. He noted that the housing industry was among the slowest to innovate, while the aviation and aerospace industries constantly sought ways to enhance performance while reducing resource and energy inputs.

Apparently different industries look at risks through different lenses as well. (GW)

While there has never been an oil spill in the Gulf of Mexico quite as large as the current disaster, there have been other terrible mishaps and, as in every industry, near misses.

These close calls are what Scott Shappell, professor of industrial engineering at Clemson University, looks for when he works with airlines on quantifying their risk from human errors.

"All you hear about are crashes, but it's the near misses that are telling," Prof. Shappell says. "If you only knew how many near misses there are in aviation, you would never fly again."

Near misses can be studied by statisticians to estimate the probability of an event that hasn't occurred before. Estimating the probability of unlikely disasters has become standard practice for nuclear and space regulators. Such an exercise, experts say, could help companies involved in deep-sea drilling evaluate risks and possibly prevent catastrophes like the Gulf oil spill.
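One common way to use near misses this way is to treat them as precursor events: estimate how often precursors occur, then multiply by the chance that a precursor escalates into a full disaster. The sketch below illustrates that idea only; the function name and every number in it are invented for illustration, not drawn from any real aviation or drilling data.

```python
# Precursor-based sketch of rare-event estimation: observed near-miss
# rate times an assumed escalation probability. All figures invented.

def disaster_rate(near_misses, exposure_years, p_escalation):
    """Estimated disasters per year inferred from near-miss counts."""
    precursor_rate = near_misses / exposure_years  # near misses per year
    return precursor_rate * p_escalation

# Hypothetical: 40 near misses over 10 operating years, and an assumed
# 1-in-500 chance that any given near miss escalates into a disaster.
rate = disaster_rate(near_misses=40, exposure_years=10, p_escalation=1 / 500)
print(rate)  # 0.008 expected disasters per year
```

The point of the exercise is that the disaster itself never has to occur for the estimate to exist; the near misses carry the statistical information.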

Nuclear-power companies have used a tool for quantifying risk known as probabilistic risk assessment, or PRA, since the 1970s; the National Aeronautics and Space Administration adopted the process in the 1980s. PRA takes into account the many possible failures of machines and people. It can be used to set a ceiling on allowable risk, and to identify and repair trouble spots.

"It's been really handy," says Robert Doremus, manager of the NASA shuttle program's safety and mission assurance office. "It has helped us step back and say, where are our high-risk areas?"

BP PLC didn't analyze its drilling in this way before the Deepwater Horizon explosion created a massive oil spill in the Gulf of Mexico, says spokesman Jon Pack. Instead, the company concluded, based on the history of drilling in the region, that a blowout was a "low-likelihood" event. "We don't grade the risk in terms of percent likely to happen," Mr. Pack says. "It's not quite as scientifically hugely detailed." Minerals Management Service, which oversees offshore drilling, didn't respond to requests for comment.

The probability estimate is just one piece of a risk assessment. The other is quantifying just how much damage a rare event, like a massive oil spill, would cause. "What is important is, what is the damage?" says Manfred Gilli, an economist at the University of Geneva. "If something happens with high probability but just a few fishes die, well, sorry for the fishes."
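Gilli's point is the textbook definition of risk as probability times consequence. A minimal expected-loss comparison makes it concrete; the two scenarios and all dollar figures below are hypothetical, not estimates for any real spill.

```python
# Risk pairs a probability with a consequence. Hypothetical scenarios:
# a frequent small spill versus a rare catastrophic one.
scenarios = {
    "small spill":        (0.10, 5e6),    # (annual probability, damage in $)
    "catastrophic spill": (0.001, 20e9),
}

expected_loss = {name: p * damage for name, (p, damage) in scenarios.items()}
for name, loss in expected_loss.items():
    print(f"{name}: ${loss:,.0f} expected loss per year")
```

With these invented numbers, the catastrophic spill is 100 times less likely yet carries 40 times the expected loss, which is exactly why the damage term cannot be ignored.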

The nuclear-power industry takes a somewhat different approach, one that analyzes the engineering of the plant and tabulates every possible series of unfortunate events that could lead to the release of dangerous radioactive material, including equipment failure, operator error and extreme weather. Statisticians tabulate the probability of each disastrous scenario and add them together.
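The tabulation described above can be sketched in a few lines: each disaster scenario is a chain of events, its probability is the product of the per-event probabilities (assuming, as a simplification, that the events are independent), and the scenario probabilities are summed. Every event name and probability below is invented for illustration.

```python
from math import prod

# Sketch of PRA-style scenario tabulation. Each scenario is a chain of
# failures that together lead to a release; figures are invented.
scenarios = [
    ["pump failure", "backup fails"],
    ["operator error", "alarm missed"],
    ["flood", "seal fails", "backup fails"],
]
event_p = {
    "pump failure": 1e-3, "backup fails": 1e-2,
    "operator error": 5e-3, "alarm missed": 1e-1,
    "flood": 1e-4, "seal fails": 1e-2,
}

# Scenario probability = product of its event probabilities (independence
# assumed); total = sum over all scenarios.
total = sum(prod(event_p[e] for e in chain) for chain in scenarios)
print(total)
```

Real PRA models use fault trees and event trees with dependencies between events, so this independent-events sketch only captures the overall shape of the calculation.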

For the 104 nuclear reactors in operation in the U.S., the probability of core damage ranges from "several chances in a thousand per year of operation for a reactor, all the way down into the one-in-10-million-per-reactor-year range," says Scott Burnell, spokesman for the Nuclear Regulatory Commission. Core damage doesn't guarantee a release of radioactivity; the agency's goal is to keep the chance of a release at each plant below a one-in-a-million long shot.

While the probability of any single reactor melting down in any one year may be low, the chance of a nuclear accident somewhere at some time is far higher. The same applies to the BP spill.
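The compounding at work here is simple to state: if each unit-year carries probability p of an event, the chance of at least one event across n independent unit-years is 1 − (1 − p)^n. The per-reactor-year probability below is illustrative, chosen from the optimistic end of the NRC range quoted above.

```python
# Fleet-wide compounding of a low per-unit probability. The value of p
# is illustrative, not an NRC figure for any actual reactor.

def any_failure(p_per_unit_year, units, years):
    """P(at least one event), assuming independent unit-years."""
    return 1 - (1 - p_per_unit_year) ** (units * years)

# 104 reactors over 40 years at one-in-10,000 per reactor-year.
print(any_failure(1e-4, units=104, years=40))  # roughly 0.34
```

Even at one chance in ten thousand per reactor-year, a large fleet operating for decades accumulates a substantial probability of at least one event somewhere.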

For NASA, the risks are much higher. The agency puts the chance of any one shuttle mission ending in disaster—loss of crew and vehicle—at one in 89. That's better than the demonstrated reliability of one in 66—two shuttle disasters, Challenger and Columbia, in 132 completed missions, as Mr. Doremus notes.
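The arithmetic behind those two figures is worth spelling out. The demonstrated rate is simply 2 failures in 132 missions; the Laplace-smoothed rate shown alongside it is a standard small-sample adjustment (the rule of succession), included here only for comparison; it is not NASA's PRA method, which models specific failure modes rather than raw counts.

```python
# Demonstrated shuttle failure rate versus a simple smoothed estimate.
failures, missions = 2, 132

demonstrated = failures / missions          # 2/132 = 1 in 66
laplace = (failures + 1) / (missions + 2)   # rule of succession, 3/134

print(1 / demonstrated)  # 66.0
print(1 / laplace)       # roughly 44.7
```

NASA's modeled 1-in-89 figure comes out more optimistic than either count-based number because it is built from engineering analysis of the vehicle, not from the two losses alone.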

Yet the estimated risk for shuttles has crept up at times over the years, even as NASA has worked on safety, because the agency has incorporated more possible risks and changed the statistical formula it used. The latest calculation, produced by a team of NASA staffers, takes into account the risk that orbiter-flight software might fail or that a bad event could unfold during a space walk. That creates the seemingly paradoxical situation in which missions might get safer even as reported risk increases.

There also is some danger in producing risk assessments that are too precise. "Most laypeople want a single number," says Todd Paulos, chief technology officer for Alejo Engineering Inc. in Huntington Beach, Calif., but "we can't predict anything to that accuracy."

The wild card in all these estimates is human error. Nuclear-power companies for years have sought to calculate and reduce risk by evaluating how people respond to simulations of potentially dangerous situations.

In light of the spill in the Gulf, it wouldn't be surprising if drilling companies examined their own risk levels. "Now there is going to be a burst of PRA in the oil business," Dr. Paulos predicts. Of course, "doing it after the fact doesn't help you before the fact."