The 25th and last flight of the shuttle Endeavour has come and gone. Which means there’s just one shuttle flight left: July 8’s Atlantis launch will be the 135th and final mission for the program, 30 years after the first shuttle test flights occurred.

Shuttle liftoff from Kennedy Space Center, FL (Photodisc)

For anyone who was around on Tuesday, January 28, 1986, it’s difficult to watch a shuttle launch without remembering the Challenger disaster, when the space shuttle disintegrated 73 seconds after launch, killing all seven crew members. While the most commonly cited explanation for what went wrong focuses on the technological failures associated with the O-rings, an examination of the decision process that led to the launch through a modern-day “behavioral ethics” lens illuminates a much more complicated, and troubling, picture, one that can help us avoid future ethical disasters.

On the night before the Challenger was set to launch, a group of NASA engineers and managers met with the shuttle contracting firm Morton Thiokol to discuss the safety of launching given the low temperatures forecast for launch day. The engineers at Morton Thiokol had documented problems with O-rings in 7 of the past 24 shuttle launches and saw a connection between low temperatures and O-ring problems. Based on this data, they recommended to their superiors and to NASA personnel that the shuttle should not be launched.

According to Roger Boisjoly, a former Morton Thiokol engineer who participated in the meeting, the engineers’ recommendation was not received favorably by NASA personnel. Morton Thiokol managers, noting NASA’s negative reaction to the recommendation not to launch, asked to meet privately with the engineers. In that private caucus, Boisjoly says, his superiors were focused on pleasing their client, NASA. This focus prompted an underlying default of “launch unless you can prove it is unsafe,” rather than the typical principle of “safety first.” (See Boisjoly’s detailed account of what happened, from the Online Ethics Center for Engineering.)

The engineers were told that Morton Thiokol needed to make a “management decision.” The four senior managers present at the meeting, against the objections of the engineers, voted to recommend a launch. NASA quickly accepted the recommendation, leading to one of the biggest human and technical failures in recent history.

An examination of this disaster through a modern-day “behavioral ethics” lens reveals a troubling picture of an ethical minefield loaded with blind spots that are eerily similar to those plaguing contemporary organizational and political decision processes:

To those of us who study behavioral ethics, the statement, “We need to make a management decision” is predictably devastating. The way we construe a decision has profound consequences, with different construals leading to substantially different outcomes. Framing a resource dilemma in terms of social rather than monetary goods, despite identical payoffs, produces substantially greater resource preservation. Ann and her colleague Dave Messick, from Northwestern’s Kellogg School of Management, coined the term “ethical fading” to describe the power that the framing of a decision can have on unethical behavior.

We found that when individuals saw a decision through an ethical frame, more than 94% behaved ethically; when individuals saw the same decision through a business frame, only about 44% did so. Framing the decision as a “management decision” helped ensure that the ethics of the decision—saving lives—were faded from the picture. Just imagine the difference if management had said “We need to make an ethical decision.”

Despite their best intentions, the engineers were at fault too. Their analysis centered on the relationship between O-ring failures and temperature. Both NASA and Morton Thiokol engineers examined only the seven launches that had O-ring problems. No one asked to see the temperature data for the 17 previous launches in which no O-ring failure had occurred. Examining all of the data shows a clear connection between temperature and O-ring failure, with a resulting prediction that the Challenger had greater than a 99% chance of failure. These engineers were smart people, thoroughly versed in rigorous data analysis. Yet, they were bounded in their thinking. By limiting their examination to a subset of the data—failed launches—they missed a vital connection that becomes obvious when you look at the temperatures for the seven prior launches with problems alongside the 17 prior launches without problems.
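The selection bias described above is easy to reproduce. Here is a minimal sketch using hypothetical temperatures (not the actual launch record), but preserving the article’s split of 7 problem launches and 17 clean ones: restricted to the failures alone, temperatures look scattered and uninformative; pooled across all 24 launches, the cold-weather risk is unmistakable.

```python
# Hypothetical pre-Challenger launch record (illustrative temperatures only):
# 7 launches with O-ring problems, 17 without, as in the article.
failures  = [(53, 1), (57, 1), (58, 1), (63, 1), (70, 1), (70, 1), (75, 1)]
successes = [(66, 0), (67, 0), (67, 0), (67, 0), (68, 0), (69, 0), (70, 0),
             (70, 0), (72, 0), (73, 0), (76, 0), (76, 0), (78, 0), (79, 0),
             (80, 0), (81, 0), (82, 0)]
all_launches = failures + successes

def rate(launches):
    """Fraction of launches in the list that had an O-ring problem."""
    return sum(problem for _, problem in launches) / len(launches)

# The engineers' view: failures alone, where temperature shows no clear pattern.
failure_temps = sorted(t for t, _ in failures)
print("Problem-launch temps:", failure_temps)  # scattered from 53 to 75 °F

# The full-data view: split ALL 24 launches at 65 °F.
cold = [l for l in all_launches if l[0] < 65]
warm = [l for l in all_launches if l[0] >= 65]
print("Failure rate below 65 °F:", rate(cold))     # 1.0 -- every cold launch
print("Failure rate at/above 65 °F:", rate(warm))  # 0.15
```

With only the failure subset on the table, the 53–75 °F spread suggests nothing; it is the cold launches’ perfect failure record, visible only in the full dataset, that carries the signal.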

Chances are that the reward system at Morton Thiokol also contributed to the disaster. Most likely, managers at Morton Thiokol were rewarded for pleasing clients, in this case NASA. When it became clear that NASA didn’t like the “don’t launch” recommendation, Morton Thiokol managers consciously (or subconsciously) realized that their reward was in jeopardy. And they reacted to protect that reward. This works out well for the behaviors that are rewarded; not so well for behaviors—such as safety or ethical decisions—that are not.

Management teams at Morton Thiokol and NASA appear to have utilized a powerful but deadly technique, which we refer to in the book as the “smoking gun.” They insisted on complete certainty that low temperatures and O-ring failure were related; certainty that was impossible, given not only the engineers’ bounded analysis described above, but also the practical impossibility of ever obtaining such conclusive data.

Smoking guns are used to prevent change and reinforce the status quo. The tobacco industry used it for years, insisting that the evidence linking smoking to cancer was inconclusive. We see the same strategy used today by climate change deniers. In the case of Morton Thiokol, the smoking gun was particularly effective because it was used in combination with a very specific status quo: “Launch unless you prove it is unsafe to do so.” And that combination proved to be deadly.

There are parallels between the fatal Challenger launch decision and more “ordinary” unethical behavior in corporations, politics and in society. We see them when we look at the way decisions are framed: “No harm intended, it’s just business,” or “That’s the way politics operate.” We see similarities in the limits of analysis, examining the legal but not the ethical implications of a decision. We see the power of rewards on Wall Street, where shareholder value is focused on to the exclusion of nearly everything else. And we see the smoking gun strategy utilized over and over, from corporations claiming that “the impact of a diverse workforce is decidedly mixed,” to politicians and Supreme Court justices claiming there is no clear evidence to suggest that their financial and social relationships bias their judgments. If we realize the power of these hidden forces and identify our blind spots, we can usually stop ethical disasters before they launch.

D Smeaton

June 1, 2011 @ 3:06pm

Another interesting aspect of this case is the political decision that led to the boosters being built by Morton Thiokol in the interior of the country rather than in a coastal city. That decision resulted in the engineering design of a segmented booster using O-rings so the boosters could be shipped by rail, rather than a single solid booster with no O-rings that could be shipped in one piece by barge to Florida. Thus the accident was also a result of the political decision on where the booster was built, rather than basing the decision on the best design.

caleb b

June 1, 2011 @ 3:06pm

The wording of ‘climate change denier’ is carefully designed. It could just as easily be ‘climate change doubter’, but the word ‘denier’ makes it sound more like ‘Holocaust denier’, which implies that if you doubt one, you probably doubt the other and are therefore evil.

Climate change doubter encompasses a wide variety of people. Some doubt that man has as much impact on the environment as they are told. Some think that the problem cannot and will not be solved because the GLOBE would need to coordinate efforts, and that would be impossible.

And some people are like me…I think that humans are very similar to cockroaches, in that we can survive in almost any environment. So if we end up making our environment more difficult to live in, so what? Isn’t it our planet to ruin anyway?

Wait, wait….if you listen closely you can hear steam coming out of the greenies’ ears now.


James

June 1, 2011 @ 4:36pm

Except that the only climate change "doubters" are those who don't trouble to understand the science, or who deliberately refuse to look at it, because they don't WANT it to be true. The parallelism between them and the tobacco & Holocaust deniers is apt and accurate.

jblog

June 1, 2011 @ 4:57pm

I don't doubt climate change, but I do doubt the dire consequences of it that some climate change zealots espouse, which are not supported by science.

And I suspect they doubt them as well, or they would change their own behavior.

Want me to really believe it's as bad as you say it is? Give up all forms of fossil-fuel-based energy -- right now.

To quote one of my favorite Zen philosopher-poets: I'll believe it's a crisis when the people who say it's a crisis act like it's a crisis.

John B

June 1, 2011 @ 4:44pm

Great article on the need for ethics in decision-making.

BUT a bad analogy: "We see the power of rewards on Wall Street, where SHAREHOLDER (my emphasis) value is focused on to the exclusion of nearly everything else."

Actually, shareholder value is the item most generally excluded. Boards of directors provide for outrageous management compensation, golden parachutes, stock options, and insider dealings---all at the expense of the shareholders.

The writer shows a lack of knowledge or bias in making such a statement.

Joel Upchurch

June 1, 2011 @ 5:16pm

From my point of view, the problem was that the decisions were made by people who didn't understand the risks involved. In a rational universe, the decisions should be made by the people with the most information. In this case, we ended up with people who didn't understand the o-rings falling back on things they did understand, which was the need to keep the customer happy.

Bethany B

June 1, 2011 @ 8:38pm

One man's comfort is another man's hell. It is pure hubris to think that humans can stop global climate change.

Joshua Northey

June 2, 2011 @ 3:00am

A) I am sort of struck by the fact that the world isn't perfect. Given what they are trying to do, the space program seems relatively safe. Sure there are mistakes and poor decision making. Welcome to every single institution on earth! I understand the desire to improve, but in some ways I feel the "safety first" culture actually increases bad decision making. Because "safety first" is an unreasonable standard in a lot of endeavors, and a frankly silly one in space. Safety is one concern among many, and by always trying to pretend it is "first" you risk losing touch with its proper, practical place.

B) If you don't think the climate is changing you don't follow science at all. At the same time, if you DO follow science closely, there is a pretty clear and consistent pattern of exaggeration and sensationalization of the likely effects of climate change. Some of this is probably intentional fear mongering, as the public seems so unmoved by even these alarmist scenarios. But some of it has to do with the fact that the people likely to study and find data relating to climate change are generally the people who most value those things in their current state.


JJS3

June 2, 2011 @ 3:14pm

In response to "A": I don't think the author of the article is trying to re-prioritize safety in the space program, although I believe that for manned flight, safety should have a high priority. Of course space travel is inherently dangerous, and if we required absolute safety, we would not do it.

The article's point was to analyze what factors contributed to an erroneous decision.

Allen

June 2, 2011 @ 3:05pm

I liked the rewards driven decision element. I learned in the Army that the troops do what the commander checks. In this case, rewards given or withheld are the check in their check-and-balance system. Obviously it didn't work. This would imply a review of the decision process at both NASA and Morton Thiokol. I wonder what the internal process review (if any) looked like and what the decision process changes were.

yo

June 4, 2011 @ 12:28pm

Climate change deniers?

Funny.

The people who went to great lengths to "hide the decline" in temperatures, call their opponents "climate change deniers".

It would be funny if it weren't so catastrophic to our country.

Milan

June 7, 2011 @ 5:14pm

I think what is even more telling is society's decision to not show support for those who operate with an ethical lens. The all-important whistleblower law has not yet passed due to inertia, apathy, and downright bad decision making by our lawmakers.

John B.

July 5, 2011 @ 6:39pm

My recollection, as a former journalist who covered the aftermath of the disaster, was that Larry Malloy, who ran the solid rocket booster program, unilaterally overrode the concerns of the Morton-Thiokol engineers and authorized the launch. Although M-T management and engineers met privately to discuss the engineers' concerns, Malloy was well aware of those concerns, as the engineers had spoken directly to him about them. They told him that an o-ring failure due to cold air temperatures could cause "a catastrophic loss of life." He dismissed that warning and authorized the launch.