I have always been fascinated by the forces within organisations that come together to create outcomes – especially catastrophic outcomes. I had a parallel career in crisis management, which gave me the opportunity to dream up terrible situations for my colleagues to cope with in live exercises. It helped us prepare for the unexpected. Now that I am continuing my crisis prevention work in the field of ethics, I focus on the human and cultural factors that cause ethical misconduct.

Astronaut Garrett Reisman

For one of my ‘Why Good People Do Bad Things’ sessions a couple of years ago I studied the destruction of the Space Shuttle Challenger – brought down by a seemingly tiny part: the O-rings.

At #SCCEecei I was fascinated to hear from Garrett Reisman, a real space man. He singled out the factors responsible for the Challenger explosion, and other tragedies, such as Apollo 1 and Columbia.

Garrett found that these tragedies had certain things in common:

Normalisation of Deviance – for some time, the mission teams had been getting away with things that should have been treated as problems. In the case of Apollo 1, it was using 100% oxygen at a certain pressure without igniting the capsule. In the case of Challenger, it was the fact that previous flights had shown O-ring erosion without incident; and foam had been falling off the space shuttle’s external tank since the very first flight – long before it punched a hole in Columbia’s wing.

Garrett concludes, “Just because you get away with something over and over again, doesn’t mean it is not a danger.” I tell my husband this regularly as he walks out our front door with earphones in, listening to a book on Audible. Like most humans, he feels secure, since he has never been hit by a car coming from the wrong direction or had his pocket picked (except by me once, to prove a point!).

I love the way Garrett put his second common factor, “None of us is as dumb as all of us”. This is another way of raising the spectre of groupthink, and the tendency of humans to want to belong, and therefore to conform even if they hold a different opinion.

Watch the Conformity Experiments carried out by Solomon Asch if you don’t believe that this is a powerful force. (I warn you the haircuts will take you back!)

The third learning has a couple of facets. The one closest to my heart is the importance of encouraging dissent.

In the case of Challenger, the engineers did not think it was safe to launch at such low temperatures, but they were literally filtered out of the call where the launch decision was being made. When running crisis management exercises, I regularly observed someone with correct information that ran counter to the “wisdom” of the group being drowned out or marginalised – with the result that the crisis team “decided” on strategies that made the situation worse.

A corollary of this is the importance of free and open communication – a topic we discuss at length in Ethical Business Practice and Regulation.

As Garrett pointed out, all of these situations occurred in an atmosphere of time pressure. Unrealistic goals, whether they be related to time or sales, are a catalyst for misconduct. So, watch yourself – it is so easy to fall into these traps (and others) without realising it.

Being self-aware as a leader, and aware of your organisational culture, can help prevent some very nasty consequences.

Learning from our mistakes, both individually and organisationally, is critical if we are to create effective ethical cultures – cultures that result in an acceptable level of risk.