To err is human – so why won’t we learn from our mistakes?

The search for Malaysian Airlines Flight MH370 goes on. According to Angus Houston, the retired Air Chief Marshal coordinating the hunt, it could take two years to locate the wreckage. The phrase of choice is “guarded optimism”, but one thing is certain: if and when the mission finally succeeds, even if the black box recorders aren’t found, lessons will be learned.

This, after all, is the nature of the commercial aviation industry. Air travel is the safest form of mass transportation. We’re far more likely to die on a train or a passenger boat. We’re safer sitting in an aeroplane than in a car. And it gets better year on year. A generation ago fatalities occurred around once in every 140 million miles flown; since then there’s been a tenfold improvement.

In short, this is an industry that learns from its mistakes. It’s no exaggeration to say that every plane crash makes the next flight safer.

One could argue, of course, that plane crashes tend to be high-profile and costly – both in terms of money and in terms of lives lost – and that consumer pressure alone is sufficient to compel the industry to take heed. But it really wasn’t until the last quarter of the 20th century that the industry as a whole learned one of the most valuable lessons of all: that to err is human.

It was particularly galvanised by a series of disasters in the 1970s. Carefully studied and dissected, these illustrated that most tragedies were caused not by technical faults but by people.

Take Eastern Air Lines Flight 401, which plunged into the Florida Everglades on December 29 1972 after every member of the flight crew became preoccupied with a landing-gear indicator light that hadn’t illuminated. Nobody noticed that the autopilot had inadvertently been disengaged and that the plane was gradually descending to its doom. In fact, there was nothing wrong with the landing gear. The indicator light’s bulb had simply burnt out. More than a hundred people died.

The subsequent introduction of crew resource management (CRM), a sweeping new set of principles and procedures, represented a pivotal chapter in aviation history. Mistake acknowledged; lesson learned; problem solved. This ethos endures and will no doubt be reflected again if and when the mystery of Flight MH370 is at last unravelled.

Now compare this approach with the culture of so many other domains, whose response to failure is habitually characterised by stout denial, buck-passing and, after a convenient period of silence, a conspicuous keenness to “move on”. These stages are customarily encapsulated in the following hollow pronouncements:

Denial

“That’s not what happened.”

“I would dispute those figures.”

“I can’t comment on individual cases.”

Buck-passing

“We’re going to get to the bottom of this.”

“Lessons will be learned.”

“I can’t comment while the inquiry is ongoing.”

Move on

“It’s a very different situation these days.”

“It’s time to draw a line.”

“There’s no point in dwelling on the past. We need to look forward.”

Even if culpability is ultimately established, it’s usually the quality and sincerity of the apology that are deemed paramount. Often the scapegoats identified at the end of a protracted process are no longer even there: they might have “moved on to new challenges” or taken early retirement to “spend more time with their family”.

An alternative and equally contemptible philosophy is to avoid such drawn-out, debilitating sagas by imposing swift and terrible punishment on those who dare to blunder. Strict guidelines are instituted, and woe betide anyone who doesn’t follow them to the letter. Key Performance Indicators are introduced, and anyone who doesn’t measure up doesn’t progress. This is sometimes called Prozac leadership: fear of failure is all-pervasive but disguised by relentless optimism.

As the Enron scandal illustrated in spectacular fashion, the cost can be high. Enron exercised zero tolerance of failure with its “rank-and-yank” management policy. Shareholders lost $74 billion when the corporation went bust; the CEO got 24 years in prison. Ruthless monitoring of KPIs encouraged corrupt and dysfunctional practices.

Metrics have their place. In many instances, as the old adage says, you can’t manage what you don’t measure. You can’t tell if things are getting better if you don’t have a basis for comparison. But it’s important to keep a sense of proportion when enforcing targets.

On April 25 2005 a seven-carriage commuter train carrying around 700 passengers derailed in Amagasaki, Japan, and smashed into an apartment building. As with Eastern Air Lines Flight 401, more than a hundred people were killed; around 550 were injured. Most witnesses agreed the train had been travelling very fast.

The official inquiry concluded the driver had been trying to make up lost time. Lateness would have brought him severe penalties – not just a financial sanction but attendance at an “education” course whose methods were rooted more in humiliation than in re-training. Even when he realised he had to slow down he applied the service brake, probably because to use the emergency brake would have invited further retribution. The train was just 80 seconds behind schedule. The driver was among those who lost their lives.

The trouble with any organisation in which to err is not human but literally intolerable is that mistakes are never acknowledged. And if mistakes aren’t acknowledged then it’s impossible to learn from them. The only lesson that ever really emerges is that you should cover your back at all times.

Few people would dispute that this mindset, along with its corrosive repercussions, remains uncomfortably familiar in the world of business. Nowadays the balance between safety and pressure, between support and punishment, is frequently tilted firmly in the latter’s favour – in many cases almost overwhelmingly so.

This ignores the basic truth that learning from mistakes is crucial to how we increase our knowledge. We strive for some kind of wisdom by conceding and correcting our cock-ups. In the words of Richard Feynman, winner of the 1965 Nobel Prize in Physics: “We are trying to prove ourselves wrong as quickly as possible, because only in that way can we find progress.”

The tragedy of Flight MH370 will show once again that failure, even at its most devastating, is seldom entirely in vain. To reject this fundamental truth – to forbid it – merely condemns us to make the same mistakes over and over again; and surely that, rather than failure itself, is what’s genuinely inexcusable.