Wednesday, April 1, 2015

“It Can’t Happen Here” – Distancing through Differencing

Recently, while doing research for a project, we happened
upon a news article published after a train derailment in Spain, in which a
train entered a curve too fast and left the tracks. In the article, a US rail
expert explains why such a derailment is unlikely to happen in the US. For
example, he notes that, particularly in the Northeast, the tracks have many
curves and tunnels, which makes it harder for a train engineer to become
“complacent or bored.” Additionally, US trains have an auto-stop feature,
which should stop the train if an engineer misses a signal. So, the
implication goes, train riders in the US have nothing to fear.

Less than five months later, a train operating in the Northeast derailed,
killing four and injuring 67. The cause? The engineer fell asleep and didn’t
slow down in time for a curve.

What gives? We thought trains in the US were supposed to be
safe.

The problem is not really with the expert’s judgment.
Everything he said is essentially true (to a degree). The problem is with a
common reaction many people have to accidents. Cook and Woods call this
“distancing through differencing”: the tendency, following a major accident,
to look for reasons why the same accident could not happen here. People do
this by highlighting how details specific to the accident are not present in
their own environment. In other words, they distance themselves from the
accident by emphasizing how different they are.

Others have called this playing the “it can’t happen here”
bingo card. John Downer described this phenomenon in relation to the Fukushima
nuclear disaster: the tendency of those conducting risk assessments for the
nuclear industry to “disown” the fact that such a disaster calls their methods
into question.

This tendency to separate ourselves from the accidents and
disasters we hear about is not really surprising. If we admit that the bad
accident that happened over there could happen here, we face several
unsettling implications. Our world may be more uncertain and scarier than we
hope it is. We might have to admit that our own implicit and explicit
assessments of our safety are not accurate. We might have to change the way we
think and act as a result (oh the horror!).

But this need to believe we’re safe and secure, that all the
problems are over there, with those other people, may be making us less safe.
We fall into a false sense of security because we start to believe that the
differences are what make us safe. Take the rail expert above. He listed
reasons why, in his opinion, the US rail industry was safer than Spain’s. In
effect, he posited a theory of rail safety: that curves, tunnels, and
automated stopping mechanisms were what was missing in the Spanish accident,
and therefore must be what’s keeping the US safe. So instead of looking at the
accident in Spain and asking how we are similar (e.g., we both have engineers
who are human, and humans get bored doing rote tasks), instead of asking our
own train engineers whether something like that could happen here and then
making adjustments to reduce the risk, we decided it couldn’t happen. And then
we did nothing.

But we were wrong. It not only could happen, it did happen.

This is a consistent problem we see in the safety profession:
believing that something is keeping us safe with very little evidence to back
up that assumption. Think of all the things you do as a safety professional.
Where is the evidence that what you’re doing will actually achieve the
intended results? What if all the things you think are making you and your
workers safe aren’t having any effect at all? What if, as Peter Senge
suggests, bad things are happening not in spite of what we’re doing, but
because of what we’re doing?

If we play our “it can’t happen here” bingo card when
accidents happen in other places, we limit learning. It is time for the safety
profession to wage war on anything that stands between us, the organizations
we serve, and learning. The idea that we have it all figured out, that we know
how our organizations are operating and what they need to do to be safe, needs
to go the way of the dodo. We need to take advantage of every opportunity to
learn about how systems and people operate. Here are some recommendations to
get you started on the learning process:

The next time a major accident happens, have a discussion
with your employees about what happened. Have them identify the similarities
to your own operation and what it would take for a similar accident to happen
at your organization.

Look at your organization and identify any policies,
procedures, or processes that may discourage learning: for example, any policy
that ties incentives to accident numbers, or any requirement that assigns
blame to workers after an accident. Get rid of them. If that feels too risky,
try it on an experimental basis: drop them for a few months and see what
happens.

Most days in your organization go by without an accident.
Why? We all have theories about what’s keeping us safe, but we don’t know for
sure until we get out and see how work actually happens. Go observe normal
work. Look at all the things that could go wrong and right. What causes those
things to happen or not happen? What is really keeping you and your workers
safe?