Tuesday, September 9, 2014

Human Performance – Busting the Myth of Human Error

This blog is a brief introduction to a topic that two of our team
members, Paul Gantt and Ron Gantt, are presenting at the National Safety
Council Congress and Expo in San Diego on September 15th (session 14).

At a client’s site the other day, staff were discussing an
incident in which a rail car loader filled a rail car past a visual cue that
should have told him when to stop loading. Luckily, no spill resulted, but the
overload was not caught until after the car had left the facility and therefore had to
be reported to the appropriate regulatory authority, per legal requirements. In
its investigation the site determined that the cause was “human error, plain and
simple.”

Pretty typical, right? Many safety “researchers,” such as
Heinrich, warned us that somewhere between 80% and 90% of
all accidents were caused by unsafe acts. Our systems would be perfect if it
weren’t for all these unreliable people! And besides, you can’t fix stupid. Being
a safety professional would be an easy job if we didn’t have to deal daily with
the reality of having to “protect you, from you, in spite of you.”

This idea, that the unreliable human is the
problem we must control, informs most of the interventions that safety
professionals use to this day. Think about our go-to interventions: training,
safety rules and regulations, procedures. All designed to change behavior. And
if we want to push deeper and go for real “world-class” safety status? Then we
have behavior-based safety, incentive structures, and strategies to go after hearts
and minds. Once again, all designed to change behavior. The belief that we need
to change people’s behavior because it is inherently deficient provides the
foundation for most of safety practice.

But is it really true that humans are a problem to control,
that they are so unreliable? Let’s look at the above example with the
overfilled rail car again. Surely it is easy to look at a visual cue and fill
a container up to that cue – cooks and bartenders do it all the time. So how
could the guy be so stupid? Let’s look closer by examining ourselves for a
second: do you ever take your mind off what you’re doing when you’re doing
something routine and mundane? Of course you do! How do we know? Because you’re
human!

So, imagine someone passively monitoring the filling of rail
cars day in and day out. Would we be surprised if their attention wandered?
Of course not. In fact, we would expect that person’s mind to wander a lot. And
really, it was just a matter of time before the mind wandering led to
overfilling a container. With that being the case, the interesting
question is no longer “why did the loader overfill the rail car?” The
interesting question is “why aren’t we having many more instances of loaders
overfilling rail cars?” The current work system requires the loader to be
paying attention at the right time to ensure that the car isn’t overfilled. Put
another way, the work system requires that the human operator ensure safety in
an environment whose design invites a known human limitation (i.e., lapses in
attention). Yet, somehow, the loaders almost never overfill rail cars. If we look
closer at the overfilling incident, the story is not really one of “human error,
plain and simple”; instead it is one of human reliability. The rarity of overfilled rail
cars is evidence of the remarkable ability of our workers to adapt to poorly
designed, unsafe work environments and pull success and safety from the jaws of
failure.

Safety scientist Sidney Dekker presents a view of the state
of the safety profession with regard to human error as two opposing models: the
old view and the new view. The beliefs of each view, as identified by Dekker,
can be seen in the table below.

We agree with Dekker that the safety
profession is at a crossroads. We can either continue to use the methods of safety
management that we’ve used before and are comfortable with, which focus on
controlling unreliable people, or we can begin to make real progress on
preventing “human error” by adopting the new view, which sees
people as normally reliable sources of success, particularly in environments
and contexts that are properly designed to maximize human performance. Or, as
we put it to our client regarding the rail car incident: you can either come
to terms with the fact that it will happen again, or you can actually fix the
problem.

1 comment:

Always a tricky subject for engineers (my experience over the last few years). Here's another exposition in the context of safety and justice http://www.safetydifferently.com/the-use-and-abuse-of-human-error/

(Longer magazine version at http://www.skybrary.aero/bookshelf/books/2568.pdf)