Don't Outsource the Human

This sounds like the midnight call that all safety managers hate to get: a call reporting that an injury occurred and that a whole litany of steps needs to be taken. Comfort the victim and family, shut down the area, and deal with the paperwork. Then come the investigation and remediation. Typically, the remediation focuses on the humans involved by issuing more procedures, training, and supervision. Indeed, examining an existing procedure after an injury can often lend clues as to how the system let the victim down. The development of a new procedure, more defined and refined, can help avoid the incident in the future.

Something went wrong… people got hurt. But this headline was reporting on something less substantial. People lost money; a company may go belly up.

Knight Capital Group, a "market maker" in the stock market, is one of the largest executors of stock buys and sells in the United States. Knight Capital had increasingly relied on computerized trading to execute its exchanges. On August 1st, a bug in the software started randomly buying and selling shares of about 150 companies, sending the stock market into disarray as others reacted to these trades.

Knight Capital announced it lost $440 million that day. Not measured are the losses of untold others who also lost money in the debacle. You can read the account in a New York Times article.

So what did the U.S. Securities and Exchange Commission do in response? It immediately instituted a new procedure: all companies involved in computerized trading must fully test their software before putting it in place in the market or before making any changes.

Sound like a new procedure that might be put in place at your site after an injury?

An Op-Ed in the New York Times caught my eye. It was written by a former software engineer named Ellen Ullman. Based on her experience, she tells us that modern computer software is often very complex, involving many different software modules developed by many different companies that don't share code; it is not possible to "fully test" any software of this magnitude. No computer code is without bugs. Even fixes, called "patches," may introduce new bugs.

Is it possible that a new safety procedure at your site may have "bugs" and cannot anticipate all that can go wrong, or, worse, cause an injury itself?

Ms. Ullman suggests that Wall Street look at what credit card companies have done. By law, if someone steals your card information and makes fraudulent purchases, the credit card company must cover the cost. This motivates these companies to be on the lookout for purchases that seem atypical. How do they do that? They have humans watching.

When an unusual pattern of purchases occurs, like when you travel to a foreign country and use your card, a human will freeze the card and give you a phone call (which can be quite inconvenient when you're out of the country with a frozen credit card!). Ms. Ullman suggests that Wall Street should have human observers to catch errant trade patterns too.
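The division of labor Ms. Ullman describes, code that flags and humans that decide, can be sketched in a few lines. This is a hypothetical illustration only; the rules, names, and thresholds below are invented and are not any card company's actual system.

```python
# Hypothetical sketch: the program only SURFACES atypical activity;
# a human reviewer makes the freeze-or-not decision.
from dataclasses import dataclass

@dataclass
class Purchase:
    amount: float
    country: str

def flag_for_human_review(history, new_purchase):
    """Return True if a purchase looks atypical versus the cardholder's history."""
    usual_countries = {p.country for p in history}
    typical_max = max((p.amount for p in history), default=0.0)
    # Invented rules: an unfamiliar country, or an amount far above anything past.
    return (new_purchase.country not in usual_countries
            or new_purchase.amount > 3 * typical_max)

history = [Purchase(40.0, "US"), Purchase(85.0, "US"), Purchase(12.5, "US")]
```

The point of the sketch is what it does *not* do: it never freezes the card itself. The flag goes to an attentive human being, which is exactly the role Ms. Ullman argues cannot be outsourced to the code.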

Her conclusion:

"we need code and attentive human beings."

Translation for safety professionals:

"we need safety procedures and attentive human beings."

Let's not fool ourselves like Knight Capital and think that our procedures can keep us safe. Yes, we need sound procedures, and we need to keep refining them. But procedures will not catch all safety hazards and potential at-risk behaviors. Furthermore, the environment, people, and other procedures are constantly changing all around us. A once-sound procedure may turn into a hazardous one.

Therefore, we need to build a safety culture that fosters "attentive human beings" who are on the alert for anything out of the ordinary that may result in an injury or process event. Here are some strategies:

1) Train your employees on how to identify hazards. Play monthly "Hazard ID" games with your teams.

2) Give employees permission to alert each other and supervision when something seems wrong, even when that something may shut down operations or be embarrassing.

3) Create a user-friendly reporting system to capture near misses, minor injuries, hazards, and procedural improvement suggestions. Publicly use the information you get.

4) Help employees institute a peer-to-peer observation and feedback system like People-Based Safety. Remove barriers to participation by making it no-name and no-blame. Trend the data to look for anything out of the ordinary.

5) Reinforce attentiveness. When an injury is diverted or a process disaster is averted, make a big deal of it. This will encourage more of the same from more people in the future.

Certainly… write the code… but don't outsource the human.

Timothy Ludwig's website is Safety-Doc.com, where you can read more safety culture stories and contribute your own. Dr. Ludwig is a senior consultant with Safety Performance Solutions (SPS: safetyperformance.com), serves as a commissioner for Behavioral Safety Accreditation at the non-profit Cambridge Center for Behavioral Studies (CCBS: behavior.org), and teaches behavioral psychology at Appalachian State University in Boone, NC. If you want Tim to share his stories at your next safety event, you can contact him at TimLudwig@Safety-Doc.com.