Error

Human error is not a cause!

Error, mistake, faux pas, gaffe, blunder, lapse, slip, goof, oops, blooper; how many words do we have to express the idea that things don’t always happen as we expect or as we would prefer? At the 2009 CEO Conference of the Institute of Nuclear Power Operations (INPO), one CEO stated that the most important change in the commercial nuclear industry in the past decade was the recognition that people do not intentionally commit errors. INPO’s training reference guide that introduced the commercial nuclear power industry’s Human Performance Improvement (HPI) initiative stated that HPI represented “a new way of thinking.” So the question is, how might we think differently about this concept of error, which seems an inevitable aspect of the human condition?

The “fact” that some 80% of accidents are “caused” by human error appears throughout the safety literature. Formal accident investigations have attributed error as cause to justify blame and punishment, ostensibly to “prevent” recurrence of similar accidents. Yet after decades of labeling human error as cause, what do we really know scientifically about this fundamental human concept?

Much of the scientific work on accident causation can be traced to the aftermath of the Three Mile Island accident. Woods and Cook [1] explain the situation as: “At that time, the folk model of accident causation was firmly in place among researchers and error seemed a plausible target for work on safety. It was only after a long period of empirical research on human performance and accidents that it became apparent that answering the question of what is error was neither the first step nor a useful step, but only a dead end.”

As James Reason explains in his book Human Error, error means different things to different people, and depends on context. In Latin, error means “to wander.” In baseball, an error is the act, in the judgment of the official scorer, of a fielder misplaying a ball in a manner that allows a batter or base runner to reach one or more additional bases, when such an advance should have been prevented given ordinary effort by the fielder. In computer operation, an error is when an unexpected condition occurs.

The utility of error as causation is further complicated because error cannot be isolated as a particular psychological or behavioral phenomenon. Addressing efforts by cognitive psychologists to identify error types, Reason states that “Far from being rooted in irrational or maladaptive tendencies, these … error forms have their origin in fundamentally useful psychological processes.” He continues by quoting Ernst Mach (1905): “knowledge and error flow from the same mental sources, only success can tell one from the other.”

So it seems that what may be called error is distinguishable only retrospectively, in the presence of an undesirable outcome. Absent such an outcome, error is not observable. And if error is not observable without an outcome, is there any utility to this concept, which is so rooted in our cultural views of causality yet so lacking in scientific validity?

Returning to Woods and Cook: “Error is not a fixed category of scientific analysis. It is not an objective, stable state of the world. Instead, it arises from the interaction between the world and the people who create, run, and benefit (or suffer) from human systems for human purposes - a relationship between hazards in the world and our knowledge, our perceptions, and even our dread of the potential paths toward and forms of failure. … To use ‘error’ as a synonym for harm gives the appearance of progress where there is none.”

If the concept of error has no particular value in the analysis of failure, and indeed such use may be counterproductive, perhaps its value lies elsewhere. Viewing error as a fuzzy concept, rather than an absolute one, provides a basis for proceeding. William James’ philosophy of Pragmatism relates a concept’s meaning to its purpose. Operationalization is the process of defining a fuzzy concept so as to make it measurable in the form of variables consisting of specific observations. W. Edwards Deming explains that “An operational definition is a procedure agreed upon for translation of a concept into measurement of some kind.”

How might we understand error in a purposeful sense that promotes the human condition? Consider, as an example, physical pain. Pain may be understood as a negative consequence; something to be avoided or even feared. Alternatively, pain may be understood as one of the body’s key defense mechanisms, whose purpose is to alert us to a threat to the body’s safety or survival. Similarly, we may shift the meaning of error from error as harm to error as warning of harm. Thus error becomes a signal that prompts protective action.

Reason offers three related “working” definitions of error, each predicated on a retrospective judgment that a predetermined course of action did not achieve the desired outcome. He then suggests that error be understood in terms of intentions, actions, and consequences, and that error be extended from a purely individual phenomenon to include organizational phenomena. If we understand error as a signal operating among intentions, actions, and consequences, this formulation becomes equivalent to Deming’s description of the Shewhart Cycle of Plan, Do, Study, Act. In this way, errors become signals that enable individuals and organizations to monitor how what they are doing compares with the plan and its anticipated outcomes, and then to adjust both plan and actions based on the feedback that error provides.

Error is life providing feedback on our interactions with the environment. By shifting our paradigm of error from one of fear and blame to one of system feedback, we find that error is nature’s way of helping us proceed incrementally toward our goals while coping with an uncertain universe.


This entry was posted on Tuesday, March 2nd, 2010 at 8:22 am. You can follow any responses to this entry through the RSS 2.0 feed.

When I was a Shift Supervisor at the Susquehanna Steam Electric Station, a two unit boiling water reactor, we recognized that errors were going to be made, even in an industry that strived towards zero errors and zero defects. It was extremely important that we learned from our mistakes and were candid about our problems and weaknesses. Everyone was open to evaluation and everyone was encouraged to identify potential problems. Everyone’s professional opinion was respected and no one was discouraged from raising concerns. Our approach was simple – we can learn from anyone.

So Skip, what do you think we need to learn to create the type of culture you had at Susquehanna that encouraged this learning approach? As a shift supervisor, how did you communicate and reinforce that this openness was desirable?

I think the points laid out by Bill and Earl are key as we go forward. Language is very important, and the term “error” has many definitions depending on the community. I have heard Earl discuss this many times, some with me, and I understand the very fine point that he makes. In many ways, understanding and accepting that point helps one move forward in the HPI concept framework. Sometimes I wish we had a new word to describe the concept, as I do wonder whether there is such a thing as error if there is no consequence.

The key concept to me is that error is inadvertent and therefore should not be used for blame. A blame culture is a cancer that eats away at otherwise healthy organizations.

All of the errors I have seen in practice resulted from a set of factors that determined the exact nature, magnitude, location, and timing of the error.

These factors included what set up the person for the error, what triggered it, what made it as bad as it was, and what kept it from being worse.

Once the direct factors, and the underlying factors that produced them, are known, it is possible to consider corrective measures intelligently.

Error should be used as an entrée into peeling the onion to find the basic, fundamental, underlying vulnerabilities of the organization. There should be no sacred cows, not even the regulators or the victims.

Take care,

Bill Corcoran
Mission: Saving lives, pain, assets, and careers through thoughtful inquiry.
Motto: If you want safety, peace, or justice, then work for competency, integrity, and transparency.

Why a Blog?

This blog is designed to elicit feedback and discussion between the HPI team and the DOE Complex. After reading each posting, please take a moment to provide feedback, and discuss what topics you would like to see covered in future postings.

If you have any general questions or comments about this blog, please go to the "About" tab and fill out the Contact Form at the bottom. Thank you!