Pathological complacency stems from an over-reliance on assumptions rooted in past experiences. As the examples of the Macondo deepwater oil well and Humber Refinery incidents in the preceding article showed, operating an oil and gas facility with a focus on "what's worked well enough in the past" can lead to devastating results.

In the case of the Macondo well blowout in the Gulf of Mexico, emergency response plans were inadequate. As the Humber Refinery chain of explosions showed, the facility owner's system for managing change failed to properly correct the pipe corrosion problem that led to the incident. Both instances underscored the need, throughout an organization, to identify potential problems in a timely manner and to correct them before it is too late.

According to Maxine Fawcett, an expert in health, safety and environment behavior change with Intertek Consulting & Training, being alert to warning signs at every level is a critical step toward remedying pathological complacency.

"Individuals need to be open to surprise in their daily tasks and actively look out for it," Fawcett said. "Equally, senior managers should respond to the organizational surprises with vigor."

At all levels of the organization, this means paying attention to even subtle changes and questioning why they happened at a particular location and in a particular timescale.

A 'Reliability Mentality'

Fawcett urges companies wishing to mitigate pathological complacency in their organizations to embrace a more proactive and engaging way of thinking from the shop floor to the board room. She calls this new mindset a "reliability mentality," but it is actually not so new. In fact, another industry that regularly encounters potentially hazardous situations has widely integrated it into its daily operations: aviation. Like the oil and gas industry, the aviation sector values checklists to ensure that nothing is forgotten. Fawcett contends, however, that the checklist's level of importance differs considerably between these two industries.

"Pilots are highly trained in working with the unusual, looking for failure and dealing with errors and emergencies and ought never to let down their guard while piloting an aircraft," Fawcett explained. "They shouldn't rely on the checklist."

In the upstream and downstream industries, Fawcett maintains, the checklist is becoming insurance against litigation rather than serving as a true risk management tool.

"Checklists are often put into place because there isn't enough trust in the workforce that they will 'do the right thing,'" she asserted. "There is a tendency to assume that when someone is given a working instruction or a checklist that it will be followed. Unfortunately, there isn't enough training given on their use or enough coaching to embed learning and new procedures in the workplace."

Comparing the different ways in which the aviation and oil and gas industries use checklists serves as but one example of how integrating a reliability mentality can change everyday tasks.

Below are the four basic steps for making this more proactive mindset part of your organization.

Step 1: Watch for Warning Signs

Fawcett said that adopting a reliability mentality begins by being alert to warning signs at the task, asset, behavioral and organizational levels.

"There needs to be much more risk consciousness at a senior level, as well as greater dynamic risk awareness at the worksite," she explained.

Using predictive models can help individuals and groups at all levels become more competent in anticipating where danger lies and doing something about it.

Step 2: Combat Apathy and Build Trust

Targeting the malaise of apathy that can settle on a stable, well-trained workforce is another key to achieving a reliability mentality.

"Apathy can arise because of a lack of challenge or from a sense that even if people raised safety concerns, nothing much would change -- because nothing much changed in the past," Fawcett commented.

Fawcett illustrated this scenario by recalling an episode at a liquefied natural gas terminal where she did consulting work. She explained that it is common in the industry to assign "rising stars" with top senior manager potential to placements on plants or installations around the world for valuable experience. These placements usually last approximately two years -- long enough for them to position themselves for the next short-term assignment but not long enough to implement and bed down significant changes.

Fawcett found that the front-line workers had largely stopped raising safety concerns because local middle managers would resist interference from the short-term senior manager and not implement the necessary changes. They knew it would not be long before that manager had gone and a new incumbent would arrive.

"Embedding longer-term, legacy thinking in senior managers and ensuring that the workforce can trust them to listen to concerns is essential" in such cases, Fawcett advised. She pointed out there needed to be much more effort put into developing middle managers in risk awareness, and broaden out their appreciation of the interconnectedness of people and operations.

Step 3: Seek Failures

A third way to incorporate a reliability mentality into an organization is to actively seek out failures before something actually goes wrong. The health, safety and environmental benefits of preventing failures from occurring in the first place are obvious. Fawcett admits, however, that maintaining support for this strategy on a corporate level can be difficult because there is no concrete evidence that it works.

"Reliability, then, is an act of faith," Fawcett continued. "We seek out what might fail in order that failure won't happen. But this is a hard sell at a board room level when cost-cutting, rather than investment in reliable systems of work, might be at the front of people's thinking."

Step 4: Out with 'Risk' and In with 'Danger'

Fawcett said the fourth key component of integrating a reliability mentality deals with an organization's lexicon.

"Lastly, and most importantly, we need to revert back to using the word 'danger' instead of 'risk,'" she said. She reasoned that "danger" conveys greater urgency than "risk," which implies a more remote possibility of a harmful event occurring.

"In our risk assessments, we distance ourselves from something dangerous happening with risk assessment matrices which categorize the possibility of loss," Fawcett explained. "In our minds, we lessen danger in this process by finding ways of mitigating it in our minds and in our language."

"The reality is that we rarely look for danger when we plan or execute a job," Fawcett continued. "If we were looking for danger, we'd see real threats. But when we make our hazardous workplaces a 'safe and routine' environment, we stop looking for danger, we don't see threats."

Bringing Pathological Complacency to the Fore

Fawcett acknowledges that the oil and gas and petrochemical industries have made real strides in terms of their overall HSE record over the past four decades. She adds, however, that there is room for improvement.

"I think the issue is that both upstream and downstream don't yet actively use these strategies [to adopt a reliability mentality]," Fawcett said. To illustrate her point, she noted that her work with one operator to elicit warning signs has yielded mixed results.

"They have no system, yet, to capture these, nor enough faith in predictive thinking to build one," she said. "However, they are introducing a reliability-focused maintenance system based on replacement before failure."

"The trouble with pathological complacency is that it is unconscious," Fawcett said. "You have to actively bring it into awareness for the organization to look at."

"And be warned: those who say they are not complacent are probably complacent," Fawcett concluded.

Matthew V. Veazey has written about the upstream and downstream O&G sectors for more than a decade. Email Matthew at mveazey@downstreamtoday.com

WHAT DO YOU THINK?

Generated by readers, the comments included herein do not reflect the views and opinions of Rigzone. All comments are subject to editorial review. Off-topic, inappropriate or insulting comments will be removed.


Michael Davis | Feb. 7, 2012

Pathological complacency! Yes.
Fear comes before wisdom. Parents that want to protect children teach them fear. Fear of the street. Fear of drugs. Fear comes before wisdom.
There are examples of teaching pathological complacency. How many well control courses teach the student to get "comfortable" using the choke? The answer is they all do. They sometimes teach people to be "less afraid."
A good well control course should show video of blowouts and yet teach the basics of the decisions and actions that need to be made at that crucial time. Perhaps take the student to a well unloading with some gas component and watch the gas expand at supersonic speeds. To be truly wise we need to have a visceral understanding of the absolute power that is buried beneath the earth. Then as we remove layers of dirt we will be wary. People that are wary usually seek more barriers to the pressure.
Someone that has seen a valve leak and unleash terrible pressure is more apt to close one valve behind another. Someone not afraid is more apt to leave a "dead" well open without supervision or perhaps not watch a trip tank close enough.
It's time we give more respect to the personnel and engineers that have true fear of the unknown. That fear can lead to the wise decisions that we need at all times because this is one of the most dangerous businesses in the world.
Fear is the beginning of the wisdom we need and the antidote against pathological complacency.

Tobias Ford | Feb. 7, 2012

I thought this was a great article. My first thought was to email it around the office so that everyone else could read it and hopefully benefit from the brief but important insights raised.
I agree that complacency is a huge issue, and I really just want to highlight and print out the part about lists being a barrier against litigation rather than harm. Granted, good systems and procedures will help you avoid incidents, but having a list in your pocket won't do too much in an actual crisis.
Great article!