Well worth reading is the British Medical Journal (BMJ), March 2000 paper, ‘Human error: models and management’. This paper gives an excellent explanation of the model, along with the graphic I’ve used here.

Who uses it? The Swiss Cheese Model has been used extensively in Health Care, Risk Management, Aviation, and Engineering. It is very useful as a method of explaining the concept of cumulative effects.

The idea of successive layers of defence being broken down helps us understand that things are linked within the system, and that intervention at any stage (particularly early on) could stop a disaster unfolding. In industries such as petrochemicals and engineering it provides a very helpful visual tool for risk management. The graphic from Energy Global, who deal in Oilfield Technology, helpfully puts the model into a real context.

Other users of the model have gone as far as naming each of the Slices of Cheese / Layers of Defence, for example:

Organisational Policies & Procedures

Senior Management Roles/Behaviours

Professional Standards

Team Roles/Behaviours

Individual Skills/Behaviours

Technical & Equipment
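The stacked-defences idea can be sketched as a toy probability exercise: a hazard only becomes an accident when the holes in every slice line up. This is purely an illustration, not anything from Reason's paper; the layer names come from the list above, but the hole probabilities are invented, and real layers are rarely this independent (a point the model's critics make later in this post).

```python
import random

# Hypothetical probability that each layer of defence has a "hole"
# for a given hazard. The names are from the post; the numbers are
# invented for illustration only.
layers = {
    "Organisational Policies & Procedures": 0.10,
    "Senior Management Roles/Behaviours": 0.15,
    "Professional Standards": 0.05,
    "Team Roles/Behaviours": 0.20,
    "Individual Skills/Behaviours": 0.25,
    "Technical & Equipment": 0.10,
}

def hazard_gets_through(layers, rng=random):
    """A hazard becomes an accident only if it passes the hole in EVERY layer."""
    return all(rng.random() < p for p in layers.values())

# Simulate many hazards and count how often the holes line up.
trials = 100_000
accidents = sum(hazard_gets_through(layers) for _ in range(trials))
print(f"Accidents in {trials} simulated hazards: {accidents}")

# Analytically (assuming independent layers), the chance of an accident
# is the product of the individual hole probabilities.
p_all = 1.0
for p in layers.values():
    p_all *= p
print(f"Probability of all holes aligning: {p_all:.7f}")
```

Even with individually leaky layers (holes 5–25% of the time), the combined chance of a hazard passing all six is tiny, which is the cumulative-effects point the model makes, and why plugging a hole at any single layer can stop the whole trajectory.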

What does this mean for Learning from Failure? In the BMJ paper Reason talks about the System Approach and the Person Approach:

Person Approach – failure is a result of the ‘aberrant mental processes of the people at the sharp end’, such as forgetfulness, tiredness, poor motivation etc. There must be someone ‘responsible’, or someone to ‘blame’, for the failure. Countermeasures are targeted at reducing this unwanted human behaviour.

System Approach – failure is an inevitable result of human systems – we are all fallible. Countermeasures are based on the idea that “we cannot change the human condition, but we can change the conditions under which humans work”. So, failure is seen as a system issue, not a person issue.

This thinking helpfully allows you to shift the focus away from the ‘Person’ to the ‘System’. In these circumstances, failure can become ‘blameless’ and (in theory) people are more likely to talk about it, and consequently learn from it. The paper goes on to reference research in the aviation maintenance industry (well-known for its focus on safety and risk management) where 90% of quality lapses were judged as ‘blameless’ (system errors) and opportunities to learn (from failure).

It’s worth looking at the paper’s summary of research into failure in high reliability organisations (below) and reflecting: do these organisations have a Person Approach or a System Approach to failure? Would failure be seen as ‘blameless’ or ‘blameworthy’?

High Reliability Organisations. Source: BMJ, 2000 Mar 18;320(7237):768–770.

It’s not all good news. The Swiss Cheese Model does have a few criticisms. I have written about it previously in ‘Failure Models, how to get from a backwards look to real-time learning’. It is worth looking at the comments on that post for a helpful analysis from Matt Wyatt. Some people feel the model represents a neatly engineered world and is great for looking backwards at ‘what caused the failure’, but is of limited use for predicting failure. The suggestion is that organisations need to maintain a ‘consistent mindset of intelligent wariness’. That sounds interesting…

There will be more on this at #LFFdigital, and I will follow it up in another post.

So, What’s the PONT?

Failure is inevitable in Complex Human Systems (it is part of the human condition).

We cannot change the human condition, but we can change the conditions under which humans work.

Moving from a Person Approach to a System Approach to failure helps move from ‘blameworthy’ to ‘blameless’ failure, and learning opportunities.

10 Responses

I can’t help thinking that when Reason says “complex human systems” he really means ‘complicated human designs’. After all, the whole premise of a complex system is that it’s non-linear. Things don’t have to line up, and the relationship between cause and effect can be oblique. It would be like one layer of cheese being a baked Camembert (it just slows problems down), another an American slice (bouncing problems off in all directions), and a third a thin wedge of unbreakable Parmesan. What’s more, the line of failure could be the equivalent of a hot wire that simply slices through, holes or not. That’s enough of the metaphor.

Cognitive science has moved on in the past 25 years, and what were described as failures in human cognition are now more clearly recognised as contextual strengths, not failures. If you work in a widget factory full of machines designed for specific purposes, then we expect them to do exactly what they are supposed to do. In this context, it’s a big machine with a few annoying biological bits mucking up the teleological perfection. Health is not that.

Health is an ecosystem, a biology with the odd stupid inert mechanical bit doing the boring stuff. In this sense, we don’t want high reliability – quantitative efficiency – out of the qualitative context. For example, consistently giving every third person an infection is highly reliable. In health we’ve suffered from the bell curve effect. NICE set up most of their advice for the middle line of an efficient normal distribution of idealised patients. What that means is that the perfectly designed best practice works perfectly for a tiny proportion of the world. The job is actually more about tailoring every decision to fit the individual. Sounds mad, doesn’t it, but that’s why it takes 14 years to become a Doctor. Unlike factories and boats, in complex systems there are different outcomes, in different directions, for people with different wants and needs. In the end everybody dies, so in Reason’s terms the whole health system is one massive failure.

Health doesn’t need to be highly reliable, like a machine (albeit some parts, like labs, radiology and theatres, are more like the ships and power stations of the research). The majority of health needs to be resilient. Going wrong is all part of being alive; the trick is, as you say, to be sensitive to the present, spot inevitable variations early and make a choice each time. It’s why zero harm campaigns don’t work. We’re in the business of harm: we exchange one harm (appendicitis) for a lesser harm (appendectomy). So harm can’t be a failure.

Just stirring up your head ready for the conversation.
Thanks for the mention; “axe wielding” made me laugh out loud.

I haven’t got past the Camembert, American cheese slice and Parmesan metaphor for the minute.
The Matt Wyatt Cheese of The World / Exotic Cheese Board Model of Failure could be the 21st Century version.
You should work on the graphic.
It would be brilliant.
Welcome back from holidays, I’ve missed you.
