“No situation is too complex to be reduced to this simple, pernicious notion. ‘Human error’ has become a shapeshifting persona that can morph into an explanation of almost any unwanted event” Image: Steven Shorrock https://flic.kr/p/woKmc5 CC BY-NC-SA 2.0

Again, a familiar smoke pattern has emerged from the ashes of a high-profile accident. The National Transportation Safety Board held a hearing in Washington D.C. on 28 July 2015 on the Virgin Galactic crash over California on October 31, 2014. VSS Enterprise, an experimental spaceflight test vehicle, crashed in the Mojave Desert following a catastrophic in-flight breakup during a test flight. Michael Alsbury, the co-pilot, was killed and Peter Siebold, the pilot, was seriously injured.

The NTSB’s press release was headlined “Lack of Consideration for Human Factors Led to In-Flight Breakup of SpaceShipTwo” and opened with: “The National Transportation Safety Board determined the cause of the Oct. 31, 2014 in-flight breakup of SpaceShipTwo was Scaled Composites’ failure to consider and protect against human error and the co-pilot’s premature unlocking of the spaceship’s feather system as a result of time pressure and vibration and loads that he had not recently experienced.” (28 July 2015).

In the NTSB’s Human Performance presentation, various “Stressors Contributing to Copilot’s Error” were cited, including memorisation of tasks, time pressure (to complete tasks within 26 seconds and abort at Mach 1.8 if the feather was not unlocked), and a lack of recent experience with SS2 vibration and loads. And concerning “Lack of Consideration for Human Error”: the system was not designed with safeguards to prevent unlocking the feather early; manuals and procedures did not warn about unlocking the feather early; the simulator training did not fully replicate the operational environment; and the hazard analysis did not consider ‘pilot-induced hazards’.

Board member Robert Sumwalt stated that “A single-point human failure has to be anticipated,” and that Scaled Composites, an aerospace company, “put all their eggs in the basket of the pilots doing it correctly.” The NTSB also criticised the FAA, which approved the experimental test flights, for failing to pay enough attention to human factors or to provide necessary guidance to the commercial space flight industry on the topic. NTSB chairman Christopher Hart said, “Many of the safety issues that we will hear about today arose not from the novelty of a space launch test flight, but from human factors that were already known elsewhere in transportation.” Pressure – a consistent feature in accidents – was also cited: some FAA managers pushed to approve experimental flight permits quickly, without a thorough understanding of technical issues or the details of the spacecraft – an efficiency-thoroughness trade-off (Hollnagel, 2009).

As is so often the case, it’s a complex and messy picture. But most people are fairly unlikely to view PowerPoint presentations by the NTSB, listen to public hearings or read official press releases. Instead, they turn to the news media, who almost never go into the same sort of detail, and instead point to ‘human error’. No situation is too complex to be reduced to this simple, pernicious notion. “‘Human error’ has become a shapeshifting persona that can morph into an explanation of almost any unwanted event” (Shorrock, 2013).

Journalists tend to write in a particular way in order to catch and keep reader attention. Unlike other stories (such as novels), news articles usually use an inverted pyramid format, where the most important moment is right at the start in the headline and ‘lede’ – the first and most important paragraph. The headline attracts readers’ attention and the lede gives them the main points. It must do this in as few words as possible, typically a sentence or two of fewer than 40 words. In today’s 140-character economy, the headline and lede are probably more important than ever, because it is very easy not only to turn the page, but to click to a different news outlet altogether – losing the reader.

From a psychological point of view, this approach makes a lot of sense. Very early psychological research on human memory found that the first few items in a list of words are recalled more frequently than the middle items. This phenomenon is known as the primacy effect. It has been replicated in several fields (including clicking URLs in a list – Murphy, Hofacker and Mizerski, 2006). In journalism this is more important than the recency effect – the tendency to best recall items at the end of a list – since people less often get to the end of a news story.

Additionally, initial reports of news stories are more likely to be read and remembered. These ‘first stories’ (Cook, Woods and Miller, 1998) are also likely to focus on surface features of events, such as ‘human error’. In advertising and public communications, the law of primacy in persuasion holds that the side of an issue presented first will have greater effectiveness than the side presented subsequently. It makes sense, then, to look at some early reports of the Virgin Galactic spacecraft’s deadly crash, focusing on the headline and lede. So consider the following, which cite ‘pilot error’ as the ‘cause’ of the crash (or say that it ‘led to’, was ‘due to’ or was ‘traced to’ pilot error). Some of the ledes do not mention any other factors.

BBC

USA Today

NTSB finds pilot error in Virgin Galactic spaceship crash – The Virgin Galactic spaceship crash last year occurred after a co-pilot prematurely unlocked a flap assembly that’s only supposed to be deployed at almost twice the speed, well past the speed of sound, the National Transportation Safety Board reported Tuesday.

New York Times

Virgin Galactic SpaceShipTwo Crash Traced to Co-Pilot Error – A single mistake by the co-pilot led to the fatal disintegration of a Virgin Galactic space plane during a test flight in October, the National Transportation Safety Board concluded Tuesday, and the board strongly criticized the company that designed and manufactured the plane for not building safeguards into the controls and procedures.

Other reports used the word ‘blamed’, which may have been used innocently, but conveys a more emotive message.

Sky News

Braking Error Blamed In Virgin Galactic Crash – Investigators find the co-pilot prematurely unlocked a system which repositions the spacecraft’s wings to slow its re-entry. An official investigation into a Virgin Galactic spacecraft’s deadly crash over California last October has found a co-pilot unlocked a braking system too early.

CNN

Virgin Galactic crash blamed on human error – Human error was responsible for the catastrophic 2014 crash of an experimental Virgin Galactic rocket ship that left the co-pilot dead. Specifically, the “probable cause” of the crash was the “pilot’s premature unlocking of the SpaceShipTwo feather system,” the National Transportation Safety Board said Tuesday. The feather locks are essentially a braking system designed to allow the rocket to safely descend from space.

Some media channels chose a more mysterious click-bait headline, revealing the ‘human error’ cause in the lede.

CBS

Cause of Virgin Galactic spaceplane crash identified – The fatal in-flight breakup of Virgin Galactic’s futuristic SpaceShipTwo rocket plane during a test flight last October was the result of pilot error, possibly triggered by a high workload, unfamiliar vibration and rapid acceleration, the National Transportation Safety Board concluded Tuesday.

Other headlines did not mention ‘human error’, but this dominated the lede.

The Times

Safety flaws that doomed Branson’s spaceship – Pilot error and a failure to protect against it by the company making Virgin Galactic’s SpaceShipTwo rocket ship led to its catastrophic break-up over the Mojave Desert, investigators concluded last night.

In human factors, we tend not to think so simplistically, and concern ourselves instead with the second story. And it is fair to say that the study of ‘human error’ (at least according to some definitions) has helped the development of our understanding of human performance. But with ever increasing system complexity and obscure causation, might our focus on ‘human error’ as an anchoring object in our analyses and narratives have unintended consequences, perhaps serving to reinforce a populist view of ‘human error’?

This is one way in which a focus on ‘human error’ might be seen as the ‘handicap of human factors’. The argument here is not an ontological one concerning the reality of ‘human error’ as a separate and clearly definable phenomenon. That is a different issue, and there is much written about that. For the moment, there is a need to reflect on how our language and way of thinking is conveyed, translated and perceived by the press, by others in industry and by society. Our subtle interpretations and arguments are of little interest to the media, who tend to strip the concept down to a person-centred view (e.g. carelessness or a lapse of some kind), devoid of systemic narrative – at least in the all-important headline and lede of initial reports.

So how can we encourage a different narrative? Here are three things that those of us with an interest in human factors and ergonomics (HF/E) can do:

Consider your mindset: Reflect on how you react internally to failure, and how you subsequently make sense of such situations. You are human after all! It is natural to have some very human reactions. But how do these translate into judgements? Explore systems thinking. HF/E is a systems discipline, but writings by Deming, Ackoff, Meadows and other systems thinkers are rarely part of HF/E syllabi. Consider alternative or complementary perspectives such as Safety-II (Hollnagel, 2014), but remain aware of your worldview – mind your mindset.

Mind your language: Reflect on how you talk or write about failure events. Enrich your vocabulary when writing reports or using social media. You don’t have to avoid using the term ‘error’ or its variants, but consider others which may describe the situation: performance variability, ordinary variability, adjustment, adaptation, trade-off, compromise – or just ‘decision’ or ‘action’. If you use the term ‘human error’ (or ‘pilot error’), use scare quotes/inverted commas. This gives a signal that there is more to it. Avoid phrases such as ‘the pilot failed to’ or ‘the surgeon neglected to’, since these words have heavy emotive connotations. Bear in mind that some words translate quite differently into other languages, sometimes with more personal connotations.

Take action: Contact journalists and reporters via social media, or write to the letters sections, to challenge simplistic narratives. Look for influential journalists who report on matters concerning safety, psychology and human factors. Cite simplistic reporting or language to highlight the problem within your industry, and on social media. If all else fails, consider installing Jon Cowie’s Chrome plug-in, which replaces instances of ‘human error’ on webpages with ‘a complex interaction of events and factors’. Mostly, that is what we are talking about…

We can all play a part in encouraging better reporting of major accidents and other adverse events. This can improve awareness among many stakeholders (e.g. public, policy makers, regulators) by exposing real system weaknesses, and help to improve safety by refocusing attention on the system. It is only fair to those caught up in system accidents, whose natural and to-be-expected (and required) performance variability is so often inadequately considered in system design and operation. And often, they are the victims, as was the case with Virgin Galactic.


About stevenshorrock

I am a systems ergonomist/human factors specialist and work psychologist with a background in practice and research in safety-critical industries. My main interest is human and system behaviour in the context of safety-related organisations. I seek to enable improvement via a combination of systems thinking, design thinking and humanistic thinking. I am a Chartered Ergonomist and Human Factors Specialist with the CIEHF and a Chartered Psychologist with the British Psychological Society. I currently work as a human factors and safety specialist in air traffic control in Europe. I am also Adjunct Associate Professor at University of the Sunshine Coast, Centre for Human Factors & Sociotechnical Systems. I blog in a personal capacity. Views expressed here are mine and not those of any affiliated organisation, unless stated otherwise. You can find me on twitter at @stevenshorrock