CRM, ADM, BLA, BLA, BLA: What do they really mean? Let's get specific, look at ourselves, and discover if we are accident-prone.

Safety Introspection

We all work in inherently dangerous environments. Will you take a five-minute journey into self-discovery? If 65-80% of all aviation accidents are related to human error, let's attack the statistic: we can learn from others.

Scenario: It is 1997. You're a first officer for Korean Air aboard a Boeing 747 en route to Guam. Your captain has flown here several times; you want to trust his experience and his integrity, but you've done all the pre-flight planning required for the successful completion of this mission, and things just don't add up. The weather is bad, and the captain has no alternate airport planned in case the aircraft is still in the clouds when it reaches the decision altitude on final approach. Even if he did, you know there isn't enough fuel onboard to fly a missed approach and divert to a nearby airport. Turbulence is heavy, lightning strikes, and your heart is racing. You want to voice your concerns, but the cultural-legacy angel sitting on your shoulder reminds you of the hierarchy you must obey. You try to ignore the little voice, but only manage to squeak out one little sentence: "Boy, it sure is raining a little more than usual here." Your captain glares at you; you wince. Maybe he will smack you in the face again, like he did last month. The altitude callouts catch your attention: "500 feet to decision altitude, 400 feet, 200 feet, 100 feet . . ." You look outside; you're nervous. What if we don't pop out of the clouds? You have reached the altitude, and you must make a decision. Do you see the runway?

No. The captain takes the controls and begins a late go-around, climbing into the abyss. You know what is about to happen, but it's too late.

Culture

We need to be specific when discussing culture. Crew Resource Management (CRM, originally "Cockpit Resource Management") and Aeronautical Decision Making (ADM) were developed to protect us from the mistakes we inherit from aviation culture and from our very human forgetfulness.

A low power-distance culture versus a high power-distance culture can mean life or death when working in aviation. Sadly, in the case of Korean Air Flight 801, it meant death. For centuries, many East Asian cultures have maintained a high power-distance structure in the form of a hierarchical superior/subordinate pyramid in business and in families. Disciplinary action would be taken if one disrespected a superior. In some traditions, an individual who broke the code of honor (which we will discuss later) was expected to take his or her own life with a sword. In the case of Korean Air Flight 801, a subordinate crewmember knew there would be an accident, yet could not overcome the cultural discipline that could have changed the course of this tragic event.

Analyze

Take a look at the way you were raised. You may not be an extreme case like the one in the Korean Airlines scenario, but there may be a fraction of that mentality built into your personality through your upbringing.

In aviation, low power-distance behavior is preferred. If you're a pilot or crewmember, you will respectfully challenge authority if something seems suspicious or dangerous. For example, if you're a pilot and ATC instructs you to go around, you will follow the ADM models, decide that complying would place you in a critical fuel situation, and speak up. If you are a crewmember on an EMS helicopter and notice that the pilot is proceeding into marginal weather, you immediately voice your concern rather than waiting until it is too late. A large percentage of aviation accidents have been attributed to one failure: a crewmember did not notify the pilot of a known hazardous situation. This is human error.

Incongruence: the gap between our inaccurate self-perceptions and reality.
Incongruence is magnified through:

Self-fulfilling prophecies
Filtering messages
Media images

Most people only skim the hazardous attitudes chart. This is the very definition of invulnerability and rationalization.

Culture of Honor

Honor in aviation is not a personal reflection. It is the ability to admit fault, fix it, and share it with others. Even if it costs you your job!

When one family has a prolonged discrepancy with another, it is called a feud. When it happens between families in every village across 12 states in the Appalachian region of the U.S., it is called an epidemic. Harlan, Kentucky, had roughly 15,000 settlers between the late 1800s and early 1900s. There were over 1,000 documented homicides. Why?

An estimated 90% of Appalachia's earliest European settlers originated from the Anglo-Scottish border country (Newhall, 2006, p. 253). They were herders rather than farmers, because the mountainous terrain resembled that of their homeland. You see, herders must be protective; they must defend their name at all cost lest their livelihood be robbed in the night. In cooperative cultures, such as the farming communities of the north, it takes an entire community to raise a crop. Naturally, farmers do not need to defend their honor as rigorously as herders do.

The characteristics of a culture of honor are still present today. In the ‘90s, two psychologists at the University of Michigan decided to conduct an experiment on the culture of honor. They gathered a group of young men from the north, and a group from the south, and insulted them using the word “asshole.” For the northerners, there was almost no effect. The southerners, however, were downright offended. Call a southerner an asshole, and he’s itching for a fight (Gladwell, 2008).
Knowing how you were raised may make you more likely to participate in standard CRM procedures. You’re not being insulted when a teammate asks you to consider another opinion – it may save your life.

If you are not offended by a crewmember pointing out something you should be doing, you are more likely to do it. If you know this about yourself, you may be more apt to do the same for your fellow crewmembers. If you have a gut feeling that the weather is not right, maintenance needs a double check, or a procedure needs to be developed, then you must act without delay. Criticism of ideas is productive, criticism of people is not.

To Help or Not to Help:
How the presence of others can reduce helping.

Late at night on March 13, 1964, 28-year-old Kitty Genovese was murdered within a few yards of her apartment building in New York City after a violent struggle with her attacker.

When the police interviewed Kitty's neighbors about the crime, they discovered that 38 of the neighbors indicated they had seen or heard the attack, yet not one of them had intervened, and only one person had called the police.

Two social psychologists, Bibb Latané and John Darley, were interested in the factors that influenced people to help (or to not help) in such situations. They found that people are less likely to notice, interpret, and respond to the needs of others when they are with others.
Even if we have recognized an emergency situation, this does not necessarily mean that we will come to the rescue of the other person. We still need to decide that it is our responsibility to do something.

The problem is that when we see others around, it is easy to assume that they are going to do something; therefore, we assume that we don’t need to do anything ourselves. Diffusion of responsibility occurs when we assume that others will take action, and therefore do not take action ourselves.

The irony, of course, is that people are more likely to help when they are the only ones in the situation. There seems to be a "group inhibition of bystander intervention in emergencies" (Latané & Darley, 1968, pp. 215–221). The goal is to override that nagging voice that may prevent you from advising your crew of a potentially hazardous situation.

Altruism:
refers to any behavior that is designed to increase another person’s welfare, and particularly those actions that do not seem to provide a direct reward to the person who performs them (Dovidio, Piliavin, Schroeder, & Penner, 2006).

Identify Hidden Risk Factors

Most operators mandate risk assessments before a flight can be dispatched, yet these assessments are often vague and do not account for certain psychological factors.

ADM and CRM are generalized methods of how to operate and conduct oneself, but offer little on recognizing hidden threats to flights.

The handbook defines psychological stress as a "social or emotional factor, such as a death in the family, a divorce, a sick child, or a demotion at work." What if a crewmember in an EMS helicopter mentioned feeling uncomfortable due to deteriorating visibility, yet the pilot continued the flight, assuming a 100% chance of safely returning to base? Is that pilot a hero because he was able to make it back? How do you think that crewmember felt toward the pilot when flying subsequent missions? There is now an added risk factor because trust has been broken.
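To make the idea concrete, here is a minimal sketch of a preflight risk-assessment score that includes psychological factors alongside the usual operational ones. The factor names, weights, and thresholds are illustrative assumptions for this article, not any operator's actual risk matrix.

```python
# Hypothetical preflight risk-assessment sketch.
# All factor names, weights, and thresholds are illustrative assumptions.

RISK_WEIGHTS = {
    "marginal_weather": 3,
    "night_operation": 2,
    "unfamiliar_crew_pairing": 2,
    "crew_trust_strained": 3,     # psychological: e.g., trust broken on a prior flight
    "crew_personal_stressor": 2,  # e.g., death in the family, divorce, sick child, demotion
}

def risk_score(factors):
    """Sum the weights of all risk factors present on this flight."""
    return sum(RISK_WEIGHTS[f] for f in factors)

def dispatch_decision(factors, caution_at=4, no_go_at=7):
    """Map a total score to a go / caution / no-go recommendation."""
    score = risk_score(factors)
    if score >= no_go_at:
        return score, "no-go: mitigate risk factors before dispatch"
    if score >= caution_at:
        return score, "caution: brief crew and obtain supervisor sign-off"
    return score, "go"

score, decision = dispatch_decision(["marginal_weather", "crew_trust_strained"])
print(score, decision)  # 6 caution: brief crew and obtain supervisor sign-off
```

The point of the sketch is simply that a strained crew relationship can be scored like any other hazard; an assessment that ignores it will understate the true risk of the flight.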

Crew combinations are usually different. CRM and ADM will help mitigate hazardous probabilities and provide a common communication platform.

Take a look again at the hazardous attitudes below. Consider how you have been raised and the experiences you have gained - good and bad - and compare each attitude to flights you have had in the past. Any close calls? Can you tag one or more of these attitudes to any one specific flight? Is this a recurring attitude? How do your crewmembers view you when you display these traits? If the crew relationship is fragile due to your repetitive hazardous attitude(s), is that relationship repairable? Can you place a risk value on that flight? What will they say to the other crewmembers about your practices? Flight psychology is a complex subject, and self-discovery is the start of your route to applying ADM and CRM procedures and, ultimately, to saving lives.