Making Sense and Taking Risks: Human Behavior in the Shipping Industry

By Matti Bargfried

This guide helps identify countermeasures against human errors and bad decisions. Discover how to manage the human element at all levels – from the engine room, to the bridge, to the shore.

In our summary of “The Human Element – A Guide to Human Behavior in the Shipping Industry,” we examine the chapters “Making Sense of Things” and “Risk Taking,” breaking down the most relevant information. Dirk Gregory and Paul Shanahan of the UK Maritime and Coastguard Agency developed the original guide.

Making Sense of Things
People are surrounded by vast amounts of information and need to make sense of it all. We need that information to support our goals and plans, which cannot survive a changing world unless they adapt to the current situation. Before we can use information to modify our plans, however, we must choose which information to process and what meaning to give it. This process is shaped by a number of factors, most importantly culture, experience, social needs and character. Take an office ashore: the manager’s door might be open to welcome everyone, or it might be open so he can spy on his underlings. How do you judge? The situation is the same, but your judgment shifts with your perception of the person’s character and attributes, and with your experience in other companies.

To share the sense that we make of information, we need empathy and communication skills; otherwise, we cannot convey reason and meaning. Yet even when we communicate successfully, our own minds may have betrayed us into choosing the wrong thing to convey. Our minds are picky; they like to find evidence that supports our current assumptions and decisions. Sense-making plays a vital role in almost all shipping accidents.

According to Lloyd’s Register, an average of 182 large vessels are lost per year; between 1995 and 2007, these losses amounted to 160 million gt.

The guide continues with a case study in sense-making: a U.S. Coast Guard training cutter rammed in 1978 by a vessel four times her size. (See the story below, “USCGC Cuyahoga: The Last Voyage.”)

How did it happen? The captain of the Coast Guard vessel noticed an approaching ship. Both vessels were running at full speed and closing quickly. The captain saw only two signal lights on the vessel and therefore assumed it was heading in the same direction as he was. His crewmembers saw three signal lights and knew the ship was coming towards them, but did not attempt to communicate this, as it seemed obvious. Meanwhile, the captain rationalized the fast closing speed shown on the radar as the simultaneous overtaking of a fishing boat. Eleven men died. The captain’s sense-making was wrong, and so was the crew’s failure to speak up.

The Problem with Making Sense
As the example shows, the patterns and situations that create a problem exist primarily in people’s heads, and hence are unique. Regulators today tend to close the exposed gaps with stricter regulation and new technical procedures, so the rule books grow ever bigger, creating uncertainty and greater complexity. At the same time, people are lulled into a false sense of safety by these technical crutches. The guide quotes: “Automation creates new human weaknesses…and amplifies existing ones,” reminding us that humans need to keep pace.

Risk Taking
In the last part, we learned that people have to make sense of things (information) in order to make decisions or plans. This sense-making is heavily influenced by a number of factors, such as culture, past experience, the ability to communicate, empathy and one’s character.

Even when we make a decision, we can never be certain that we have made the right one, or that we have interpreted all relevant data in a way that favors a positive outcome. This is partly because we want our plan to work or our decision to be right; our brains trick us into selectively finding assumptions and interpretations that look good from our personal point of view.

All of this involves risk, and we need to accept risk. But sometimes we know we do not have sufficient information, or we feel a false sense of safety, and we still press on with our conclusion, plan or decision.

What Affects Risk Taking?
Risk is determined by our feeling about a given situation, and that feeling can easily be wrong. It may be shaped by an incorrect perception of control. This illusion of control is fed by thinking positively about our skills, experience, technical equipment, hard training and familiarity with the situation. People forget that missing knowledge and over-estimation can lead to bad decisions.

“The Human Element – A Guide to Human Behavior in the Shipping Industry” gives the example of a deckhand who was washed overboard: in heavy weather he secured himself only by wrapping an arm around the pulpit rail instead of using his harness. He took a risk; by his own perception his decision was good enough, but events clearly proved otherwise. Perceived familiarity played a role here – it was not the first time the deckhand had secured himself like this, so the situation seemed familiar and hence controllable. Another influence is perceived value: when something supports a higher goal and could bring us a big step closer to achieving it, the more we desire it, the less risky it appears.

How Decisions are Made
To make a good decision based on the information we have, we need to work through all the options, considering alternatives and interpreting all the facts. Decision-making is therefore a very time-consuming task, and we must decide how efficient or how thorough to be, since time is a valuable and rare commodity in shipping.

That presents a problem, because it implies a tradeoff between safety (through thorough investigation) and profitability (through deciding quickly). In reality, companies need both at the same time, and a company’s culture dictates which quality is favored. Pressure from the company’s demand for efficiency shifts perception so that thoroughness is valued less, and seafarers feel the need to work as efficiently and as quickly as they can. It is difficult to consciously act against this urge, and people need adequate training to recognize when it is better to be more thorough.

USCGC CUYAHOGA: The Last Voyage
At about 2100 hours on 20 October 1978, in an area about 3.5 miles northwest of Smith Point, which marks the mouth of the Potomac River as it empties into the Chesapeake Bay, catastrophe occurred.

The Argentine coal freighter Santa Cruz II, a 521-foot bulk carrier, hit the Cuyahoga on her starboard side between amidships and the stern. A consensus of accounts indicated that the cutter was dragged backwards for a minute, then fell away from the freighter, rolled onto her side, and sank within a couple of minutes. The Santa Cruz II rescued 18 survivors from the water and remained on scene until help arrived. The remaining 11 men aboard the Cuyahoga were lost. Four days after the accident, a Marine Board of Inquiry convened at the Marine Safety Office in Baltimore, Maryland, to investigate the accident.

After some delay due to heavy seas and high winds, two massive floating cranes were used to raise the Cuyahoga, which lay in 57 feet of water. After an initial inspection, the ship was placed on barges and towed 65 miles to Portsmouth for a full examination.

The Marine Casualty Report, number USCG 16732/92368 and dated July 31, 1979, concluded: “The Commandant has determined that the proximate cause of the casualty was that the commanding officer of the USCGC CUYAHOGA failed to properly identify the navigation lights displayed by the M/V SANTA CRUZ II. As a result he did not comprehend that the vessels were in a meeting situation, and altered the CUYAHOGA’s course to port, taking his vessel into the path of the SANTA CRUZ II.” The Cuyahoga was later sunk off the coast of Virginia as an artificial reef. (Source: US Coast Guard)

The Author
Matti Bargfried (M.A.) is Head of Marketing at the maritime IT company CODie software products e.K. Specializing in sales management, strategic marketing and SEO, he has served the maritime industry for 10 years. CODie is Germany’s second-biggest vendor of fleet, crew and safety management software.