Decision Traps and Ethical Thinking

The views expressed are those of the author and do not necessarily reflect the views of ASPA as an organization.

By Terry Newell | November 8, 2016

On May 6, 2004, the FBI arrested an American, Brandon Mayfield, as a material witness in the deadly March 11 bombings of commuter trains in Madrid. Partial fingerprints on a bag containing detonating devices had been shared with the FBI. An examiner, using the FBI’s computerized fingerprint database, identified the prints as Mayfield’s (“100 percent verified”). Two additional examiners concurred.

The ensuing investigation revealed that Mayfield was an attorney and a Muslim (he had converted after meeting his wife, an Egyptian national). He had offered legal aid to Jeffrey Leon Battle in a child custody case. Battle, one of the “Portland Seven,” had been convicted of trying to travel to Afghanistan to help the Taliban.

On April 13, Spanish authorities had told the FBI that their examination of Mayfield’s fingerprints did not yield a match. The FBI sent an examiner to Madrid to justify its own conclusion.

Mayfield was held for two weeks, with no access to family and limited access to legal counsel. On May 17, the court appointed an independent expert to review the FBI’s fingerprint identification. The expert concurred. That same day, the Spanish National Police informed the FBI it had positively identified the fingerprint as belonging to an Algerian national named Ouhnane Daoud. The court released Mayfield to home detention the next day.

On May 24, the FBI withdrew its identification of Mayfield. A formal apology and a $2 million settlement followed. What went wrong?

Two Department of Justice investigations found several errors in the Mayfield case. The FBI was under intense pressure. Contrary to policy, the verification by later examiners was tainted by knowing what the initial examiner found. Unable to accept the possibility of error, the FBI doubled down on its insistence that Mayfield was a terrorist. The fact that Mayfield was a Muslim hardened its stance.

The FBI did not set out to act unethically. Yet it did. Its actions cast light on silent traps that face everyone involved in making ethical (and other) decisions.

Attribution error: We sometimes attribute traits or intentions to someone because of their appearance, religion or other group membership. The FBI considered Mayfield a terrorist because he was a Muslim and had defended a Taliban supporter.

Confirmation bias: Once a decision has been made, we look for confirming evidence. After the first examiner found a “match,” so did the other two “independent” examiners and even a court-appointed examiner.

Overconfidence: We can become so sure of our judgment that we ignore contrary views.

Threats to status: Research shows that our brains react to status threats in the same way they react to physical pain. They hurt, and the resulting sense of social isolation makes us defensive.

Sunk costs: Once we put a lot of effort into a decision, we’re reluctant to change course.

Thinking Too Fast

Answer this question, posed by Nobel laureate and behavioral economist Daniel Kahneman. Write down the first answer that comes to you before reading further:

Together, a bat and ball cost $1.10. If the bat costs one dollar more than the ball, how much does the ball cost?

If you answered 10 cents, you’re with most people. But the correct answer is five cents: the ball costs $0.05 and the bat $1.05, and $0.05 + $1.05 = $1.10.
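A quick way to see why the intuitive answer fails is to turn the two stated constraints into a check. This is a minimal sketch (the function name is mine, not part of the puzzle):

```python
# Bat-and-ball problem: together they cost $1.10, and the bat
# costs $1.00 more than the ball. Work in cents to avoid
# floating-point issues.

def check(ball_cents):
    bat_cents = ball_cents + 100          # bat is $1.00 more than the ball
    return bat_cents + ball_cents == 110  # together they must total $1.10

print(check(10))  # False: 10 + 110 = 120 cents, not 110
print(check(5))   # True:   5 + 105 = 110 cents
```

The fast-thinking answer of 10 cents fails because it forgets that raising the ball’s price also raises the bat’s price, pushing the total to $1.20.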

Kahneman says we get this wrong because we have two systems with which we think. The fast system uses mental heuristics to come up with quick answers. In routine situations, it works, saving us time and mental effort. But the slow system, where we question our assumptions and engage in more logical thinking, is sometimes needed. Under pressure, the FBI used fast thinking (Mayfield was a Muslim = likely terrorist) when it needed to slow down.

Avoiding Decision-Making Traps

These are just some of the decision-making traps that can lead to sloppy ethical behavior. To avoid them:

Probe your emotions. They may lead you to mental errors (e.g., thinking fast).

Take more time to “think slow.”

Test assumptions, meaning your mental model of the world. It can include stereotypes that prejudice your thinking.

Bring in independent viewpoints from those outside the decision chain and reward dissent to diminish confirmation bias.

Raise the status of lower level staff. Give them the independence and confidence to disagree. Protect their positions and latitude in doing so.

Consider the full, downstream fiscal and organizational costs of a course of action. This may short-circuit the sunk costs tendency.

Use a “devil’s advocate” and/or host a “second chance” meeting to revisit initial decisions that may have been prone to poor thinking.

Keep core values in mind. If you subordinate them to the desire for conformity and following the chain of command, the likelihood of unethical action increases.

Author: Terry Newell is President of his training firm, Leadership for a Responsible Society, and is the former Dean of Faculty of the Federal Executive Institute. His latest book is To Serve with Honor: Doing the Right Thing in Government. He can be reached at [email protected]

