Thinking about thinking: risk analysis edition

There is a great chapter in Morgan D. Jones’s (1998) book The Thinker’s Toolkit. It is called “Thinking about Thinking,” and its primary thesis is that the human mind is not analytical by nature. He explores the fallibility of human reasoning and suggests that the best remedy for the mind’s ineffectiveness is to impose some structure on the way we think. This is what risk analysis does well. Jones argues, persuasively in my opinion, that structuring our analyses is at odds with the way our minds work naturally. He argues that, left to our own devices, we often begin a problem analysis by formulating our conclusions, thus beginning where we ought to end.

With an intuitive preference for one solution, often the first one that seems satisfactory, we tend to give insufficient consideration to alternative solutions. We tend to confuse the gathering and analysis of data with the process of thinking about a problem. Jones identifies seven traits that get in the way of our ability to analyze and solve problems. Risk analysis attempts to influence our thinking by making it more analytical. This can simultaneously limit the “damage” our fallible human reasoning can inadvertently do to decision making.

The seven traits that tend to skew our analysis:

Letting our emotions run wild and distort our thinking. This is where the age-old advice to “sleep on” big decisions comes from: with distance, things get clearer, provided we keep our biases in check.

Relying on unchecked heuristics. Heuristics are helpful in many cases, but left alone and unexamined, without any attempt to analyse outcomes, they can lead us astray. This is the drum Kahneman keeps beating, and one that Gigerenzer, for one, rightfully disputes.

Superficial logic constrained by our biases. Our biases play a major role in all of the other traits, so being aware of them is a good start.

Looking for an explanation even when there isn’t one. Seeing causation in mere correlation. We need to explain things to ourselves in order to move on, whether or not the explanation is correct.

Confirmation bias. Every piece of information that confirms our existing beliefs feels good. We marshal evidence to support our chosen explanation and discard anything that makes us doubt it.

Clinging to what we know or have, regardless of whether it is correct, good, or useful. This often plays out in organisations where people “throw good money after bad,” patching things that should rationally have been scrapped already.