4 Outcomes of Lazy Thinking

Using heuristics to understand why people fall prey to fake news.

In a recent New York Times article, "Why Do People Fall for Fake News?", psychologists Gordon Pennycook and David Rand discuss people's susceptibility to (strategic) misinformation. They present two leading explanations for why people believe such misinformation, which essentially boil down to rationalization and laziness. Much like the focus of many past posts on this blog, the article tackles faulty thinking and barriers to Critical Thinking.

With respect to rationalization, what is discussed is in many ways associated with confirmation bias: when presented with (mis)information, particularly on a politically charged issue, people rationalize the information in a manner that persuades them into believing what they want to be true. What can make this difficult to distinguish from fact (assuming one is not privy to the facts) is that rationalizing in this manner does require reasoning; so, telling someone that they aren't using their reasoning when they infer an inaccurate conclusion through rationalization isn't correct either. Rather, they're applying their reasoning incorrectly and uncritically.

The laziness perspective is by no means new; it's consistent with the work of two Nobel Prize winners. According to Daniel Kahneman (2011), we are lazy thinkers and cognitive misers, given that it takes additional cognitive effort to assimilate information and simultaneously assess its truth; and according to Herbert Simon (1957), rather than engage in extensive, reflective decision-making, people generally settle for a decision or solution that is satisfactory, or simply "good enough" (i.e. satisficing). This is why people go with their gut, their intuition, which is a big no-no for critical thinking. That said, intuition is generally good at what it does: it saves our cognitive energy for things that matter and staves off decision fatigue. However, people often apply it to "things that matter" as well; hence the problem of people falling for misinformation.

An important aspect of the laziness we've been discussing is the tendency to apply a general mental framework to information: a heuristic. A large body of research indicates that the manner in which intuitive judgments come to mind, and why they are often limited or grossly incorrect, results from heuristic-based thinking (Kahneman, 2011; Tversky & Kahneman, 1974). A heuristic is a "simple," experience-based protocol for problem-solving and decision-making that acts as a mental shortcut: a "procedure that helps find the adequate, though often imperfect, answers to difficult questions" (Kahneman, 2011, p. 98). The misapplication of heuristics can lead to fallacious reasoning and cognitive biases (Gilovich, Griffin, & Kahneman, 2002; Kahneman, 2011; Kahneman, Slovic, & Tversky, 1982; Slovic, Fischhoff, & Lichtenstein, 1977; Tversky & Kahneman, 1974).

Let’s now turn our attention to four specific heuristics—three classic ones famously identified by Tversky and Kahneman (1974) and a more modern, though commonly encountered one—that is, four outcomes of lazy thinking.

1. The Availability Heuristic (Tversky & Kahneman, 1974) refers to a mental framework whereby people base a judgment on the ease with which they can bring relevant information, such as an example, to mind. Consider the classic example:

Are there more words that begin with the letter ‘K’ than those with ‘K’ as the third letter?

Without adequate reflection, many answer that there are more words that begin with the letter "K," given the relative ease with which these words come to mind compared with words that have "K" as their third letter (i.e. when searching your memory for words with the letter "K," kitten comes to mind faster than hake). In reality, however, there are substantially more words in the English language with "K" as their third letter than with "K" as their first. Now, in a more real-world context, how often things appear on the news can lead us to think that such occurrences are much more common than is actually the case.
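The "K" question reduces to a simple count over a word list. A minimal sketch of that count follows; note that the eight-word sample list here is purely illustrative (an assumption for demonstration), so its counts say nothing about English as a whole. To test the real claim you would feed in a full dictionary, such as a system word list.

```python
def k_position_counts(words):
    """Return (words starting with 'k', words with 'k' as third letter)."""
    first = sum(1 for w in words if len(w) >= 1 and w[0].lower() == "k")
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == "k")
    return first, third

# Illustrative sample only -- not representative of English.
sample = ["kitten", "king", "hake", "ask", "make", "acknowledge", "bike", "lake"]
print(k_position_counts(sample))  # (2, 6)
```

Even in this tiny hand-picked sample, third-letter "K" words outnumber first-letter ones, yet the first-letter words (kitten, king) are the ones that spring to mind: ease of retrieval, not frequency, drives the intuitive answer.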

2. The Representativeness Heuristic (Tversky & Kahneman, 1974) is a shortcut for judging the likelihood of a phenomenon. When individuals rely on this heuristic, they are often wrong as a result of substituting what they perceive as representative of the real world for the actual likelihood of something. For example:

Which outcome is more likely when playing at a fair roulette table?

Black, Red, Red, Black, Red, Black

or

Black, Black, Black, Red, Red, Red

Again, without adequate reflection (or perhaps statistical knowledge), many fail to realize that both sequences are equally likely, instead reporting that the first is more likely because it better represents what they consider "random." Intuition has a very poor understanding of statistics and, in particular, of the nature of true randomness (Kahneman, 2011). In a more real-world context, stereotyping is a prime example of the representativeness heuristic at work: we apply information we hold as representative of a category to a novel situation or person (e.g. librarians are quiet and organized).
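The arithmetic behind "both sequences are equally likely" is worth making explicit. The sketch below assumes, for simplicity, a table where black and red are equally likely (ignoring the green zero, which on a real wheel makes each colour slightly less likely than 1/2 but still leaves every specific colour sequence equally probable). Each of the six spins is independent, so any specific sequence has probability (1/2) to the power of 6.

```python
def sequence_probability(seq, p_black=0.5):
    """Probability of one specific sequence of independent spins.

    'B' = black, anything else = red; p_black is assumed 0.5 here
    (a simplification that ignores the green zero).
    """
    p = 1.0
    for outcome in seq:
        p *= p_black if outcome == "B" else (1 - p_black)
    return p

alternating = ["B", "R", "R", "B", "R", "B"]
streaky     = ["B", "B", "B", "R", "R", "R"]

print(sequence_probability(alternating))  # 0.015625, i.e. (1/2)**6
print(sequence_probability(streaky))      # 0.015625, exactly the same
```

The "streaky" run only feels less likely because it looks less like our mental picture of randomness; the probabilities are identical.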

3. The Anchoring (and Adjustment) Heuristic (Tversky & Kahneman, 1974) refers to a mental rule of thumb in which the initial piece of information presented (often a numerical value) is used as a starting point, and a subsequent judgment or decision is made by adjusting away from this anchor. For example, consider the following two questions in turn:

Was Winston Churchill more or less than 40 when he died?

How old was Churchill when he died?

The vast majority realize that Churchill was well over 40 when he died; nevertheless, the first question implies that he was around 40, which primes people to consider that he was not as old as they initially thought. Perhaps 55–65 years old seems reasonable to suggest? However, this would be wrong. Now, consider the following questions:

Was Winston Churchill more or less than 120 when he died?

How old was Churchill when he died?

Though 120 years old may seem like a preposterous age to many (and many will, of course, have answered "less"), it may have allowed you to adjust your answer to an age somewhere between 85 and 95, which, in its own right, is a considerable age. Interestingly, this would be accurate: Churchill died at 90, and it seems that the more preposterous anchor was in fact the more reasonable one (this example being a spin on the classic Gandhi anchoring-and-adjustment example). Herein lies the contradictory nature of heuristics: people very often fall prey to an anchor because, in many situations, anchoring is the reasonable thing to do. Sometimes, when we are presented with difficult questions, we have a tendency to clutch at straws, and the anchor is a plausible straw (Kahneman, 2011).

Again, in a more real-world context, imagine shopping for a used car and finding one in which you're interested. You ask the salesman for the price and he tells you that the car costs $8,900. Though you realize that used car salesmen are stereotyped as notorious for overcharging (ahem, representativeness heuristic), it remains that the counter-offer you propose is based on his initial suggestion and may still be well over the actual value of the car. So, in situations where negotiation is likely, be sure to be the first to offer a price!

4. The Affect Heuristic (Kahneman & Frederick, 2002) refers to a mental shortcut in which judgments are made in light of the thinker's current emotion (e.g. disappointment or pleasure). Kahneman and Frederick's research on this heuristic is based in part on a study conducted by Strack, Martin, and Schwarz (1988), in which college students were asked the following questions:

How happy are you with your life in general?

How many dates did you have last month?

The correlation between answers to the two questions was small when they were asked in this order, but substantially larger when the order was switched, suggesting that individuals who were asked the dating question first were primed by a question about their romantic life, which elicited an emotional reaction. This reaction heavily influenced the manner in which the following question was answered. As with the availability heuristic, individuals who have just been asked how frequently they date have information about their romantic life readily accessible and are likely to draw more heavily upon it than to search other facets of their life for consideration. Once again, in a more real-world example, educators commonly observe that when students are asked difficult questions and do not know the answer (i.e. when they lack the required knowledge), they more often than not respond with a related, emotion-based belief or attitude (Slovic et al., 2002).

Since the work of Tversky and Kahneman (1974), numerous other heuristics have been proposed and researched (e.g. the recognition, similarity, fluency, and effort heuristics, to name a few), all of which are used as a result of lazy thinking; and, likewise, their use has been rationalized. In order to overcome lazy thinking and rationalized biases, and to avoid falling prey to misinformation, Pennycook and Rand, along with many others in the field of Critical Thinking, recommend devoting more time, effort, and resources to the spread of accurate information, as well as to encouraging and training people to think critically.

Strack, F., Martin, L. L., & Schwarz, N. (1988). Priming and communication: Social determinants of information use in judgments of life satisfaction. European Journal of Social Psychology, 18(5), 429–442.

Much of today's news is not directly (in the short term) related to individual survival. The quantity of "news" offered to us today is essentially impossible for most (or anyone) to digest in a truly rational and thoughtful manner. How could we? There isn't enough time. Politics. Economy. Trade. Science. Technology. Sports.

So it is reasonable that drawing incorrect conclusions much of the time has little or no noticeable impact on our survival. Of course, there are negative consequences for some of our decisions, and even then, we may not make the connection between the decision and the outcome. Smart politicians and marketers are very good at tricking others into certain decisions that, if carefully explained, would go the other way.

I believe that we internalize certain "key words and phrases" which evoke emotional responses based on our "experiences." Hearing/reading them seems to short-circuit the thinking process and lead us to a premature conclusion. Somehow "Watch out for the saber tooth tiger!" has the same immediate emotional response as "He is a SOCIALIST!" We didn't need to think much about the saber tooth tiger being "bad," but hearing the word "socialist" immediately shuts down the reasoning of so many people. One requires no reasoning at all, but the other requires much more. Or should.

There certainly is a bombardment of information in the news! You're right, too, there isn't enough time to consider it all! But just because news is there doesn't mean it requires critical thinking. If we don't care about certain news, why apply CT?
So, if we don't apply it to a certain piece of news, we can let intuition address it; and again, you're right: if the intuition is wrong, it will probably have little or no noticeable impact on us (unless we decide we want to start discussing the topic as if we know about it). Again, it's true that people can fail to readily make the connection between a decision and its outcome; and schemas can play a role in that.

I do not have your credentials, but I do follow science news regularly, and I think that there is as much diversity in brain physiology as there is in any other organ. People do think differently in part because their brains are wired differently. So one could postulate that one person's Critical Thinking may be perceived by another as laziness, or something like that. But in fact it may be a person's best effort. The decisions made in the modern world may be quite different from what evolution prepared us for.

I observe that conflict easily arises over what appears to me as rather straightforward and "obvious" statements, and I observe that two parties may be using different definitions of words and/or history, and fail to reconcile those differences before they begin their "debate." I bet you have observed this as well. Some of this may also be a reflection of differences in brain physiology. I hope someone is looking into this.

Thanks for the interest and the chat, Mike! I have another post, "No Such Thing as 'Good' Critical Thinking," which addresses this very issue. Essentially, you either think critically or you don't (in a given situation). Critical thinking consists of dispositions (inclinations, motivations, and willingness to think critically) and knowledge of particular approaches and skills. If you know of such approaches and are willing to apply them, then you are likely to think critically. Sure, there are different levels of knowledge, but if people are educated in such specific approaches, that's the main hurdle. With that, I do think there exists something like a threshold for critical thinking, but there's no research on this, to my knowledge, so it is simply reasoned speculation. In reality, though, you either think critically about a particular topic or you don't. It's not like, "Oh, I did think critically, but it wasn't good enough"; that's more like the description of someone who used an intuitive judgment that failed!

If we thought critically all the time, people wouldn't do extraordinary things. Huge accomplishments in the history of mankind would never have happened, because logically the odds were overwhelmingly against them.

We start families, projects, businesses because our desire and emotions behind them are stronger than our logical thinking.

Also, if it weren't for intuition, there would be no entrepreneurs.
There is no data, prior history, logical argument, or statistical support to justify many business decisions that have changed the world (while, yes, many others also failed miserably).

So we need more / better critical thinking and less of it. It's one of those wonderful paradoxes.

We can't think critically all of the time, because we neither have the energy nor the time. However, we need to think critically about the things we care about. Yes, we start families, projects, businesses because of emotional influence, but if we want to see them flourish and succeed, then critical thinking (of which logical thinking is part) is necessary.

It's not just entrepreneurs; intuition hits us all upon every decision. The thing about intuition is that it never shuts off: it always gives us its opinion. Critical thinkers engage their intuition, but don't impulsively act on it. They reflect on and evaluate it. If the initial, intuitive judgment turns out to be correct, then the critical thinker will apply it. Yes, gut instincts are sometimes right, but it would be foolish to apply them without appropriate evaluation. Recent research indicates that delaying decisions by as little as a tenth of a second can significantly increase judgment accuracy.

So yes, intuition can be right. If it is, our critical thinking will have copped that and applied it. However, in important situations, such as business settings (where money is on the line), intuition should never be applied without evaluation.