Two common (and recent) mistakes about dual process reasoning and cognitive bias

"Dual process" theories of reasoning -- which have been around for a long time in social psychology -- posit (for the sake of forming and testing hypotheses; positing for any other purpose is obnoxious) that there is an important distinction between two types of mental operations.

Very generally, one of these involves largely unconscious, intuitive reasoning and the other conscious, reflective reasoning.

Kahneman calls these "System 1" and "System 2," respectively, but as I said the distinction is of long standing, and earlier dual process theories used different labels (I myself like "heuristic" and "systematic," the terms used by Shelley Chaiken and her collaborators; the "elaboration likelihood model" of Petty & Cacioppo uses different labels but is very similar to Chaiken's "heuristic-systematic model").

Kahneman's work (including most recently his insightful and fun synthesis "Thinking, Fast and Slow") has done a lot to focus attention on dual process theory, both in scholarly research (particularly in economics, law, public policy & other fields not traditionally frequented by social psychologists) and in public discussion generally.

Still, there are recurring themes in works that use Kahneman’s framework that reflect misapprehensions that familiarity with the earlier work in dual process theorizing would have steered people away from.

I'm not saying that Kahneman — a true intellectual giant — makes these mistakes himself or that it is his fault others are making them. I'm just saying that these mistakes get made, with depressing frequency, by those who have come to dual process theory solely through the Kahneman System 1-2 framework.

Here are two of those mistakes (there are more but these are the ones bugging me right now).

1. The association of motivated cognition with "system 1" reasoning.

"Motivated cognition," which is enjoying a surge in interest recently (particularly in connection with disputes over climate change), refers to the conforming of various types of reasoning (and even perception) to some goal or interest extrinsic to that of reaching an accurate conclusion. Motivated cognition is an unconscious process; people don't deliberately fit their interpretation of arguments or their search for information to their political allegiances, etc. -- this happens to them without their knowing, and often contrary to aims they consciously embrace and want to guide their thinking and acting.

The mistake is to think that because motivated cognition is unconscious, it affects only intuitive, affective, heuristic or "fast" "System 1" reasoning. That's just false. Conscious, deliberative, systematic, "slow" "System 2" reasoning can be affected as well. That is, commitment to some extrinsic end or goal -- like one's connection to a cultural or political or other affinity group -- can unconsciously bias the way in which people consciously interpret and reason about arguments, empirical evidence and the like.

One way to understand this earlier and ongoing work is that where motivated reasoning is in play, people will predictably condition the degree of effortful mental processing on its contribution to some extrinsic goal. So if relatively effortless heuristic reasoning generates the result that is congenial to the extrinsic goal or interest, one will go no further. But if it doesn't -- if the answer one arrives at from a quick, impressionistic engagement with information frustrates that goal -- then one will step up one's mental effort, employing systematic (Kahneman's "System 2") reasoning.

But one will be employing it for the sake of getting the answer that satisfies the extrinsic goal or interest (like affirmation of one's identity-defining cultural group). As a result, the use of systematic or "System 2" reasoning will be biased, inaccurate.

But whatever: Motivated cognition is not a form of or a consequence of "system 1" reasoning. If you had been thinking & saying that, stop.

2. Equation of unconscious reasoning with "irrational" or biased reasoning, and of conscious reasoning with rational, unbiased reasoning.

This leads lots of people to think that heuristic or unconscious reasoning processes are irrational or at least "pre-rational" substitutes for conscious "rational" reasoning. System 1 might not always be biased or always result in error, but it is where biases -- which, on this view, are essentially otherwise benign or even useful heuristics that take a malignant turn -- occur. System 2 doesn't use heuristics -- it thinks things through deductively, algorithmically -- and so "corrects" any bias associated with heuristic, System 1 reasoning.

Wrong. Just wrong.

Indeed, this view is not only wrong, but just plain incoherent.

There is nothing that makes it onto the screen of "conscious" thought that wasn't (moments earlier!) unconsciously yanked out of the stream of unconscious mental phenomena.

Accordingly, if a person's conscious processing of information is unbiased or rational, that can only be because that person's unconscious processing was working in a rational and unbiased way -- in guiding him or her to attend to relevant information, e.g., and to use the sort of conscious process of reasoning (like logical deduction) that makes proper sense of it.

But the point is: This is old news! It simply would not have occurred to anyone who learned about dual process theory from the earlier work to think that unconscious, heuristic, perceptive or intuitive forms of cognition are where "bias" comes from, and that conscious, reflective, systematic reasoning is where "unbiased" thinking lives.

The original dual process theorizing conceives of the two forms of reasoning as integrated and mutually supportive, not as discrete and hierarchical. It tries to identify how the entire system works -- and why it sometimes doesn't, which is why you get bias, which then, rather than being "corrected" by systematic (System 2) reasoning, distorts it as well (see motivated systematic reasoning, per above).

Even today, the most interesting stuff (in my view) that is being done on the contribution that unconscious processes like "affect" or emotion make to reasoning uses the integrative, mutually supportive conceptualization associated with the earlier work rather than the discrete, hierarchical conceptualization associated (maybe misassociated; I'm not talking about Kahneman himself) with System 1/2.

Ellen Peters, e.g., has done work showing that people who are high in numeracy -- and who thus possess the capacity and disposition to use systematic (System 2) reasoning -- don't draw less on affective reasoning (System 1...) when they outperform people low in numeracy at spotting positive-return opportunities.

On the contrary, they use more affect, and more reliably.

In effect, their unconscious affective response (positive or negative) is what tells them that a "good deal" — or a raw one — might well be at hand, thus triggering the use of the conscious thought needed to figure out what course of action will in fact conduce to the person's well-being.

People who aren't good with numbers respond to these same situations in an affectively flat way, and as a result don't bother to engage them systematically.

This is evidence that the two processes are not discrete and hierarchical but rather are integrated and mutually supportive. Greater capacity for systematic (okay, okay, "system 2"!) reasoning over time calibrates heuristic or affective processes (system 1), which thereafter, unconsciously but reliably, turn on systematic reasoning.

So: if you had been thinking or talking as if System 1 equaled "bias" and System 2 "unbiased, rational," please just stop now.

Indeed, to help you stop, I will use a strategy founded in the original dual process work.

As I indicated, believing that consciousness leaps into being without any contribution of unconsciousness is just incoherent. It is like believing in "spontaneous generation."

Because the idea that System 2 reasoning can correct unconscious bias without the prior assistance of unconscious, System 1 reasoning is illogical, I propose to call this view "System 2 ab initio bias."

The effort it will take, systematically, to figure out why this is an appropriate thing for someone to accuse you of if you make this error will calibrate your emotions: you'll come to be a bit miffed when you see examples; and you'll develop a distinctive (heuristic) aversion to becoming someone who makes this mistake and gets stigmatized with a humiliating label.

And voila! -- you'll be as smart (not really; but even half would be great!) as Shelley Chaiken, Ellen Peters, et al. in no time!