More Skepticism of Behavioral Social Science

Here at O&M we have been somewhat skeptical of the behavioral social science literature. Sure, in laboratory experiments, people often behave in ways inconsistent with “rational” behavior (as defined by neoclassical economics). Yes, people seem to use various rules-of-thumb in making complex decisions. And yet, it’s not clear that the huge literature on such biases and heuristics tells us much we don’t already know.

An interesting essay by Steven Poole argues the behavioralists’ claims are overstated, mainly by relying on a narrow, superficial notion of rationality as the benchmark case. Contemporary psychology suggests that people interpret the questions posed in laboratory experiments in a nuanced, contextual manner in which their seemingly “irrational” answers are actually reasonable.

There are many other good reasons to give ‘wrong’ answers to questions that are designed to reveal cognitive biases. The cognitive psychologist Jonathan St B T Evans was one of the first to propose a ‘dual-process’ picture of reasoning in the 1980s, but he resists talk of ‘System 1’ and ‘System 2’ as though they are entirely discrete, and argues against the automatic inference from bias to irrationality. . . . In general, Evans concludes that a ‘strictly logical’ answer will be less ‘adaptive to everyday needs’ for most people in many such cases of deductive reasoning. ‘A related finding,’ he continues, ‘is that, even though people may be told to assume the premises of arguments are true, they are reluctant to draw conclusions if they personally do not believe the premises. In real life, of course, it makes perfect sense to base your reasoning only on information that you believe to be true.’ In any contest between what ‘makes perfect sense’ in normal life and what is defined as ‘rational’ by economists or logicians, you might think it rational, according to a more generous meaning of that term, to prefer the former. Evans concludes: ‘It is far from clear that such biases should be regarded as evidence of irrationality.’

Poole also argues strongly against the liberal-paternalist “nudges” advocated by Cass Sunstein and Richard Thaler, noting that “there is something troubling about the way in which [nudging] is able to marginalise political discussion.” Moreover, “nudge politics is at odds with public reason itself: its viability depends precisely on the public not overcoming their biases.” Poole concludes:

[T]here is less reason than many think to doubt humans’ ability to be reasonable. The dissenting critiques of the cognitive-bias literature argue that people are not, in fact, as individually irrational as the present cultural climate assumes. And proponents of debiasing argue that we can each become more rational with practice. But even if we each acted as irrationally as often as the most pessimistic picture implies, that would be no cause to flatten democratic deliberation into the weighted engineering of consumer choices, as nudge politics seeks to do. On the contrary, public reason is our best hope for survival. Even a reasoned argument to the effect that human rationality is fatally compromised is itself an exercise in rationality. Albeit rather a perverse, and – we might suppose – ultimately self-defeating one.

Worth a read. Even climate-change skepticism gets a nod, in a form consistent with some reflections here.

I think it makes sense to question our rationality and to view ourselves as we actually are: a confederation of parts that more or less work in tandem (emphasis on more or less). Some of the criticisms Poole levels in his essay seem justified, but they refute only the laboratory experiments.

Studies on motivated reasoning and political partisanship have shown that the more you know, the better you are at justifying your previously held position, the less likely you are to take conflicting information seriously, and the less likely you are to be critical of something that agrees with you. Because this holds true even when your beliefs are internally contradictory, I think it is more damning to the idea of rationality than Poole lets on.

But also…

“—that despite the tendency of motivated reasoners to ignore evidence inconsistent with their preferred beliefs (here, a candidate they viewed positively)—given enough negative information, attitude change would occur. They find some support for this notion. Although the research is encouraging, it remains to be seen whether a tipping point exists to the extent that ideology concedes to evidence. Learning that a favored candidate is not as admirable as once thought is one thing; accepting evolution or climate change and adjusting one’s literalist approach to religion accordingly is quite another.”

And since they correctly used “method” and not “methodology”, you have to believe them!