The introspection illusion is a cognitive illusion in which people wrongly think they have direct insight into the origins of their mental states. In certain situations, this illusion leads to people making confident but false explanations of their own behavior or predictions about their future mental states. The illusion has been examined in a number of different psychological experiments, and suggested as a basis for other biases. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behavior.[1]

When people mistake unreliable introspection for genuine self-knowledge, the result can be an illusion of superiority over other people: each person, for example, thinks they are less biased and less conformist than the rest of the group.[2] Even when experimental subjects are provided with reports of other subjects' introspections, in as detailed a form as possible, they still rate those other introspections as unreliable while treating their own as reliable.[2] Although the hypothesis of an introspection illusion informs some psychological research, the existing evidence is arguably inadequate to decide how reliable introspection is in normal circumstances.[3]


[I]ntrospection does not provide a direct pipeline to nonconscious mental processes. Instead, it is best thought of as a process whereby people use the contents of consciousness to construct a personal narrative that may or may not correspond to their nonconscious states.
—Timothy D. Wilson and Elizabeth W. Dunn (2004)[4]

The idea of an introspection illusion was first put forward in a controversial 1977 paper by psychologists Richard Nisbett and Timothy D. Wilson,[5] which became one of the most cited papers in the scientific study of consciousness.[6] They had performed a series of experiments in which subjects verbally explained why they had a particular preference, or how they had arrived at a particular idea. On the basis of these studies and existing attribution research, they concluded that reports on mental processes are confabulated. They wrote that subjects had "little or no introspective access to higher order cognitive processes".[7] They distinguished between mental contents (such as feelings) and mental processes, arguing that while introspection gives us access to contents, processes remain hidden.

Although some other experimental work followed from the Nisbett and Wilson paper, difficulties with testing the hypothesis of introspective access meant that research on the topic generally stagnated.[6] A ten-year-anniversary review of the paper raised several objections, questioning the idea of "process" they had used and arguing that unambiguous tests of introspective access are hard to achieve.[3]

Updating the theory in 2002, Wilson admitted that the 1977 claims had been too far-reaching.[7] He instead relied on the psychological idea of the adaptive unconscious, which does much of the moment-to-moment work of perception and behavior. When people are asked to report on their mental processes, they cannot access this unconscious activity.[4] However, rather than acknowledge their lack of insight, they confabulate a plausible explanation, being in effect "unaware of their unawareness".[8]

Inspired by the Nisbett and Wilson paper, Petter Johansson and colleagues investigated subjects' insight into their own preferences using a new technique.[9] Subjects saw two photographs of people and were asked which they found more attractive. They were given a closer look at their "chosen" photograph and asked to verbally explain their choice. However, using sleight of hand, the experimenter had slipped them the other photograph rather than the one they had chosen. A majority of subjects failed to notice that the picture they were looking at did not match the one they had chosen just seconds before. Many offered confabulated explanations of their preference. For example, a man might say "I preferred this one because I prefer blondes" when he had in fact chosen (and pointed to) the dark-haired woman, but was handed a blonde.[6] These explanations must have been confabulated because they account for a choice that was never made.[10]

The large proportion of subjects who were taken in by the deception contrasts with the 84% who, in post-test interviews, said that hypothetically they would have detected a switch if it had been made in front of them.[11] The researchers coined the term choice blindness for this failure to detect a mismatch.

A follow-up experiment involved shoppers in a supermarket tasting two different kinds of jam, then verbally explaining their choice while taking further spoonfuls from the "chosen" pot. The pots were rigged so that, when explaining their choice, the subjects were tasting the jam they had previously rejected. A similar experiment was also done with tea. Another variation involved subjects choosing between two objects displayed on PowerPoint slides, then explaining their choice after the description of what they had chosen had been altered.[12]

Research by Paul Eastwick and Eli Finkel at Northwestern University also undermined the idea that subjects have direct introspective awareness of what attracts them to other people.[13] These researchers examined male and female subjects' reports of what they found attractive. Men typically reported that physical attractiveness was crucial while women identified earning potential as most important. These subjective reports did not predict their actual choices in a speed dating context, or their dating behavior in a one-month follow-up.

Consistent with choice blindness, Henkel and Mather (2007) found that people are easily convinced by false reminders that they chose different options than they actually chose and that they show greater choice-supportive bias in memory for whichever option they believe they chose.[14]

Research by Emily Pronin has found that people regard their own introspections as trustworthy indicators of their own actions, motives or preferences, but do not trust other people's introspections to be similarly reliable.[15]
In a review article, she argues that over-reliance on introspected feelings and intentions is a factor in a number of different biases. For example, by focusing on their current good intentions, people can overestimate their likelihood of behaving virtuously.[16] Some studies have used the introspection illusion to explain specific biases.

The bias blind spot is an established phenomenon in which people rate themselves as less susceptible to bias than their peer group. Emily Pronin and Matthew Kugler have argued that this phenomenon is due to the introspection illusion.[17] In their experiments, subjects had to make judgments about themselves and about other subjects.[2] They displayed standard biases, for example rating themselves above the others on desirable qualities (demonstrating illusory superiority). The experimenters explained cognitive bias, and asked the subjects how it might have affected their judgment. The subjects rated themselves as less susceptible to bias than others in the experiment (confirming the bias blind spot). When they had to explain their judgments, they used different strategies for assessing their own and others' bias.

Pronin and Kugler's interpretation is that when people decide whether someone else is biased, they use overt behavior. On the other hand, when assessing whether or not they themselves are biased, people look inward, searching their own thoughts and feelings for biased motives.[17] Since biases operate unconsciously, these introspections are not informative, but people wrongly treat them as a reliable indication that they themselves, unlike other people, are immune to bias.

Pronin and Kugler tried to give their subjects access to others' introspections. To do this, they made audio recordings of subjects who had been told to say whatever came into their heads as they decided whether their answer to a previous question might have been affected by bias.[2] Although subjects persuaded themselves they were unlikely to be biased, their introspective reports did not sway the assessments of observers.

When asked what it would mean to be biased, subjects were more likely to define bias in terms of introspected thoughts and motives when it applied to themselves, but in terms of overt behavior when it applied to other people.[2] When subjects were explicitly told to avoid relying on introspection, their assessments of their own bias became more realistic.[2]

Another series of studies by Pronin and colleagues examined perceptions of conformity.[18] Subjects reported being more immune to social conformity than their peers. In effect, they saw themselves as being "alone in a crowd of sheep". The introspection illusion appeared to contribute to this effect. When deciding whether others respond to social influence, subjects mainly looked at those others' overt behavior, for example explaining other students' opinions on a political issue in terms of following the group. When assessing their own conformity, subjects treated their own introspections as reliable. In their own minds, they found no motive to conform, and so decided that they had not been influenced.

Psychologist Daniel Wegner has argued that an introspection illusion contributes to belief in paranormal phenomena such as psychokinesis.[19] He observes that in everyday experience, intention (such as wanting to turn on a light) is followed by action (such as flicking a light switch) in a reliable way, but the processes connecting the two are not consciously accessible. Hence, although subjects may feel that they directly introspect their own free will, the experience of control is actually inferred from relations between the thought and the action. This theory, called "apparent mental causation", acknowledges the influence of David Hume's view of the mind.[19] This process for detecting when one is responsible for an action is not totally reliable, and when it goes wrong there can be an illusion of control. This could happen when an external event follows, and is congruent with, a thought in someone's mind, without an actual causal link.[19]

As evidence, Wegner cites a series of experiments on magical thinking in which subjects were induced to think they had influenced external events. In one experiment, subjects watched a basketball player taking a series of free throws. When they were instructed to visualise him making his shots, they felt that they had contributed to his success.[20]

If the introspection illusion contributes to the subjective feeling of free will, then it follows that people will more readily attribute free will to themselves than to others. This prediction has been confirmed by three of Pronin and Kugler's experiments. When college students were asked about personal decisions in their own and their roommate's lives, they regarded their own choices as less predictable. Staff at a restaurant described their co-workers' lives as more determined (having fewer future possibilities) than their own lives. When weighing up the influence of different factors on behavior, students gave desires and intentions the strongest weight for their own behavior, but rated personality traits as most predictive of other people.[21]

Johansson, Petter; Hall, Lars; Sikström, Sverker; Tärning, Betty; Lind, Andreas (2006). "How something can be said about telling more than we can know: On choice blindness and introspection". Consciousness and Cognition 15 (4): 673–692.

Pronin, Emily; Berger, Jonah; Molouki, Sarah (2007). "Alone in a Crowd of Sheep: Asymmetric Perceptions of Conformity and Their Roots in an Introspection Illusion". Journal of Personality and Social Psychology 92 (4): 585–595.