Friday, March 29, 2013

Got cognitive dissonance? Try hypocrisy! or Rationalizing Animal #6

Call someone a hypocrite and they may think you're calling them a liar, but hypocrites usually aren't liars. Or if they are, they're people who have fallen for their own lies.

The most famous example of calling out hypocrisy may be Jesus's "Thou hypocrite, first cast out the beam out of thine own eye; and then shalt thou see clearly to cast out the mote out of thy brother's eye." That's not addressed to a liar. That's addressed to someone who can't see what's blocking his vision. It's a description of someone suffering from cognitive dissonance, the conflict that comes from having irreconcilable beliefs and needs, a problem many people solve with hypocrisy.

The classic example: The fox wants the grapes, the fox can't get the grapes, the fox decides it doesn't want the grapes. It's easier for the fox to have a new belief than to live with the knowledge that what it wants is beyond its grasp. The fox isn't lying. The fox is deluding itself. Though their actions may have horrible consequences, hypocrites deserve our pity. From The Psychology of Hypocrisy:

Hypocrisy is among the most universal and well-studied of psychological phenomena, and the research suggests that Craig, Haggard and the others may be guilty not so much of moral hypocrisy as moral weakness. The distinction may sound trivial at first, but as a society, we tend to forgive the weak and shun the hypocritical. As psychologists Jamie Barden of Howard University, Derek Rucker of Northwestern and Richard Petty of Ohio State have shown, we often use a simple temporal cue to distinguish between the weak and the hypocritical: if you say one thing and then do another, you are much less likely to be forgiven than if you do one thing and then say another. Barden, Rucker and Petty use this example: a radio host says on-air that he's joining a fitness organization but then eats pizza for a week and gains five pounds. Hypocrite! Now consider the reverse order: the host eats pizza for a week and then publicly joins a fitness group. "In each case," the psychologists wrote in a 2005 paper in the Personality and Social Psychology Bulletin, "the statements and behaviors are equally inconsistent." But we see something almost noble about the second scenario.

To investigate cognitive dissonance, neuroscientists at the University of California, Davis, led by Cameron Carter, used functional magnetic resonance imaging (fMRI) to study the brains of volunteers who were made to experience the psychological pain of clashing beliefs and actions. Specifically, the volunteers spent 45 minutes doing a boring task inside the cramped fMRI tube, after which they answered written questions indicating how they felt about the experience, which they did not enjoy. To induce cognitive dissonance, the subjects were then asked to answer the questions again, and to say this time that they enjoyed being in the scanner. Some of them were told their answers were being read by a nervous patient who needed reassurance. The other participants were told that they would get $1 each time they answered the questions as though they were enjoying the scanner, but they were not given the worried-patient cover story.

While the subjects faked enjoyment, two brain regions were particularly active in both groups: the dorsal anterior cingulate cortex (dACC) and the anterior insula. One of the functions of the dACC is to detect conflicts between incompatible bits of information; it is especially active when a person lies. The anterior insula has a similar job description, monitoring psychological conflicts such as a clash between stated beliefs and true ones. The scientists, writing in Nature Neuroscience, call this extra activity in the dACC and insula "the neural representation of cognitive dissonance." Basically, "the more that participants in the dissonance group 'lied' [about enjoying the fMRI], the greater was…activation" of these regions: they detected when beliefs and actions parted ways.