Why people act out of line with their beliefs

Tom Stafford

About the author

Tom is a Lecturer in Psychology and Cognitive Science for the Department of Psychology, University of Sheffield, UK. He is the co-author of the bestselling popular science book Mind Hacks and writes for the award-winning blog Mind Hacks which reports on psychology and neuroscience. You can follow him on Twitter at @tomstafford.


If at first you don't succeed, lower your standards. And if you find yourself acting out of line with your beliefs, change them. This sounds like motivational advice from one of the more cynical self-help books, or perhaps a Groucho Marx line (“Those are my principles, and if you don't like them… well, I have others…”), but in fact it is a caricature of one of the most famous theories in social psychology.

Leon Festinger's Dissonance Theory is an account of how our beliefs rub up against each other, an attempt at a sort of ecology of mind. Dissonance Theory offers an explanation of topics as diverse as why oil company executives might not believe in climate change, why army units have brutal initiation ceremonies, and why famous books might actually be boring.

The classic study on dissonance theory was published by Festinger and James Carlsmith in 1959. You can find a copy thanks to the Classics in the History of Psychology archive. I really recommend reading the full thing. Not only is it short, but it is full of enjoyable asides. Back in the day psychology research was a lot more fun to write up.

Festinger and Carlsmith were interested in testing what happened when people acted out of line with their beliefs. To do this, they made their participants spend an hour doing two excruciatingly boring tasks. The first task was filling a tray with spools, emptying it, then filling it again (and so on). The second was turning 48 small pegs a quarter-turn clockwise; and then, once that was finished, going back to the beginning and giving each peg another quarter-turn (and so on). Only after this tedium, and at the point at which the participants believed the experiment was over, did the real study get going. The experimenter said that they needed someone to fill in at the last minute and explain the tasks to the next subject. Would they mind? And also, could they make the points that "It was very enjoyable”, “I had a lot of fun”, “I enjoyed myself”, “It was very interesting”, “It was intriguing”, and “It was exciting"?

Of course the “experiment” was none of these things. But, being good people, and with some pleading where necessary, they all agreed to explain the experiment to the next participant and make these points. The next participant was, of course, a confederate of the experimenter. We're not told much about her, except that she was an undergraduate specifically hired for the role. The fact that all 71 participants in the experiment were male, and that one of them had to be excluded from the final analysis because he demanded her phone number so he could explain things further, suggests that Festinger and Carlsmith weren't above ensuring that there were some extra motivational factors in the mix.

Money trap

For their trouble, the participants were paid $1, $20, or nothing. After explaining the task the original participants answered some questions about how they really felt about the experiment. At the time, many psychologists would have predicted that the group paid the most would be affected the most – if our feelings are shaped by rewards, the people paid $20 should be the ones who said they enjoyed it the most.

In fact, people paid $20 tended to feel the same about the experiment as the people paid nothing. But something strange happened with the people paid $1. These participants were more likely to say they really did find the experiment enjoyable. They judged the experiment as more important scientifically, and had the highest desire to participate in future similar experiments. Which is weird, since nobody should really want to spend another hour doing mundane, repetitive tasks.

Festinger's Dissonance Theory explains the result. The “dissonance” is between the actions of the participants and their beliefs about themselves. Here they are, nice guys, lying to an innocent woman. Admittedly there are lots of other social forces at work – obligation, authority, even attraction. Festinger's interpretation is that these things may play a role in how the participants act, but they can't be explicitly relied upon as reasons for acting. So there is a tension between their belief that they are a nice person and the knowledge of how they acted. This is where the cash payment comes in. People paid $20 have an easy rationalisation to hand. "Sure, I lied", they can say to themselves, "but I did it for $20". The men who got paid the smaller amount, $1, can't do this. Giving the money as a reason would make them look cheap, as well as mean. Instead, the story goes, they adjust their beliefs to be in line with how they acted. "Sure, the experiment was kind of interesting, just like I told that girl", "It was fun, I wouldn't mind being in her position" and so on.

So this is cognitive dissonance at work. Normally it is a perfectly healthy process – after all, who could object to people being motivated to reduce contradictions in their beliefs? (Philosophers even make a profession out of this.) But in circumstances where some of our actions or beliefs exist for reasons which are too complex, too shameful, or too nebulous to articulate, it can lead us to change perfectly valid beliefs, such as how boring and pointless a task was.

Fans of cognitive dissonance will tell you that this is why people forced to defend a particular position – say, because it is their job – are likely to end up believing it. It also suggests a reason why military services, high school sports teams and college societies have bizarre and punishing initiation rituals. If you've been through the ritual, dissonance theory predicts, you're much more likely to believe the group is a valuable one to be a part of (the initiation hurt, and you're not a fool, so it must have been worth it, right?).

For my part, I think dissonance theory explains why some really long books have such good reputations, despite being as repetitive and pointless as Festinger's peg task. Get to the end of a three-volume, several-thousand-page conceptual novel and you're faced with a choice: either you wasted your time and money, and you feel a bit of a fool; or the novel is brilliant and you are an insightful consumer of literature. Dissonance theory pushes you towards the latter interpretation, and so swells the crowd of people praising a novel that would be panned if it were 150 pages long.

Changing your beliefs to be in line with how you acted may not be the most principled approach. But it is certainly easier than changing how you acted.
