'Honest Truth' About Why We Lie, Cheat And Steal

Chances are, you're a liar. Maybe not a big liar — but a liar nonetheless. That's the finding of Dan Ariely, a professor of psychology and behavioral economics at Duke University. He's run experiments with some 30,000 people and found that very few people lie a lot, but almost everyone lies a little.

Ariely describes these experiments and the results in a new book, The (Honest) Truth About Dishonesty: How We Lie To Everyone — Especially Ourselves. He talks with NPR's Robert Siegel about how society's troubles aren't always caused by the really bad apples; they're caused by the scores of slightly rotting apples who are cheating just a little bit.

Interview Highlights

On the traditional, cost/benefit theory of dishonesty

"The standard view is a cost/benefit view. It says that every time we see something, we ask ourselves: What do I stand to gain from this and what do I stand to lose? Imagine it's a gas station: Going by a gas station, you ask yourself: How much money is in this gas station? If I steal it, what's the chance that somebody will catch me and how much time will I have in prison? And you basically look at the cost and benefit, and if it's a good deal, you go for it."

On why the cost/benefit theory is flawed

"It's inaccurate, first of all. When we do experiments, when we try to tempt people to cheat, we don't find that these three elements — what do we stand to gain, probability of being caught and size of punishment — end up describing much of the result.

"Not only is it a bad descriptor of human behavior, it's also a bad input for policy. Think about it: When we try to curb dishonesty in the world, what do we do? We get more police force, we increase punishment in prison. If those are not the things that people consider when they think about committing a particular crime, then all of these efforts are going to be wasted."

On how small-time cheaters still perceive themselves as good people

"We want to view ourselves as honest, wonderful people and when we cheat ... as long as we cheat just a little bit, we can still view ourselves as good people, but once we start cheating too much ... we can't view ourselves as good people and therefore we stop. So this model of trying to balance the ability to view ourselves as good people on one hand and the ability to cheat on the other hand predicts that people will cheat a little bit and they will still feel good about themselves. ... That's what we see across many, many experiments."

On how only a few people cheat a lot, but a lot of people cheat a little

"Across all of our experiments, we've tested maybe 30,000 people, and we had a dozen or so bad apples and they stole about $150 from us. And we had about 18,000 little rotten apples, each of them just stole a couple of dollars, but together it was $36,000. And if you think about it, I think it's actually a good reflection of what happens in society."

On his favorite cheating experiment

"We give people a sheet of paper with 20 simple math problems and we say, 'You have 5 minutes to solve as many of those as you can, and we'll give you $1 per question.' We say, 'Go!' People start, they solve as many as they can, at the end of the five minutes, we say, 'Stop! Please count how many questions you got correctly, and now that you know how many questions you got correctly, go to the back of the room and shred this piece of paper. And once you've finished shredding this piece of paper, come to the front of the room and tell me how many questions you got correctly.'

"Well, people do this, they shred, they come back, and they say they solved on average six problems, we pay them $6, they go home. What the people in the experiment don't know is that we've played with the shredder, and so the shredder only shreds the sides of the page but the main body of the page remains intact. ... What we find is people basically solve four and report six. ... We find that lots of people cheat a little bit; very, very few people cheat a lot.

On a variation of this experiment in which participants cheated twice as much

"In one of the experiments, people did the same thing exactly, finished shredding the piece of paper, but when they came to report, they didn't say, 'Mr. Experimenter, I solved x problems, give me x dollars.' They say, 'Mr. Experimenter, I solved x problems, give me x tokens,' and we paid people with pieces of plastic in terms of money. And then they took these pieces of plastic and they walk 12 feet to the side and exchanged them for dollars. ... The only difference is when people stared somebody else in the eyes and lied, they lied for pieces of plastic and not money. And what happened? Our participants doubled their cheating."

On how our cashless economy may encourage cheaters

"The moment something is one step removed from money ... people can cheat more and [still] feel good about themselves. It basically relieves people from the moral shackles. And, the reason this worries me so much is because if you think about modern society, we are creating lots of cashless economy. We have electronic wallets, we have mortgage-backed securities, we have stock options, and could it be that all of those payment modalities that as they get more and more further from money become easier for us to cheat and be dishonest with them."

On a version of the experiment in which the test administrator takes a cellphone call while giving instructions, causing the participants to cheat even more

"I think this goes back to the law of karma, right? So if you ask yourself, how can I rationalize cheating, really the main mechanism in all of our experiments is rationalization. How can you rationalize your actions and still think of yourself as a good person? And if somebody has mistreated you, now you can probably rationalize something to a higher degree."

On the dishonesty that arises from conflicts of interest

"We need to change ... regulation, and it's basically to change conflicts of interest. ... Much like in sports, if you like a particular team and the referee calls against your team, you think the referee is evil, vicious, stupid. ... In the same way, if you have a financial stake in seeing the world in a certain way, you're going to see the world in a certain way. So the first thing I think we need to do is eradicate conflicts of interest."

On the Broken Windows theory of policing — cracking down on minor offenses in an effort to curb major offenses

"There's kind of two ways to think about the Broken Windows theory: one is about cost/benefit analysis and do people do it; the other one is about what ... society around us tells us is acceptable and not acceptable. I actually believe in the second approach for this. So when we go around the world and we ask ourselves what behavior are we willing to engage in/what behavior we're not, we look at other people for a gauge for what is acceptable. In our experiments, we've shown that if we get one person to cheat in an egregious way and other people see them, they start cheating to a higher degree. So, for me, the broken window theory is more as a social signal than fear of being caught."

Dan Ariely, who teaches psychology and behavioral economics at Duke, has spoken here often about our predictable irrationality. He's written a couple of books about how human behavior defies any easy calculus of cost and benefit. And his latest book tackles an unfortunately predictable feature of our behavior: dishonesty. It's called "The (Honest) Truth About Dishonesty." And, Dan, it's good to see you again.

DAN ARIELY: Very good to see you.

SIEGEL: Let's start with the explanation of dishonesty that you are pushing back against. What's the orthodox view of lying, cheating and stealing?

ARIELY: So the standard view is a cost-benefit view. It basically says that every time we see something, we ask ourselves: What do I stand to gain from this, and what do I stand to lose? And imagine it's a gas station. You're going by a gas station, you should ask yourself: How much money is in this gas station? If I steal it, what's the chance that somebody will catch me, and how much time will I have in prison? And you basically look at the cost and benefit, and if it's a good deal, you go for it.

SIEGEL: This is why I don't rob banks because I might get arrested and go to prison. I figure it's not worth it. What's wrong with that view?

ARIELY: It's inaccurate, first of all. When we do experiments and we try to tempt people to cheat, we don't find that these three elements of what do we stand to gain, probability of being caught and size of punishment end up describing much of the result.

SIEGEL: You say that there is a duality of motives here: that, yes, there is some attraction to cheating if, by doing so, you can make your deductions look a little bit bigger on your income tax returns, but there's also a contrary desire to perceive yourself as an honest person.

ARIELY: That's right. So we want to view ourselves as honest, wonderful people. And when we cheat - what's interesting is that as long as we cheat just a little bit, we can still view ourselves as good people, but once we start cheating too much, it stops us. We can't view ourselves as good people; and therefore, we stop. So this model of trying to balance the ability to view ourselves as good people on one hand and the ability to cheat on the other hand predicts that people will cheat a little bit, and they would still feel good about themselves while they cheat a little bit. And that's what we see across many, many experiments.

SIEGEL: Yeah. You would say that while we might expect to see a few rotten apples who cheat a lot, indeed what we really see is a lot of people, a lot of normal apples, who cheat just a little bit.

ARIELY: That's right. And if you think about this across all of our experiments, we've tested maybe 30,000 people, and we had a dozen or so bad apples, and they stole about $150 from us.

(LAUGHTER)

ARIELY: And...

SIEGEL: Yes.

ARIELY: ...we had about 18,000 little rotten apples, right, that each of them just stole a couple of dollars, but together, it was $36,000. And if you think about it, I think it's actually a good reflection of what happens in society.

SIEGEL: Well, take us beyond interesting observations about human nature to experimental psychology. I want you to describe the kind of experiment, which you feel, you know, holds methodological water and actually shows this.

ARIELY: So one of my favorite experiments, we give people a sheet of paper with 20 simple math problems, and we say you have five minutes to solve as many of those as you can, and we'll give you $1 per question. We say go. People start, they solve as many as they can. At the end of the five minutes, we say stop, please count how many questions you got correctly. And now that you know how many questions you got correctly, go to the back of the room and shred this piece of paper. And once you've finished shredding this piece of paper, come to the front of the room and tell me how many questions you got correctly.

Well, people do this, they shred, they come back, and they say they solved on average six problems, we pay them $6, they go home. What the people in the experiment don't know is that we've played with the shredder, so the shredder only shreds the sides of the page, but the main body of the page remains intact, and therefore...

SIEGEL: So you can check their self-reported...

ARIELY: That's right.

SIEGEL: ...success.

ARIELY: And what we find is people basically solve four and report six. And, again, we find that lots of people cheat by a little bit; very, very few cheat a lot. But here's the most interesting version for me. In one of the experiments, people did the same thing exactly. They finished shredding the piece of paper, but when they came to report, they didn't say, Mr. Experimenter, I solved X problems, give me X dollars. They said, Mr. Experimenter, I solved X problems, give me X tokens. And we paid people with pieces of plastic instead of money.

And then they took these pieces of plastic and they walked 12 feet to the side and exchanged them for dollars. Now, if you think about it, what happened? The only difference is that when people stared somebody else in the eyes and lied, they lied for pieces of plastic and not money. And what happened? Our participants doubled their cheating.

SIEGEL: If they're cheating to get something that they can redeem for money, it's easier than saying give me more money than I actually deserve.

ARIELY: Yes. The moment something is one step removed from money, and presumably more steps removed from money would make it even better, people can cheat more and feel good about themselves. It basically relieves people from the moral shackles. And the reason this worries me so much is because if you think about modern society, we are creating lots of cashless economy. We have electronic wallets. We have mortgage-backed securities. We have stock options. And could it be that all of those payment modalities, as they get further and further from money, become easier for us to cheat with and be dishonest about?

SIEGEL: One of my - you do many, many variations on this testing theme, and one of my favorites is when you introduce the proctor, the person who's administering the test, and he, while describing the experiment, pretends to receive, or literally does receive, a call on his cellphone and has some stupid, irrelevant conversation with his friend about whether he'll go to dinner that night. After hearing somebody be that rude to them, people are more likely to cheat.

ARIELY: That's right. And I think this goes back to the law of karma, right? So if you ask yourself, how can I rationalize cheating? Really, the main mechanism in all of our experiments is rationalization. How can you rationalize your actions and still think of yourself as a good person? And if somebody has mistreated you, now you can probably rationalize something to a higher degree. The insurance company, you know, they must have done something to me in the past. This person must have deserved something like that. And all of those rationalizations get us to cheat to a higher degree.

SIEGEL: So, Dan, here's an inconsistency that I find. In the end, as you think about what one might do to discourage dishonesty or theft, you're attracted to the Broken Windows theory of policing, which is essentially cracking down on even very minor offenses. Let it be known that breaking a window is unacceptable, because very often the kids who start breaking windows end up doing a lot worse a few years down the road. If you break that window, you're going to be arrested; you're going to be taken to court. Isn't that based on the very same cost-benefit analysis that you begin by saying doesn't describe human behavior?

ARIELY: So, first of all, I think that my main desire of what we need to change is to change regulation, and it's basically to change conflicts of interest, right? So this perspective on the world basically says that much like in sports, if you like a particular team and the referee calls against your team, you think the referee is evil, vicious, stupid, something. In the same way, if you have a financial stake in seeing the world in a certain way, you're going to see the world in a certain way. So the first thing, I think, we need to do is to eradicate conflicts of interest.

In terms of the Broken Windows theory, there are kind of two ways to think about it. One is about cost-benefit analysis, and do people do it? The other one is about what the society around us tells us is acceptable and not acceptable. And I actually believe in the second approach for this. So when we go around the world and we ask ourselves what behavior we are willing to engage in and what behavior we're not, we look at other people as a gauge for what is acceptable.

And in our experiments, we've shown that if we get one person to cheat in an egregious way and other people see them, they start cheating to a higher degree. So, for me, the Broken Windows theory is more a social signal than a fear of being caught.

SIEGEL: Well, Dan Ariely, thank you very much for talking with us once again.

ARIELY: My pleasure as always.

SIEGEL: Dan Ariely of Duke University, his new book is called "The (Honest) Truth About Dishonesty."

(SOUNDBITE OF MUSIC)

SIEGEL: You're listening to ALL THINGS CONSIDERED from NPR News. Transcript provided by NPR, Copyright NPR.