
In 2011, a conference honoring the late Dan Wegner, “Wegstock,” was held at Harvard University.

The talks are brief and are well worth watching. We are highlighting individual talks, roughly 15 minutes each, through August and September.

In this video, Todd Heatherton delivers his untitled talk in which he discusses his research (inspired and influenced in part by Dan Wegner) on “mind imaging” with regard to self-regulation and mind perception.


This excerpt, which highlights some of the remarkable work by the late Dan Wegner, comes from an article written by Eric Jaffe in a 2006 edition of the APS’s Observer:

“Freud’s Fundamental Rule of Psychoanalysis was for patients to be completely open with a therapist no matter how silly or embarrassing the thought,” says Anita Kelly, a researcher at the University of Notre Dame who published one of the first books on the formal study of secrets, The Psychology of Secrets, in 2002.

Only since the late 1980s and early 1990s have researchers like Daniel Wegner and James Pennebaker put Freud through the empirical wringer and begun to understand the science behind secrets. “The Freudian way of thinking about things was, he assumed suppression took place and looked at what was happening afterwards,” says Wegner, a psychologist at Harvard. “The insight we had was, let’s not wait until after the fact and assume it occurred, let’s get people to try to do it and see what happened. That turned out to be useful insight; it opened this up to experimental research. It became a lab science instead of an after-the-fact interpretation of peoples’ lives.”

For Wegner, an interest in secrets began with a white bear. In Russian folklore attributed to Dostoevsky, Tolstoy, or sometimes both, a man tells his younger brother to sit in the corner and not think of a white bear, only to find later that the sibling can think of nothing else. If a meaningless white bear can arouse such frustration, imagine the crippling psychological effects of trying not to think of something with actual importance when the situation requires silence — running into the wife of a friend who has a mistress, being on a jury and having to disregard a stunning fact, or hiding homosexuality in a room full of whack-happy wiseguys.

So in 1987, Wegner, who at that time was at Trinity University, published a paper in the Journal of Personality and Social Psychology discussing what happens when research subjects confront the white bear in a laboratory. In the study, subjects entered a room alone with a tape recorder and reported everything that came to mind in a five-minute span. Before the experiment, Wegner told some subjects to think of anything except a white bear, and told others to try to think of a white bear. Afterwards, the subjects switched roles. Any time a subject mentioned, or merely thought of, a white bear, he or she had to ring a bell inside the room.

It was not quite Big Ben at noon, but those who suppressed the white bear rang the bell once a minute — more often than subjects who were free to express the thought. More remarkably, Wegner found what he called the “rebound effect”: When a subject was released from suppression and told to express a hidden thought, it poured out with greater frequency than if it had been mentionable from the start. (Think fresh gossip.) He also found evidence for an insight called “negative cuing.” The idea is that a person trying to ditch a thought will look around for something to displace it — first at the ceiling fan, then a candle, then a remote control. Soon the mind forms a latent bond between the unwanted thought and the surrounding items, so that everything now reminds the person of what he is trying to forget, exacerbating the original frustration.

“People will tend to misread the return of unwanted thoughts,” Wegner said recently. “We don’t realize that in keeping it secret we’ve created an obsession in a jar.” Wegner told the story of a suicidal student who once called him for help. Desperate to keep her on the phone, but lacking any clinical training, Wegner mentioned the white bear study. Slowly the student realized she had perpetuated a potentially fleeting thought by trying to avoid it. “She got so twisted up in the fact that she couldn’t stop thinking of killing herself, that she was making it come back to mind. She was misreading this as, there’s some part of me that wants to do it. What she really wanted was to get rid of the thought.”

One method of diverting attention from an unwanted thought, says Wegner, is to focus on a single distraction from the white bear, like a red Volkswagen, an idea that he tested successfully in later experiments. The concern with this technique, which Freud first laid out, is that a person could become obsessed with an arbitrary item, planting the seeds for abnormal behavior. In a later experiment, published in 1994 in the same journal, Wegner found more evidence that secrets lead to strange obsession. He placed four subjects who had never met around a table, split them into two male-female teams, and told them to play a card game. One team was instructed to play footsie without letting the other team know. At the end of the experiment, the secret footsie-players felt such a heightened attraction toward one another that the experimenters made them leave through separate doors, for ethical reasons. “We can end up being in a relationship we don’t want, or interested in things that aren’t at all important, because we had to keep them quiet,” Wegner said, “and it ends up growing.”

Live Free or Die

The logical opposite of an unhealthy obsession based on secrets is a healthy result from disclosing such secrets. This healing aspect of revelation is where Wegner’s work connects with James Pennebaker’s. In the late 1970s, Pennebaker was part of a research team that found, via survey, that people who had a traumatic sexual experience before age 17 were more likely to have health problems as they got older. Pennebaker looked further and found that the majority of these people had kept the trauma hidden, and in 1984 he began the first of many studies on the effects of revealing previously undisclosed secrets.

In most of Pennebaker’s experiments, subjects visited a lab for three or four consecutive days, each time writing about traumatic experiences for 15 or 20 minutes. In the first five years, hundreds of people poured their secrets onto the page. A college girl who knew her father was seeing his secretary; a concentration camp survivor who had seen babies tossed from a second-floor orphanage window; a Vietnam veteran who once shot a female fighter in the leg, had sex with her, then cut her throat. By the end of the experiment, many participants felt such intense release that their handwriting became freer and loopier. In one study of 50 students, those who revealed both a secret and their feelings visited the health center significantly fewer times in the ensuing six months than other students who had written about a generic topic, or those who had only revealed the secret and not the emotions surrounding it.

The work led to many papers showing evidence that divulging a secret, which can mean anything from telling someone to writing it on a piece of paper that is later burned, is correlated with tangible health improvements, both physical and mental. People hiding traumatic secrets showed a higher incidence of hypertension, influenza, even cancer, while those who wrote about their secrets showed, through blood tests, enhanced immune systems. In some cases, T-cell counts in AIDS patients increased. In another test, Pennebaker showed that writing about trauma actually unclogs the brain. Using an electroencephalogram, an instrument that measures brain waves through electrodes attached to the scalp, he found that the right and left brains communicated more frequently in subjects who disclosed traumas.

(It should be noted that the type of secrets discussed in this article are personal secrets—experiences a person chooses not to discuss with others. They can be positive, in the case of hiding a birthday cake, or negative, in the case of hiding a mistress. Secrets that could be considered “non-personal,” for example, information concealed as part of a job, were not specifically addressed.)

Exactly why revelation creates such health benefits is a complicated question. “Most people in psychology have been trained to think of a single, parsimonious explanation for an event,” said Pennebaker, who did much of his research at Southern Methodist University before coming to the University of Texas, where he is chair of the psychology department. “Well, welcome to the real world. There are multiple levels of explanation here.” Pennebaker lists a number of reasons for the health improvements. Writing about a secret helps label and organize it, which in turn helps understand features of the secret that had been ignored. Revelation can become habitual in a positive sense, making confrontation normal. Disclosure can reduce rumination and worry, freeing up the mental quagmires that hindered social relationships. People become better listeners. They even become better sleepers. “The fact is that all of us occasionally are dealing with experiences that are hard to talk about,” Pennebaker said. “Getting up and putting experiences into words has a powerful effect.”

At the end of a recent Sopranos episode, Vito looks most content after seeing a New Hampshire license plate, with its state motto: “Live free or die.” Pennebaker’s research may add a new level of truth to that phrase.

Little Machiavellis

In the early 1990s, it was not unusual for 3-year-old Jeremy Peskin to want a cookie. His mother, Joan, used to hide them in the high cupboards of their home in Toronto; when she left, Jeremy would climb up and sneak a few. One day, Jeremy had a problem: He wanted a cookie, but his mother was in the kitchen. “He said to me, ‘Go out of the kitchen, because I want to take a cookie,’ ” Joan recalled recently. Unfortunately for Jeremy, Joan Peskin was a doctoral student in psychology at the time, and smart enough to see through the ruse. Fortunately for developmental researchers, Peskin’s experience led her to study when children first develop the capacity for secrets.

What interested Peskin, now a professor at the University of Toronto, was Jeremy’s inability to separate his mother’s physical presence from her mental state. If she was out of the room, he would be able to take a cookie, whether or not his mother knew that he intended to take a cookie. Peskin took this insight to the laboratory — in this case, local day-care centers — where she tried to get children ages three, four, and five to conceal a secret. She showed the children two types of stickers. The first, a gaudy, glittery sticker, aroused many a tiny smile; the second, a drab, beige sticker of an angel, was disliked. Then she introduced a mean puppet and explained that this puppet would take whatever sticker the children wanted most. When the puppet asked 4- and 5-year-olds which sticker they wanted, most of the children either lied or would not tell. The 3-year-olds almost always blurted out their preference, even when the scenario was repeated several times, she found in the study, which was published in Developmental Psychology in 1992. Often the 3-year-olds grabbed at the shiny sticker as the puppet took it away, showing a proper understanding of the situation but an inability to prevent it via secretive means.

The finding goes beyond secrets; 4 has become the age when psychologists think children develop the ability to understand distinct but related inner and outer worlds. “When I teach it I put a kid on the overhead with a thought bubble inside,” Peskin said. “When they could think of someone else’s mental state — say, ignorance, somebody not knowing something — that influences their social world.” In a follow-up study published in Social Development in 2003, Peskin found again that 3-year-olds were more likely than 4- or 5-year-olds to reveal the location of a surprise birthday cake to a hungry research confederate. “When a child is able to keep a secret,” Peskin says, “parents should take it as, that’s great, this is normal development. They aren’t going to be little Machiavellis. This is normal brain development.”

Confidence in Confidants

Soon after Mark Felt revealed himself as Deep Throat, the anonymous source who guided Bob Woodward during the Watergate scandal, Anita Kelly’s phone began to ring. “One morning I had 10 messages from different news groups,” she recalled recently. “They wanted me to say that secrecy’s a bad thing, and I’d say, look, there’s no evidence. This guy’s in his early 90s, and has seemed to have a healthy life.”

When preparing The Psychology of Secrets, Kelly re-examined the consequences and benefits of secret-keeping, and began to believe that while divulging secrets improves health, concealing them does not necessarily cause physical problems. “I couldn’t find any evidence that keeping a secret makes a person sick,” Kelly said. “There is evidence that by writing about held-back information someone will get health benefits. Someone keeping a secret would miss out on those benefits. It’s not the same as saying if you keep a secret you’re going to get sick.”

Her latest work, in press at the Journal of Personality, challenges the notion that secret-keeping can cause sickness. Instead of merely looking at instances of sickness nine weeks after disclosure, Kelly and co-author Jonathan Yip adjusted their measurements for initial levels of health. They found, quite simply, that secretive people also tend to be sick people, both now and two months down the line.

“It doesn’t look like the process of keeping the secret made them sick,” she said. High “self-concealers,” as Kelly calls them, tend to be more depressed, anxious, and shy, and have more aches and pains by nature, perhaps suggesting some natural link between being secretive and being vulnerable to illness. “I don’t think it’s much of a stretch to say that being secretive could be linked to being symptomatic at a biological level.”

This conclusion came gradually. In the mid-1990s, following Pennebaker’s line of research that had really opened up the field, Kelly focused on the health effects of revealing and concealing secrets. The research clearly showed links between secrets and illness. In a review of the field for Current Directions in Psychological Science in 1999, Kelly notes some of these health correlations: cases in which breast cancer patients who talked about their concealed emotions survived almost twice as long as those who did not; students who wrote about private traumatic events showed higher antibody levels four and six months after a Hepatitis B vaccination; and gay men who concealed their sexuality had a higher rate of cancer and infectious disease.

But in 1998 she did a study asking patients about their relationships with their therapists. She found that 40 percent of them were keeping a secret, but generally felt no stress as a result. Kelly began to believe that some secrets can be kept successfully, and that, in some scenarios, disclosing a secret could cause more problems than it solves. Psychologists, she felt, were not paying enough attention to the situations in which disclosure should occur — only that it did. “The essence of the problem with revealing personal information is that revealers may come to see themselves in undesirable ways if others know their stigmatizing secrets,” she wrote in the 1999 paper.

John Caughlin, a professor of communication at the University of Illinois at Urbana-Champaign who has studied secrets, agrees that sometimes openness is not the best policy. “People are so accustomed to saying an open relationship is a good one, that if they have secrets it can make them feel that something’s wrong,” he said recently. In 2005, Caughlin published a paper in Personal Relationships suggesting that people have a poor ability to forecast how they will feel after revealing a secret, and how another person will respond to hearing it. “I’m not touting that people should keep a lot of secrets,” he said, “but I don’t think people should assume it’s bad, and I think they do.” In her new book, Anatomy of a Secret Life, published in April, Gail Saltz, a professor of psychiatry at Cornell Medical School, referred to secrets as “benign” or “malignant,” depending on the scenario. “In teenagers, having secret identities is normal, healthy separation from parents and needs to go on,” said Saltz recently.

To address this concern, Kelly has focused her recent work on the role of confidants in the process of disclosure. She created a simple diagram advising self-concealers when they should, and when they should not, reveal a secret. On one hand, if the secret does not cause mental or physical stress, it should be kept, to provide a sense of personal boundary and avoid unnecessary social conflict. If it does cause anguish, the secret-keeper must then evaluate whether he or she has a worthy confidant, someone willing to work toward a cathartic insight. When such a confidant is not available, the person should write down his or her thoughts and feelings. “The world changes when you tell someone who knows all your friends,” said Kelly, who experienced this change firsthand 15 years back, when she shared with a colleague something “very personal and embarrassing,” as she called it, and then found her secret floating among her colleagues. “You have to think, what are the implications with my reputation,” she said. “It’s more complicated once you have to reveal to someone.”


Daniel M. Wegner, a pioneering social psychologist who helped to reveal the mysteries of human experience through his work on thought suppression, conscious will, and mind perception, died July 5 as a result of amyotrophic lateral sclerosis (ALS). He was 65.

The John Lindsley Professor of Psychology in Memory of William James, Wegner redefined social psychology as the science of human experience. He was arguably most famous for his experiments on thought suppression, in which people were unable to keep from thinking of a white bear.

Wegner also broke ground in other areas of social psychology, including transactive memory (how memories are distributed across groups and relationship partners) and action identification (what people think they are doing). He had also explored the experience of conscious will, and most recently focused on mind perception (how people perceive human and nonhuman minds).

“Dan was, I believe, the most original thinker in modern psychology,” said Dan Gilbert, the Edgar Pierce Professor of Psychology, who knew Wegner for three decades. “Most of us work on problems that are important in our field, and we use theories others have invented to make progress. Dan didn’t make progress — Dan made new highways, new roads. He opened doors in walls that we didn’t know had doors in them, and he did this over and over.”

Gilbert said he was privileged to call Wegner one of his closest friends. The two met while they both worked in Texas — Gilbert at the University of Texas and Wegner at Trinity University.

“Being among the few social psychologists in Texas, we were introduced by a mutual friend, and it was love at first sight,” Gilbert said. “We’ve been true friends ever since.” He added, “I’m heartbroken to lose my friend of 30 years, but I guess the only thing worse would have been not to have a friend of 30 years.”

While Wegner was known for his pioneering work on the mind, Gilbert said his intellectual curiosity seemed never to rest.

“The thing about Dan is he didn’t take the lab coat off,” Gilbert said. “For him, being a psychologist wasn’t a job, it was a way of being. He simply spent all his waking time thinking about the interesting aspects of the mind. It was 24/7 for him.”

That intellectual heft, however, never masked Wegner’s humor.

“Dan Wegner was the funniest human being I’ve ever known, and everybody else was a distant second,” Gilbert said. “To say someone was funny may sound frivolous, but I would make the claim that Dan understood something important, which is that humor is the place where intelligence and joy meet. Dan understood that … humor is where a brilliant mind tickles itself.”

That sense of humor, Gilbert said, often showed up in Wegner’s writing, and helped transform the way social psychology is described in many journals today. “If you open a psychology journal now,” he said, “many, many people write in a Wegner-esque style.”

Even in his final days, Gilbert said, Wegner’s restless mind faced the challenge of his death with an inspirational degree of curiosity.

“It was a privilege to sit by his side as he took this journey to the end,” Gilbert said. “About a month ago, I asked him, ‘If you had to think of one word to describe this experience, what would it be?’ He looked at me, and he said ‘fascinating.’ He was a student of the human experience, and he was having an experience unlike most of us ever have. And rather than bemoaning it or crying about it, he took it as another fascinating thing to study and learn about and think about.”

Born in Calgary, Alberta, Canada, Wegner studied as an undergraduate and graduate student at Michigan State University, earning his Ph.D. in 1974. He was appointed an assistant professor at Trinity University in San Antonio, where he rose to full professor and chair of the psychology department.

Wegner joined the faculty in the psychology department at the University of Virginia in 1990, where he was the William R. Kenan Jr. Professor of Psychology before joining the Harvard faculty in 2000.

Wegner was the author of four academic books, an introductory psychology textbook, and nearly 150 journal articles and book chapters.

Wegner’s research was funded by the National Science Foundation and the National Institute of Mental Health. In 1996-1997 he was a fellow at the Center for Advanced Study in the Behavioral Sciences, and in 2011 was inducted as a fellow of the American Academy of Arts and Sciences. He received many of the top honors in his field, including the William James Fellow Award from the Association for Psychological Science, the Distinguished Scientific Contribution Award from the American Psychological Association, the Distinguished Scientist Award from the Society of Experimental Social Psychology, and the Donald T. Campbell Award from the Society for Personality and Social Psychology.

Wegner is survived by his wife of 29 years, Toni Giuliano Wegner of Winchester, and his daughters, Kelsey Wegner Hurlburt of Dunkirk, Md., and Haley Wegner of Winchester. At Wegner’s request, his body was donated to the Massachusetts General Hospital’s Neurological Clinical Research Institute for ALS Research.

A memorial service will be held at 4 p.m. on Saturday at the Winchester Unitarian Society, 478 Main St., Winchester, Mass. Wegner requested that his service be a celebration of life, and so would welcome Hawaiian shirts.


When it comes to economics versus psychology, score one for psychology.

Economists argue that markets usually reflect rational behavior—that is, the dominant players in a market, such as the hedge-fund managers who make billions of dollars’ worth of trades, almost always make well-informed and objective decisions. Psychologists, on the other hand, say that markets are not immune from human irrationality, whether that irrationality is due to optimism, fear, greed, or other forces.

Now, a new analysis published the week of July 1 in the online issue of the Proceedings of the National Academy of Sciences (PNAS) supports the latter case, showing that markets are indeed susceptible to psychological phenomena. “There’s this tug-of-war between economics and psychology, and in this round, psychology wins,” says Colin Camerer, the Robert Kirby Professor of Behavioral Economics at the California Institute of Technology (Caltech) and the corresponding author of the paper.

Indeed, it is difficult to claim that markets are immune to apparent irrationality in human behavior. “The recent financial crisis really has shaken a lot of people’s faith,” Camerer says. Despite the faith of many that markets would organize allocations of capital in ways that are efficient, he notes, the government still had to bail out banks, and millions of people lost their homes.

In their analysis, the researchers studied an effect called partition dependence, in which breaking down—or partitioning—the possible outcomes of an event in great detail makes people think that those outcomes are more likely to happen. The reason, psychologists say, is that providing specific scenarios makes them more explicit in people’s minds. “Whatever we’re thinking about seems more likely,” Camerer explains.

For example, if you are asked to predict the next presidential election, you may say that a Democrat has a 50/50 chance of winning and a Republican has a 50/50 chance of winning. But if you are asked about the odds that a particular candidate from each party might win—for example, Hillary Clinton versus Chris Christie—you are likely to envision one of them in the White House, causing you to overestimate his or her odds.

The researchers looked for this bias in a variety of prediction markets, in which people bet on future events. In these markets, participants buy and sell claims on specific outcomes, and the prices of those claims—as set by the market—reflect people’s beliefs about how likely it is that each of those outcomes will happen. Say, for example, that the price for a claim that the Miami Heat will win 16 games during the NBA playoffs is $6.50 for a $10 return. That means that, in the collective judgment of the traders, Miami has a 65 percent chance of winning 16 games.
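The price-to-probability reading described above is simple arithmetic: the implied probability of an outcome is the claim’s price divided by its payout. A minimal sketch, using the article’s $6.50-for-$10 example:

```python
def implied_probability(price: float, payout: float) -> float:
    """Implied probability of an outcome in a prediction market:
    the price traders pay for a claim, divided by its payout if true."""
    return price / payout

# The article's example: a $6.50 claim paying $10 if Miami wins 16 playoff games.
p = implied_probability(6.50, 10.0)
print(f"{p:.0%}")  # 65%
```

The function names here are illustrative; the point is only that market prices encode the traders’ collective probability estimate.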

The researchers created two prediction markets via laboratory experiments and studied two others in the real world. In one lab experiment, which took place in 2006, volunteers traded claims on how many games an NBA team would win during the 2006 playoffs and how many goals a team would score in the 2006 World Cup. The volunteers traded claims on 16 teams each for the NBA playoffs and the World Cup.

In the basketball case, one group of volunteers was asked to bet on whether the Miami Heat would win 4–7 playoff games, 8–11 games, or some other range. Another group was given a range of 4–11 games, which combined the two intervals offered to the first group. Then, the volunteers traded claims on each of the intervals within their respective groups. As with all prediction markets, the price of a traded claim reflected the traders’ estimations of whether the total number of games won by the Heat would fall within a particular range.

Economic theory says that the first group’s perceived probability of the Heat winning 4–7 games and its perceived probability of winning 8–11 games should add up to a total close to the second group’s perceived probability of the team winning 4–11 games. But when the researchers added the numbers up, they found instead that the first group’s combined estimate for the team winning 4–7 or 8–11 games was higher than the second group’s estimate for the team winning 4–11 games. This suggests that framing the possible outcomes as more specific intervals caused people to think that those outcomes were more likely.
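The additivity check described above can be sketched in a few lines. The prices below are made up for illustration, not the study’s data; they simply show the kind of gap that signals partition dependence:

```python
PAYOUT = 10.0  # each claim pays $10 if its interval comes true

# Hypothetical traded prices for the two groups' claims.
fine_prices = {"4-7 games": 3.00, "8-11 games": 2.50}  # group 1: fine partition
coarse_prices = {"4-11 games": 4.50}                   # group 2: coarse partition

# Implied probabilities: price divided by payout.
p_fine = sum(price / PAYOUT for price in fine_prices.values())
p_coarse = coarse_prices["4-11 games"] / PAYOUT

# Under additive (rational) pricing the gap should be near zero;
# a positive gap is the signature of partition dependence.
gap = p_fine - p_coarse
print(p_fine, p_coarse, round(gap, 2))
```

With these illustrative numbers, the fine partition implies a combined probability of 0.55 against 0.45 for the coarse one, a gap of 0.10 in the direction the study reports.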

The researchers observed similar results in a second, similar lab experiment, and in two studies of natural markets—one involving a series of 153 prediction markets run by Deutsche Bank and Goldman Sachs, and another involving long-shot horses in horse races.

People tend to bet more money on a long-shot horse, because of its higher potential payoff, and they also tend to overestimate the chance that such a horse will win. Statistically, however, a horse’s chance of winning a particular race is the same regardless of how many other horses it’s racing against—a horse who habitually wins just five percent of the time will continue to do so whether it is racing against fields of 5 or of 11. But when the researchers looked at horse-race data from 1992 through 2001—a total of 6.3 million starts—they found that bettors were subject to the partition bias, believing that long-shot horses had higher odds of winning when they were racing against fewer horses.

While partition dependence has been looked at in the past in specific lab experiments, it hadn’t been studied in prediction markets, Camerer says. What makes this particular analysis powerful is that the researchers observed evidence for this phenomenon in a wide range of studies—short, well-controlled laboratory experiments; markets involving intelligent, well-informed traders at major financial institutions; and nine years of horse-racing data.

The title of the PNAS paper is “How psychological framing affects economic market prices in the lab and field.” In addition to Camerer, the other authors are Ulrich Sonnemann and Thomas Langer at the University of Münster, Germany, and Craig Fox at UCLA. Their research was supported by the German Research Foundation, the National Science Foundation, the Gordon and Betty Moore Foundation, and the Human Frontier Science Program.

With the U.S. celebrating Independence Day — carnivals, fireworks, BBQs, parades, and other customs that have, at best, only a tangential connection to our “independence” — we thought it an opportune moment to return to its source in search of some situationism. No doubt, the Declaration of Independence is typically thought of as containing a dispositionist message (though few would express it in those terms) — all that language about individuals freely pursuing their own happiness. Great stuff, but arguably built on a dubious model of the human animal.

That’s not the debate we want to provoke here. Instead, we are interested in simply highlighting some less familiar language in that same document that reveals something special about the mindset and celebrated courage of those behind the colonists’ revolt. Specifically, as Thomas Jefferson penned, “all experience hath shewn that mankind are more disposed to suffer, while evils are sufferable than to right themselves by abolishing the forms to which they are accustomed.”

Part of what made the July 4th heroes heroic, in our view, was their willingness to break from that disposition to suffer evils. They reacted, mobilized, strategized, resisted, and fought because they recognized that their suffering was not legitimate — a conclusion that many in the U.S. and abroad vehemently rejected.

Situationist contributor John Jost has researched and written extensively about a related topic — the widespread tendency to justify existing systems of power despite any unfair suffering that they may entail. As he and his co-authors recently summarized:

Whether because of discrimination on the basis of race, ethnicity, religion, social class, gender, or sexual orientation or because of policies and programs that privilege some at the expense of others, or even because of historical accidents, genetic disparities, or the fickleness of fate, certain social systems serve the interests of some stakeholders better than others. Yet historical and social scientific evidence shows that most of the time the majority of people – regardless of their own social class or position – accept and even defend the legitimacy of their social and economic systems and manage to maintain a “belief in a just world.”

If we truly want to emulate and celebrate the “founding fathers” of this republic, perhaps we should begin by taking seriously the possibility that what “is” is not always what “ought to be.”

More than half of teenagers say they have cheated on a test during the last year — and 34 percent have done it more than twice — according to a survey of 40,000 U.S. high school students released in February by the nonprofit Josephson Institute of Ethics. The survey also found that one in three students admitted they used the Internet to plagiarize an assignment.

The statistics don’t get any better once students reach college. In surveys of 14,000 undergraduates conducted over the past four years by Donald McCabe, PhD, a business professor at Rutgers University and co-founder of Clemson University’s International Center for Academic Integrity, about two-thirds of students admit to cheating on tests, homework and assignments. And in a 2009 study in Ethics & Behavior (Vol. 19, No. 1), researchers found that nearly 82 percent of a sample of college alumni admitted to engaging in some form of cheating as undergraduates.

Some research even suggests that academic cheating may be associated with dishonesty later in life. In a 2007 survey of 154 college students, Southern Illinois University researchers found that students who plagiarized in college reported that they viewed themselves as more likely to break rules in the workplace, cheat on spouses and engage in illegal activities (Ethics & Behavior, Vol. 17, No. 3). A 2009 survey, also by the Josephson Institute of Ethics, reports a further correlation: People who cheat on exams in high school are three times more likely to lie to a customer or inflate an insurance claim compared with those who never cheated. High school cheaters are also twice as likely to lie to or deceive their boss and one-and-a-half times more likely to lie to a significant other or cheat on their taxes.

Academic cheating, therefore, is not just an academic problem, and curbing this behavior is something that academic institutions are beginning to tackle head-on, says Stephen F. Davis, PhD, emeritus professor of psychology at Emporia State University and co-author of “Cheating in School: What We Know and What We Can Do” (Wiley-Blackwell, 2009). New research by psychologists seems to suggest that the best way to prevent cheating is to create a campus-wide culture of academic integrity.

“Everyone at the institution — from the president of the university and the board of directors right on down to every janitor and cafeteria worker — has to buy into the fact that the school is an academically honest institution and that cheating is a reprehensible behavior,” Davis says.

Why students cheat

Mounting pressure on students to succeed academically, whether to get into good colleges and graduate schools or eventually to land good jobs, is one of the biggest drivers of cheating. Several studies show that students who are more motivated by performance than their peers are more likely to cheat.

“What we show is that as intrinsic motivation for a course drops, and/or as extrinsic motivation rises, cheating goes up,” says Middlebury College psychology professor Augustus Jordan, PhD, who led a 2005 study on motivation to cheat (Ethics & Behavior, Vol. 15, No. 2). “The less a topic matters to a person, or the more they are participating in it for instrumental reasons, the higher the risk for cheating.”

Psychological research has also shown that dishonest behaviors such as cheating actually alter a person’s sense of right and wrong, so after cheating once, some students stop viewing the behavior as immoral. In a study published in March in Personality and Social Psychology Bulletin (Vol. 37, No. 3), for example, Harvard University psychology and organizational behavior graduate student Lisa Shu and colleagues conducted a series of experiments, one of which involved having undergraduates read an honor code reminding them that cheating is wrong and then giving them a series of math problems and an envelope of cash: the more problems they answered correctly, the more cash they were allowed to take. In one condition, participants reported their own scores, which gave them an opportunity to cheat by misreporting. In the other, participants’ scores were tallied by a proctor in the room. As might be expected, several students in the first condition inflated their scores to receive more money. These students also reported greater acceptance of cheating after participating in the study than they had beforehand. Shu and her colleagues also found that, while those who read the honor code were less likely to cheat, the honor code did not eliminate cheating entirely.

“Our findings confirm that the situation can, in fact, impact behavior and that people’s beliefs flex to align with their behavior,” Shu says.

Another important finding is that while many students understand that cheating is against the rules, most still look to their peers for cues as to what behaviors and attitudes are acceptable, says cognitive psychologist David Rettinger, PhD, of the University of Mary Washington. Perhaps not surprisingly, he says, several studies suggest that seeing others cheat increases one’s tendency to cheat.

“Cheating is contagious,” says Rettinger. In his 2009 study with 158 undergraduates, published in Research in Higher Education (Vol. 50, No. 3), he found that direct knowledge of others’ cheating was the biggest predictor of cheating.

Even students at several U.S. military academies — where student honor codes are widely publicized and strictly enforced — aren’t immune from cheating’s contagion. A longitudinal study led by University of California, Davis, economist Scott Carrell, PhD, examined survey data gathered from the U.S. Military Academy at West Point, U.S. Naval Academy and U.S. Air Force Academy from 1959 through 2002. Carrell found that, thanks to peer effects, one new college cheater is “created” through social contagion for every two to three additional high school cheaters admitted to a service academy.

“This behavior is most likely transmitted through the knowledge that other students are cheating,” says Carrell, who conducted the study with James West, PhD, and Frederick Malmstrom, PhD, both of the Air Force Academy. “This knowledge causes students — particularly those who would not have otherwise — to cheat because they feel like they need to stay competitive and because it creates a social norm of cheating.”

Dishonesty prevention

Peer effects, however, cut both ways, and getting students involved in creating a culture of academic honesty can be a great way to curb cheating.

“The key is to create this community feeling of disgust at the cheating behavior,” says Rettinger. “And the best way to do that is at the student level.”

* * *

Teachers can also help diminish students’ impulse to cheat by explaining the purpose and relevance of every academic lesson and course assignment, says University of Connecticut educational psychologist Jason Stephens, PhD. According to research presented in 2003 by Stephens and later published in the “The Psychology of Academic Cheating” (Elsevier, 2006), high school students cheat more when they see the teacher as less fair and caring and when their motivation in the course is more focused on grades and less on learning and understanding. In addition, in a 1998 study of cheating with 285 middle school students, Ohio State University educational psychologist Eric Anderman, PhD, co-editor with Tamara Murdock, PhD, of “The Psychology of Academic Cheating,” found that how teachers present the goals of learning in class is key to reducing cheating. Anderman showed that students who reported the most cheating perceive their classrooms as being more focused on extrinsic goals, such as getting good grades, than on mastery goals associated with learning for its own sake and continuing improvement (Journal of Educational Psychology, Vol. 90, No. 1).

“When students feel like assignments are arbitrary, it’s really easy for them to talk themselves into not doing it by cheating,” Rettinger says. “You want to make it hard for them to neutralize by saying, ‘This is what you’ll learn and how it’s useful to you.’”

At the college level in particular, it’s also important for institutional leaders to make fairness a priority by having an office of academic integrity to communicate to students and faculty that the university takes the issue of academic dishonesty seriously, says Tricia Bertram Gallant, PhD, academic integrity coordinator at the University of California, San Diego, and co-author with Davis of “Cheating in School.” . . .

* * *

There’s also evidence that focusing on honesty, trust, fairness, respect and responsibility and promoting practices such as effective honor codes can make a significant difference in student behaviors, attitudes and beliefs, according to a 1999 study by the Center for Academic Integrity. However, honor codes seem to be most effective when they actively engage students. In Shu’s study on the morality of cheating, for example, she found that participants who passively read a generic honor code before taking a test were less likely to cheat on the math problems, though this step did not completely curb cheating. Among those who signed their names attesting that they’d read and understood the honor code, however, no cheating occurred.

“It was impressive to us how exposing participants to an honor code and really making morality salient in that situation basically eliminated cheating altogether,” she says.

Behavioral economist Dan Ariely studies the bugs in our moral code: the hidden reasons we think it’s OK to cheat or steal (sometimes). Clever studies help make his point that we’re predictably irrational — and can be influenced in ways we can’t grasp.

“A student told me a story about a locksmith he met when he locked himself out of the house. This student was amazed at how easily the locksmith picked his lock, but the locksmith explained that locks were really there to keep honest people from stealing. His view was that 1% of people would never steal, another 1% would always try to steal, and the rest of us are honest as long as we’re not easily tempted. Locks remove temptation for most people. And that’s good, because in our research over many years, we’ve found that everybody has the capacity to be dishonest and almost everybody is at some point or another.”

2. We’ll happily cheat … until it hurts.

“The Simple Model of Rational Crime suggests that the greater the reward, the greater the likelihood that people will cheat. But we’ve found that for most of us, the biggest driver of dishonesty is the ability to rationalize our actions so that we don’t lose the sense of ourselves as good people. In one of our matrix experiments [a puzzle-solving exercise Ariely uses in his work to measure dishonesty], the level of cheating didn’t change as the reward for cheating rose. In fact, the highest payout resulted in a little less cheating, probably because the amount of money got to be big enough that people couldn’t rationalize their cheating as harmless. Most people are able to cheat a little because they can maintain the sense of themselves as basically honest people. They won’t commit major fraud on their tax returns or insurance claims or expense reports, but they’ll cut corners or exaggerate here or there because they don’t feel that bad about it.”

3. It’s no wonder people steal from work.

“In one matrix experiment, we added a condition where some participants were paid in tokens, which they knew they could quickly exchange for real money. But just having that one step of separation resulted in a significant increase in cheating. Another time, we surveyed golfers and asked which act of moving a ball illegally would make other golfers most uncomfortable: using a club, their foot or their hand. More than twice as many said it would be less of a problem — for other golfers, of course — to use their club than to pick the ball up. Our willingness to cheat increases as we gain psychological distance from the action. So as we gain distance from money, it becomes easier to see ourselves as doing something other than stealing. That’s why many of us have no problem taking pencils or a stapler home from work when we’d never take the equivalent amount of money from petty cash. . . .”

4. Beware the altruistic crook.

“People are able to cheat more when they cheat for other people. In some experiments, people cheated the most when they didn’t benefit at all. This makes sense if our ability to be dishonest is increased by the ability to rationalize our behavior. If you’re cheating for the benefit of another entity, your ability to rationalize is enhanced. So yes, it’s easier for an accountant to see fudging on clients’ tax returns as something other than dishonesty. And it’s a concern within companies, since people’s altruistic tendencies allow them to cheat more when it benefits team members.”

5. One (dishonest) thing leads to another.

“Small dishonesties matter because they can lead to larger ones. Once you behave badly, at some point, you stop thinking of yourself as a good person at that level and you say, What the hell. This is something many people are familiar with in dieting. We’re disciplined until we lapse, and if we can’t think of ourselves as good people, then we figure we might as well enjoy it. And it happens with honesty as well. Cheaters too can start with one step. We conducted an experiment where participants were given designer sunglasses to wear and evaluate. Some were told their pair was authentic, others were told they were wearing fakes and others were given no information. Then, after they had been wearing their glasses for a while, we gave them matrices to solve. In all three groups, a significant portion of the participants reported solving a few more matrices than they actually had. Moderate cheating, as usual. But while 30% of the group wearing real designer sunglasses cheated, and slightly more, around 40%, of the people in the no-information group cheated, more than 70% of the group wearing the fakes exaggerated the number of matrices they solved. One moral violation leads to further immorality.”

6. Better to encourage honesty than discourage cheating.

“Most attempts to limit cheating come from a cost-benefit understanding of the problem. We think if we make the punishments harsh enough, people will cheat less. But there is no evidence that this approach works. Think of the death penalty. There is no evidence that it decreases crime. A better approach would be to ask, How can we help people stay honest? When we had an insurance company move the signature on a mileage reporting form from the bottom of the document to the top — so people were attesting that the information they were reporting was true before they filled out the form, rather than after — the amount of cheating went down by about 15%.”

7. Honesty is a state of mind.

“In one of our experiments, we split participants into two groups. We asked one group to try to recall the 10 Commandments, and the other to recall 10 books they had read in high school. Then we had everyone do some matrices. What we found was that the people in the group who recalled books engaged in the same level of cheating as most people. But the participants in the group that tried to remember the 10 Commandments didn’t cheat at all. Small reminders of ethical standards can be very powerful.”