Who: Situationist Contributor and Drexel Law School Professor Adam Benforado and University of Pennsylvania Psychology Professor Geoff Goodwin will discuss historical and empirical research regarding retributive punishment imposed upon animals. They will then use this evidence to draw inferences about human intuitions regarding punishment.

On September 21st, Fiery Cushman, a newly-minted PhD recipient and post-doctoral fellow at Harvard’s Mind, Brain and Behavior Initiative, presented some of his recent research at an event titled “Outcome vs. Intent: Which Do We Punish, and Why?” Cushman’s work suggests that at a gut-level, people assess whether a behavior was morally right or wrong by looking at the actor’s intentions, but when assigning punishment, people are overwhelmingly interested in outcomes, even if an outcome was accidental.

Cushman described several experiments in which he was able to examine a participant’s intentions in isolation from the actual outcome of the participant’s actions. In one case, participants were given a choice of dice that would later be rolled to assign rewards to a second, receiving party. Given the opportunity, recipients consistently punished more often when the dice produced less favorable rewards, even if the choosing participant had intended to reward generously. This work has interesting implications for tort law, explaining in part why findings of negligence lead to large compensatory awards even in the absence of any intentional action.

Owen Jones and Robert Kurzban recently posted their paper, “Intuitions of Punishment” (forthcoming in the University of Chicago Law Review) on SSRN. Here’s the abstract.

* * *

Recent work reveals, contrary to widespread assumptions, remarkably high levels of agreement about how to rank order, by blameworthiness, wrongs that involve physical harms, takings of property, or deception in exchanges. In “The Origins of Shared Intuitions of Justice” we proposed a new explanation for these unexpectedly high levels of agreement.

Elsewhere in this issue, Professors Braman, Kahan, and Hoffman offer a critique of our views, to which we reply here. Our reply clarifies a number of important issues, such as the interconnected roles that culture, variation, and evolutionary processes play in generating intuitions of punishment.

Humans are incredibly cooperative, but why do people cooperate and how is cooperation maintained? A new research study by UCLA anthropology professor Robert Boyd and his colleagues from the Santa Fe Institute in New Mexico suggests cooperation in large groups is maintained by punishment.

The finding challenges previous cooperation/punishment models that argue punishment is uncoordinated and unconditional.

To understand the study, let’s start with a small group of friends. In small groups, individuals often have personal connections with other group members and cooperation typically is maintained by a “you help me, I’ll help you” reciprocity system. Group members cooperate because they do not want to hurt their friends by not participating in group efforts, and also because they may want help in the future.

But in a larger group, like a tribe, those mechanisms for maintaining cooperation are lost. All group members experience the benefits of the large group, even those members who stop cooperating and become “free-riders.” Free-riders are people who benefit from the group in food sharing and protection from enemies, for example, without contributing to food collection or war. In these cases, the personal connection to the group’s members is often gone.

But it turns out that most members of large groups cooperate. Why? Boyd and his colleagues suggest cooperation is maintained by punishment, which reduces the benefits to free riding. There are tribes, for example, that punish free-riders who do not participate in warfare by not allowing them to take a bride. Thus, there is the threat of losing societal benefits if a member does not cooperate, which leads to increased group cooperation.

Previous models of cooperation assumed that punishment of free-riders was uncoordinated and unconditional. One problem with these models was that the costs associated with punishment were often higher than the gains of cooperation. Thus, the cost of one group member’s punishing a free-rider would be substantial and would not be outweighed by the gains achieved through increased cooperation.

Costs may be defined as loss of friendship or loss of relational closeness with other members of the group.

To address the problem, Boyd and his colleagues changed the assumptions built into previous cooperation/punishment models. First, they allowed for punishment to be coordinated among group members. In their model, group members could signal their willingness to punish someone who was not participating in the group, but punishment would only occur if it was coordinated. This meant the cost of punishing a free-rider would be distributed across members and would not exceed the gains achieved through increased cooperation.

Second, the researchers allowed for the cost of punishing a free-rider to decline as the number of punishers increased. Boyd explained that this new model was “catching up with common sense” because these two assumptions exist in reality.

Their model had three stages in which a large group of unrelated individuals interacted repeatedly. The first stage was a signaling stage where group members could signal their intent to punish. In the second stage, group members could choose to cooperate or not. The final stage was a punishment stage when group members could punish other group members.
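The three stages described above can be sketched as a toy simulation. All parameter values, the fixed signaling and cooperation probabilities, and the simple threshold rule below are illustrative assumptions, not Boyd and colleagues’ actual specification:

```python
import random

N = 20                   # group size
BENEFIT = 0.5            # per-member benefit from each cooperator's contribution
COOP_COST = 1.0          # cost of cooperating
FINE = 3.0               # penalty imposed on a punished free-rider
BASE_PUNISH_COST = 2.0   # total cost per punished free-rider, shared among punishers
THRESHOLD = 3            # punishment occurs only if enough members signal

def play_round(coop_prob, signal_prob):
    payoffs = [0.0] * N
    # Stage 1: members signal their intent to punish.
    signalers = [i for i in range(N) if random.random() < signal_prob]
    # Stage 2: members choose whether to cooperate.
    cooperators = set(i for i in range(N) if random.random() < coop_prob)
    for i in cooperators:
        payoffs[i] -= COOP_COST
    public_good = BENEFIT * len(cooperators)
    for i in range(N):
        payoffs[i] += public_good  # everyone benefits, including free-riders
    # Stage 3: coordinated punishment, only if the signaling threshold is
    # met; for each free-rider, the total punishment cost is split among
    # the signalers, so the per-punisher cost declines as punishers increase.
    if len(signalers) >= THRESHOLD:
        shared_cost = BASE_PUNISH_COST / len(signalers)
        for i in range(N):
            if i not in cooperators:
                payoffs[i] -= FINE
                for j in signalers:
                    payoffs[j] -= shared_cost
    return payoffs

random.seed(0)
payoffs = play_round(coop_prob=0.7, signal_prob=0.3)
print(f"mean payoff: {sum(payoffs) / N:.2f}")
```

Because the punishment cost is divided among signalers and triggered only past a threshold, no single punisher bears a cost larger than the cooperation gains, which is the key change from the earlier models.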

The results of their model look a lot like what is seen in most human societies, where individuals meet and decide whether and how to punish group members who are not cooperating. This is coordinated punishment, in which group members signal their intent to punish, punish only when a threshold has been met, and share the costs of punishing.

Boyd argues that even in societies without formal institutions for establishing rules and methods of punishment, group punishment appears to be effective at maintaining cooperation.


Marc Fisher of the Washington Post has an interesting piece on recent research by Colgate University social psychologist Kevin Carlsmith. We excerpt the piece below.

* * *

Colgate University psychologist Kevin Carlsmith looked at the consistent support for the University of Virginia’s legendary honor code: an example, he posits, of a policy that “assigns extreme punishments for minor offenses.” Under the code, any case of lying, cheating or stealing leads to expulsion. No lesser punishments are possible in the system. Carlsmith wondered why that system remains so popular, and he theorized that people love the clarity and simplicity of the approach in the abstract, even if they are often offended by how it plays out in reality.

Carlsmith found that most people choose punishments designed more for retribution than to create any deterrence against future wrongdoing. People often endorse punishment systems that they later decide, after they see them in action against real people, are unfair. “A person focused on deterring future crime ought to be sensitive to the frequency of the crime, the likelihood of its detection, the publicity of the punishment, and so forth,” the professor writes.

* * *

For the rest of the piece, click here. Last November, we blogged about Professor Carlsmith’s research on the satisfaction some feel from torture.

When a person receives a description of an individual committing a specific crime, the person rapidly forms judgments on the severity of the offense and the duration of appropriate punishment for it. Evidence is converging that those judgments are what we would now call “intuitive” judgments. Rather than being the result of a reason-guided, step-by-step analysis, the severity and punishment judgments seem simply to “pop into” the heads of the respondents. Thus, they are similar to the heuristics, biases, and other shortcut decisions that we all use, as documented by the judgment and decision-making researchers. Imaging research suggests that these intuitions draw on both evaluative and emotional areas of the brain. Behavioral research demonstrates that these intuitions are driven by intuitive ideas of “just deserts” (i.e., retributive reactions rather than more reason-based deterrent or incapacitive considerations). The high degree of consensus within cultures on these judgments suggests that they are learned through early socialization processes, perhaps building on evolutionarily prepared cognitive structures.

* * *

This “justice as intuitions” account has several implications for the judicial system. Most disturbingly, evidence demonstrates that legal codes that contradict these culturally shared intuitions will move citizens toward moral contempt for the justice system and lessened willingness to voluntarily “obey the law.” Rule-of-law practices need reconsideration in the light of an understanding of the basic shared notions within the population about right and wrong. Finally, we will examine how such actions as “insider trading” can be framed so that citizens will find them appropriately criminalized, and why some public welfare offenses are difficult to intuit as crimes.

Below you can watch a video of Darley’s fascinating presentation (26 minutes).

The public urge for punishment that helped delay the passage of Washington’s economic rescue plan is more than a simple case of Wall Street loathing, according to scientists who study the psychology of forgiveness and retaliation. The fury is based in instincts that have had a protective and often stabilizing effect on communities throughout human history. Small, integrated groups in particular often contain members who will stand up and — often at significant risk to themselves — punish cheaters, liars and freeloaders.

Scientists debate how common these citizen enforcers are, and whether an urge to punish infractions amounts to an overall gain or loss, given that it is costly for both parties. But recent research suggests that in individuals, the fairness instinct is a highly variable psychological impulse, rising and falling in response to what is happening in the world. And there is strong evidence that it hardens in times of crisis and uncertainty, like the current one.

The catch in this highly sensitive system, most researchers agree, is that it most likely evolved to inoculate small groups against invasive rogues, and not to set right the excesses of a vast and wildly diverse community like the American economy. . . .

The downside of these instincts, Dr. McCullough added, “is that they often promote behavior that turns out to be spiteful in the long run.”

The urge to punish is not restricted to humans. . . .

* * *

Given the choice, most people prefer that others do the hard work of enforcement, recent research has found. Scientists often study cooperation and punishment by having participants play one-on-one investment games in which each player chooses how much money to pony up in a joint investment, without knowing up front how much the other person will contribute. If both contribute a lot, they maximize their profits. If one snubs the other’s contribution, or “defects,” he or she is guaranteed a good profit and the other gets nothing.

Researchers adjust the costs and benefits of this game, as well as the number of times people play each other. And often another feature is added: an option to punish the other person, say, by spending a dollar to dock his or her earnings by two dollars.
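The punishment option described above, paying one dollar to dock another player’s earnings by two, can be sketched in a few lines. The function name and the example payoff numbers are illustrative assumptions:

```python
def punish(payoffs, punisher, target, cost=1.0, penalty=2.0):
    """Apply one act of costly punishment and return the updated payoffs:
    the punisher pays `cost` to reduce the target's earnings by `penalty`."""
    payoffs = list(payoffs)       # copy so the original round is preserved
    payoffs[punisher] -= cost
    payoffs[target] -= penalty
    return payoffs

# Example: a cooperator (index 0) earned 3, a defector (index 1) earned 8.
before = [3.0, 8.0]
after = punish(before, punisher=0, target=1)
print(after)  # the earnings gap shrinks from 5 to 4 units
```

Note that punishing here is always a net loss for the punisher, which is why the willingness of 10 to 40 percent of players to do it anyway is striking.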

In a series of such experiments, Jeffrey P. Carpenter and Peter Hans Matthews, economists at Middlebury College in Vermont, have found that depending on the costs of imposing penalties and the circumstances, 10 to 40 percent of people will act on their referee instincts.

“The urge to punish seems very strong,” Dr. Carpenter said. “Some people will spend money to punish even if it has no effect on them — if they’re watching players in another game and can penalize people. They’re inequality averse, it seems.” The researchers have found similar results across several cultures, including in Japan and Southeast Asia.

The conscious psychological motive for this behavior, regardless of its effect, is typically not deterrence but what some psychologists call just-deserts retribution. In a landmark 2002 study, psychologists at Princeton University had more than 1,000 participants evaluate vignettes describing various crimes and misdemeanors, and give sentencing recommendations. The psychologists found that people very carefully tailored their recommended sentences to the details of the infraction, its brutality and the record of the perpetrator. That is, people valued punishment for its own sake, as a measured consequence for behavior, not as a deterrent.

* * *

The sense of betrayal Americans feel toward Wall Street, and the financial tumult’s effects on 401(k) accounts and small businesses, have certainly made many people less laissez-faire in their attitudes toward punishment, Dr. [Robert] Kurzban said. And there is nothing anonymous about the debates over the economic rescue plan, whether in Congress or at the water cooler: people are stating their views to an audience, and the collective fairness instinct is stoked to high heat.

Fortunately for the economy, researchers say, a strong countervailing psychological force is also at work: the instinct to forgive, and to cooperate. Punishments are balanced by peace offerings, and in fact researchers have come close to calculating the rough ratio most people employ.

Running thousands of computer variations of the investment game, scientists have found that the strategies that pay off the most are tipped toward cooperation.

* * *

The upshot of all this, researchers say, is that human beings prefer cooperation, both in their individual makeup and in the makeup of their social groups. In a recent study, Dr. McCullough found that the urge for revenge against personal betrayals erodes in the same way some kinds of memory do: sharply in the first few weeks, slowly thereafter.

“The forgiveness instinct is every bit as wired in as the revenge instinct,” he said. “It seems that our minds work very hard to get away from resentment, if we can.”

Mary R. Rose and Janice Nadler have a nice paper, “Victim Impact Testimony and the Psychology of Punishment” available for downloading on SSRN. Here’s the abstract.

* * *

A growing body of empirical evidence from psychology, sociology, law, and criminal justice has demonstrated that lay intuitions about punishment are strongly rooted in retributivism: i.e., the idea that punishment should be distributed in proportion to moral desert. Level of harm is often thought to be indicative of desert, but harm described by victims (or survivors) in the context of victim impact evidence is subjective and often unforeseeable insofar as it is attributable to chance factors. How do observers (such as jurors or judges) use information about consequences determined by chance factors when they judge punishment? The emotional and cognitive processes involved in jurors’ use of victim impact evidence potentially reveal key insights about the psychological mechanisms underlying laypersons’ punishment judgments generally. This paper explores empirical evidence for the notion that the subjective harm experienced by the victim of an offense serves as a proxy for the level of the defendant’s effort and culpability, and by implication, the perceived seriousness of the crime.