This Is Your Brain on Ethics

A new study may shed light on how much weight we place on our sense of fairness when we are faced with an ethical dilemma.

Researchers from the California Institute of Technology and the University of Illinois at Urbana-Champaign say their recent work shows that differences in our moral decisions are tied to how much weight our brains place on fairness.

"What was really surprising, in terms of the results that came out, was that it showed our sense of fairness is rooted in emotional processing, and there's a great deal of individual differences," said Steven Quartz, director of the Social Cognitive Neuroscience Laboratory at Caltech and the lead author on the study.

The study is published in the May 9 issue of Science.

The researchers conducted brain scans on 26 people as they played an electronic game designed to find out the choices they would make during a food shortage in an orphanage in Uganda.

Subjects chose between taking a larger number of meals away from all of the children equally, or taking fewer meals overall when doing so meant leaving a few children with far less than everyone else.

As subjects were forced to quickly decide which children would lose meals, researchers noted which areas of their brains "lit up" — an indication that these brain areas were playing a part in the final decision.

Researchers found that regions of the brain associated with judging efficiency — a more logical calculation — lit up about the same in everyone. But regions involved with equity — an emotional response — lit up to varying degrees depending on the person.

Those differences, researchers say, reflected how heavily each individual weighed his or her own sense of fairness when making a decision.

"When you see unfairness," said Quartz, "it evokes a negative emotion in you; you feel uncomfortable with it. The amount of 'uncomfortable' is different in different people."

"[The study] provides further evidence for the force of our emotional energy in making difficult moral decisions," said Judy Illes, a neuroscientist at the University of British Columbia, who was not involved in the research. "I think it is an extremely robust and elegant report."

But, she continued, "despite the extremely thoughtful way they conducted their research and their discussion ... I'm compelled to urge caution in making broad conclusions about the ways societies function based on studies of 20-some people in a lab setting in an MRI scanner in Pasadena.

"People may try to take these results and apply them to other kinds of human behavior and phenomena outside the context of this laboratory study."

Think Quick!

One of the novel things about this study, said Quartz, was that subjects were not given a long time to think, as with many classroom moral dilemmas. Instead, they had to decide in a little less than 10 seconds which children would be losing meals.

"It captures the idea that you're not allowed to have an infinite amount of time to make a decision. In real life you have to make decisions in a certain amount of time," he said.

Illes had some questions about the scenario used but still considered it a useful setup overall.

"We typically take longer than 10 seconds to make decisions about how to deliver food to children in northern Uganda," she said.

However, Illes said, "As much as it is discordant with the actual decision making that would be used to deliver food to children in Uganda, it mimics rapid decision making."

Another important implication of the study, said Quartz, is that it furthers the notion that our ideas of fairness are inborn rather than learned.

"A sense of fairness is a basic part of our biology," he said. "This underlies the idea that our brain comes prepared to engage and to solve moral dilemmas, which are critical for any kind of life in a society."

Brain scans are increasingly playing an important part in understanding how different areas of the human brain interact to make decisions.

"It's a remarkable advance in our understanding of the brain to be able to peer into the brain, understand the dynamics of decision making and predict what will be decided," said Dr. Paul Root Wolpe, a bioethicist at the University of Pennsylvania and president of the American Society for Bioethics and Humanities, who was not involved with the study.

It's important, said Wolpe, not to oversimplify how our decisions are made.

"To chalk it off to emotions is to take away from what it means to evolve as a social species," he said, adding that emotion still plays an important role in decision making.

"I'm not sure there's anybody who believes the moral decisions that people make in their everyday lives are not deeply influenced by emotion."

Any Questions?

One possible follow-up proposed by Quartz is to study whether brain scans can help predict whether someone is a Democrat or a Republican based on how their brain lights up in these moral situations.

Wolpe agrees that insights like these would be fascinating, but cautions that they would raise privacy concerns.

"Imagine what could happen if, through brain imaging, we could correlate [brain activity] with voting patterns and other behaviors?" he said. "That's a very troubling idea in some ways.

"It would be a fascinating and also troubling development to look at brain function or brain traits and categorize people through them," said Wolpe. "[But] it has a deterministic feel that I think people find really disturbing."