Character & Context

Can focusing on the self make you more moral?

By: Paul Conway

Quick: imagine an orphan asking you to donate to their orphanage charity. Which mindset is more likely to motivate you to donate: focusing on the other person, or focusing on yourself? Or: imagine you faced a moral dilemma where you needed to kill an innocent person to save many other lives.* Which mindset would leave you feeling most averse to performing this (brutal but necessary) action: focusing on the person you could kill, or focusing on yourself?

Traditionally, it has seemed obvious that people who focus on others will be most likely to help those in need and most resistant to hurting innocent people. A large body of work shows this is true: helping often stems from other-oriented psychological processes like cerebral perspective-taking (walking a mile in someone’s shoes) and emotional empathic concern (feeling upset when you see someone suffering).

But what happens when people focus on the self in morally-charged circumstances?

Traditional wisdom suggests that focusing on the self reduces moral behavior, because self-focused people care about their own interests and well-being at the expense of others. Certainly this can occur. But this perspective fails to appreciate a simple truth: people really, really want to feel moral.** In fact, morality is so important for the self-concept that focusing on the self in morally-charged circumstances can actually improve moral behavior. This was the theme of our symposium at SPSP 2015 in Long Beach, California.

First, Ryan Miller and colleagues distinguished between two different reasons people wish to avoid causing harm. One is outcome aversion: aversion to witnessing others in pain (Imagine watching someone slam their finger in a car door. Did you just wince?). The other is action aversion: aversion to performing actions that usually cause harm (Imagine stabbing an actor in the neck with a fake knife. You know it technically won’t harm them—but it still feels icky, right?). Put differently, people have two different reasons to avoid committing murder: a) so the victim won’t die, and b) so they won’t become a murderer. Miller and colleagues developed a scale measuring how much people care about each of these reasons for avoiding harm.

Intriguingly, they showed that often the second reason drives people’s moral judgments more powerfully than the first. For example, they found that people who were more averse to causing harm tended to make harm avoidance judgments in moral dilemmas. Surprisingly, people who were more averse to witnessing harm were not more likely to make such harm avoidance judgments. This finding is surprising because it flies in the face of many other studies suggesting that harm avoidance judgments are linked to emotions—especially, empathic concern for the victim of harm. Together, these findings suggest that wanting to avoid being a murderer may be more important for moral judgments than wanting to avoid witnessing a murder.

Next, Paul Conway and colleagues revisited Miller and colleagues’ claims. In two independent studies, they replicated Miller and colleagues’ findings: action aversion, but not outcome aversion, predicted harm avoidance judgments on moral dilemmas. However, Conway and colleagues also used an advanced technique to measure not only judgments themselves, but also the processes behind judgments. This technique separately estimates a) how much people wish to avoid causing harm, and b) how much people wish to maximize outcomes. Regular judgments pit these tendencies against one another.

Using this technique, Conway and colleagues found that aversion to causing harm still predicted motivation to avoid causing harm (makes sense). More importantly, they also found a role for aversion to witnessing harm: people who want to avoid witnessing others in pain wanted both to avoid causing harm and to maximize outcomes. They care about everyone, and don’t want to see anyone get hurt! These two tendencies cancel out for judgments themselves, because judgments pit these tendencies against each other. It’s like pressing your hands together: if you push harder with both your right hand and your left, your hands won’t move even though both have more force behind them. These findings confirm that dilemma judgments involve both the desire not to murder, and the desire not to see people get murdered.***

Next, Brendan Gaesser and colleagues flipped the topic from hurting others to helping others: how does focusing on the self impact prosocial behavior? They found that when people vividly imagine helping in a specific time and place—called episodic simulation—they are more likely to help. For example, imagine it’s winter, and you see someone with a flat tire stuck by the side of the road. Imagine yourself stopping your car, getting out, and helping them. The more detail the better: imagine the feel of the wind on your face, the weight of the spare tire, the other person’s smile of gratitude. Now, if you are later driving your car and really see someone with a flat tire, you are much more likely to help them—especially compared to people who imagined someone else helping. This makes sense when we remember that people really want to feel moral. Vividly imagining helping bolsters that moral feeling, but people who then drive right past someone who needs help exactly as they imagined would become hypocrites, unless they ‘put their money where their mouth is’ and actually help.

Our final talk was by Anna van't Veer and colleagues, who examined how focusing on the self impacts judgments that something is morally wrong (and disgusting). However, instead of imagining the self helping people or avoiding becoming a murderer, they looked at interoception—sensing physical bodily states. Take a moment to close your eyes and try to listen to your heart rate.**** If it is nice and steady, that is probably because you are calm and relaxed. But if your heart is racing, maybe you are upset about something! Now you must figure out what is bothering you.

van't Veer and colleagues used this principle to see if people use their racing heart to infer that they are upset about immoral actions that other people perform. They gave people scenarios to read about different kinds of immoral actions (like hitting a child or performing disgusting acts with a chicken). Some people had to listen to their heart beating while reading, and other people had to listen to a machine beeping. When people paid attention to their heart beating, they judged disgusting acts as more immoral. In other words, paying close attention to their body seemed to make them more aware of how upsetting the gross actions were, and so they felt those actions were even worse.

Together, these four talks show that in morally charged situations, sometimes focusing on the self can heighten moral concerns and alter moral judgments. Moral dilemma judgments to avoid harm are motivated by the desire not to be a murderer, as well as by a desire not to see people get murdered. Vividly imagining oneself helping others in need can make that help come true, turning one into a person who genuinely helps others. And focusing on one’s own bodily states can attune people to the grossness of disgusting acts, making those acts feel worse. Together, these findings show that focusing on the self in morally charged circumstances does not necessarily make people more selfish or less invested in others: because people care deeply about being moral people, sometimes focusing on the self improves moral judgments and behavior.

Notes:

* Fans of the show Game of Thrones should be able to think of a recent, vivid example! Everyone else can imagine the by-now-ubiquitous trolley dilemma: a runaway trolley will kill five people unless you divert it to kill one person instead—should you kill one person to save five?

** People think morality lies at the core of personality (Strohminger & Nichols, 2014), so feeling immoral can be deeply upsetting—it suggests you may be wrong at your core. Some researchers have described this feeling as moral injury (Maguen & Litz, 2012), and it can lead people to ‘give up’ trying to be moral and keep acting selfishly (Conway & Peetz, 2012). However, most people feel they are pretty moral overall (Epley & Dunning, 2001) and they want to keep it that way. People have a lot of strategies to keep feeling moral even when they act less-than-morally (e.g., Effron, 2014), but that’s a topic for another day. If you want an example, see the recent blog post by Daniel Effron.

Maintaining the moral self is no easy task. You can’t just claim to be moral and expect everyone to agree with you. You can call yourself a guitarist all you want, but unless you play guitar now and then, people are not going to think of you as a guitarist. Besides, you will feel like a fraud. You have to walk the walk if you want to feel legitimate. The same is true of calling yourself a good person. To feel moral, people need to help others once in a while, and avoid hurting others, and generally stick to the moral rules of their society. It turns out that people who care a lot about feeling moral tend to help others and avoid harming them (Aquino & Reed, 2002), and reminding people of their moral self can have the same effect (Conway & Peetz, 2012).

***Of course, there are also other processes involved in moral dilemma judgments, such as cognitive evaluations of outcomes and adherence to moral rules—but, again, diving in to that complexity is a topic for another day.

****Apologies if you are a vampire or the lion from the Wizard of Oz; you will have to sit this one out.

About our Blog

Character & Context is the blog of the Society for Personality and Social Psychology (SPSP). With more than 7,500 members, SPSP is the largest organization of social psychologists and personality psychologists in the world.