Milgram experiment

Benedict Carey writes about two new papers on the Milgram experiment coming soon to academic journals near you. Dominic Packer of Ohio State University uses Milgram’s actual data to analyze the point at which disobedience becomes most likely – 150 volts (out of a possible 450 volts). This number has been noticed by researchers before, most notably Steven Gilbert, who called 150 volts the first “qualitative change.” This voltage level is the first at which the person being shocked moves beyond grunting and yelping in pain and says, “Stop, let me out! I don’t want to do this anymore.” About one-third of subjects disobey at 150 volts, which occurs, Packer says, because it is the first point in the experiment at which the “conflict between the instructions of the experimenter and the contradictory requests of the learner” becomes clear. The decision at 150 volts had a fairly sticky effect on later decisions. Subjects who followed orders at 150 volts generally continued to do so at higher levels, and subjects who disobeyed generally continued to disobey. Writes Packer:

This may have been due to cognitive dissonance processes, such that, after ignoring the learner’s initial request, it became increasingly difficult to acknowledge the validity of his subsequent pleas and justify a new course of action. This could also be viewed as a foot-in-the-door type phenomenon, in that participants found it harder to refuse orders to give larger shocks after having previously acquiesced to less consequential actions.

In case you think the Milgram findings wouldn’t be replicated today, think again. A Santa Clara University researcher recently followed Milgram’s procedure nearly to a tee.

Once again, more than half the participants agreed to proceed with the experiment past the 150-volt mark. Jerry M. Burger, the author, interviewed the participants afterward and found that those who stopped generally believed themselves to be responsible for the shocks, whereas those who kept going tended to hold the experimenter accountable. That is, the Milgram work also demonstrated individual differences in perceptions of accountability — of who’s on the hook for what.

If you took Psychology 101 in college, you might have seen the Milgram experiment, one of the most famous in social science. A 50-minute edited video of it popped up recently on YouTube. We discuss the Milgram experiment in Nudge not to explain the rise of fascism – as has become the common view of the experiment today – but rather to make a point about how social pressures nudge people to accept conclusions that are at odds with their own views of reality, and then shape their behavior.

About

The Nudge blog is the online companion to Richard Thaler and Cass Sunstein’s “Nudge: Improving Decisions About Health, Wealth, and Happiness.” Here you’ll find much more about nudging, choice architecture, libertarian paternalism, and many other terms you won’t read about in standard economics books.

Cass Sunstein is currently the Administrator of the White House Office of Information and Regulatory Affairs and has no affiliation with the Nudge blog.

The Nudge blog is edited by John Balz.

Tell us about a nudge

The possibilities for great nudges are everywhere. For a list of favorites from the book, check out our dozen nudges. We invite readers to send their own nudge suggestions to nudgeblog@gmail.com.

What is Choice Architecture?

Decision makers do not make choices in a vacuum. They make them in an environment where many features, noticed and unnoticed, can influence their decisions. The person who creates that environment is, in our terminology, a choice architect. The goal of Nudge is to show how choice architecture can be used to help nudge people to make better choices (as judged by themselves) without forcing certain outcomes upon anyone, a philosophy we call libertarian paternalism. The tools highlighted are: defaults, expecting error, understanding mappings, giving feedback, structuring complex choices, and creating incentives.

For a user-friendly introduction to choice architecture, check out this paper.