Participants were told that they were taking part in a study testing the effect of electric shocks on memory. Each time a “learner” in another room answered a question incorrectly, the participant was instructed to press a button that administered a shock to the learner.

When the button was pushed, an audio recording of the learner screaming was played; as the study progressed, the power of the “shocks” was increased and the screams became louder. Though many of the participants expressed reluctance to increase the shocks, 79 percent in Milgram’s study and 70 percent in Burger’s study completed the test.

Milgram’s study was conducted during the trial of Nazi war criminal Adolf Eichmann, who claimed that his actions were excusable because he was only following the orders of his superiors. Milgram sought to examine this impulse, which author Hannah Arendt termed the “banality of evil,” to understand why so many Germans were willing to commit the atrocities of the Holocaust even if they themselves were not evil.

Burger wanted to test whether these impulses remain in today’s society. “People learning about Milgram’s work often wonder whether results would be any different today,” he said in a press release. “Many point to the lessons of the Holocaust and argue that there is greater societal awareness of the dangers of blind obedience. But what I found is the same situational factors that affected obedience in Milgram’s experiments still operate today.”

The results of the Milgram Experiment, and of its many variations conducted by Milgram, Burger and others, are used to explain examples of human rights abuses around the world. The average person is often unwilling to defy orders from authority figures or challenge the ideals of society, even when they know those orders are wrong.

“Although one must be cautious when making the leap from laboratory studies to complex social behaviors such as genocide,” Burger told Reuters, “understanding the social psychological factors that contribute to people acting in unexpected and unsettling ways is important.”

The Milgram Experiment and Burger’s replication illustrate that people’s behavior is often more influenced by the situation they’re in than by their own conscience. “The conclusion is not: ‘Gosh isn’t this a horrible commentary on human nature,’ or ‘these people were so sadistic,’” Burger said. “It shows the opposite—that there are situational forces that have a much greater impact on our behavior than most people recognize.”

Many participants in the study expressed concern for the learner and asked to stop; however, notes Time’s Alex Altman, “early reluctance did not translate into a greater likelihood of refusing to continue. … Rather, the results again are in line with those who point to the power of situational variables to overcome feelings of reluctance in this situation.”

Milgram conducted many variations of his original test, finding that full compliance could range from 10–90 percent depending on just one variable. For example, participants were less likely to continue shocking if orders were given by telephone or if there was disagreement between two authority figures; furthermore, almost all participants ended the experiment if they saw a fake participant do the same.

Thus, the likelihood of a participant continuing depends more on the perceived legitimacy of the authority figure than on the participant’s conscience or moral beliefs. “Morality does not disappear—it acquires a radically different focus: the subordinate person feels shame or pride depending on how adequately he has performed the actions called for by authority,” Milgram wrote in a 1974 Harper’s Magazine piece.

Philip Zimbardo, a former colleague of Milgram and the creator of the “Stanford Prison Experiment,” writes that Milgram’s findings explain why so many people are willing to carry out torture or human rights atrocities, and why suicide bombers are willing to sacrifice their own lives.

There is a gradual transformation in which a normal person is first willing to submit to seemingly reasonable authority, with defined rules designed to encourage mindless obedience. Slowly, people are asked to do “just a little bit more” for their cause, escalating to increasingly extreme acts. Finally, they are told a “Big Lie”—for example, that Allah rewards suicide bombers—designed to encourage a final act. “The die is cast; their minds have been carefully prepared to do what is ordinarily unthinkable,” writes Zimbardo.

The basis of this process is that when people do not believe they are personally responsible for an action, they are likely to carry it out regardless of moral doubts. “The essence of obedience,” writes Milgram, “is that a person comes to view himself as the instrument for carrying out another person’s wishes, and he therefore no longer regards himself as responsible for his actions.”

Philip Zimbardo’s 1971 Stanford Prison Experiment found that a group of Stanford undergraduates assigned to guard a mock prison full of fellow undergraduates became abusive and sadistic within just a few days.