Topic: The Problem of Political Authority

It's a bit boring in the beginning, but gets quite interesting in Chapter 6:

6.2.1 Setup

Perhaps the most famous psychological study of obedience and authority is the one conducted by Stanley Milgram at Yale University in the 1960s.

Milgram gathered volunteers to participate, supposedly, in a study of memory. When each subject arrived at the laboratory, he was paid $4.50 (then a reasonable payment), which he was told was his to keep just for showing up. Another "volunteer" (actually a confederate of the experimenter) was already present. The experimenter (actually a high school teacher whom Milgram had hired to play the role) informed both of them that they would participate in a study of the effects of punishment on learning. One of them would be designated as the "teacher" and the other as the "learner". Through a rigged drawing, the naive subject was selected as the teacher, and the confederate as the learner.

The experimenter explained that the teacher would read pairs of words to the learner, who would attempt to remember which word was associated with which other word. The teacher would then quiz the learner. Each time the learner gave a wrong answer, the teacher was to administer an electric shock, through an impressive-looking shock generator. With each wrong answer, the shocks would increase in intensity, starting with a 15-volt shock and increasing by 15 volts each time. The experimenter gave the teacher a sample 45-volt shock to show what it was like (and to convince subjects of the authenticity of the shock generator). The learner mentioned that he had a slight heart condition and asked whether the experiment was safe. The experimenter assured him that while the shocks might be painful, they were not dangerous. The learner was strapped into a chair in another room with an electrode attached to his wrist, supposedly connected to the shock generator.

On a fixed schedule, the learner would make mistakes, leading to increasingly severe shocks. The switches on the shock generator were labeled from 15 volts all the way up to 450 volts, along with qualitative labels ranging from "Slight Shock" up to "Danger: Severe Shock", followed by an ominous "XXX" under the last two switches. Each time the learner made a mistake, the teacher was supposed to use the next switch on the shock generator. At 75 volts, the learner began grunting in pain. At 120 volts, he shouted to the experimenter that the shocks were becoming painful. At 150 volts, the learner complained that his heart was bothering him and demanded to be released. Cries of this sort continued, up to an agonized scream at 270 volts. At 300 volts, the victim refused to provide any further answers to the memory test. The experimenter instructed the teacher to treat a failure to answer as a wrong answer and to continue administering shocks. The victim continued to scream and insist that he was no longer a participant, complaining about his heart again at 330 volts. After 330 volts, however, nothing further was heard from the learner. When the teacher reached the final, 450-volt switch on the shock generator, the experimenter would instruct the teacher to continue using the 450-volt switch. After the teacher had administered the 450-volt shock three times, the experiment was ended.

If at any point in this process the teacher expressed reluctance to continue, the experimenter would prod the teacher with "Please continue." If the subject repeatedly expressed reluctance, the experimenter would prompt him with "The experiment requires that you continue", then "It is absolutely essential that you continue", and finally "You have no other choice. You must go on." If the subject still resisted after the fourth prod, the experiment was discontinued.

6.2.2 Predictions

The learner, of course, did not truly receive the electric shocks. The real purpose was to determine how far subjects would be willing to obey the experimenter. If you are not already familiar with the experiment, it is worth taking a moment to reflect, first, on what you think would be the right way for the teacher to behave, and second, what you think most people would in fact do.

Postexperimental interviews established that subjects were convinced that the situation was what it appeared to be and that the learner was receiving extremely painful electric shocks. Given this, a teacher clearly ought not to continue administering shocks after the learner demands to be released. To do so would have been a serious violation of the victim's human rights. At some point, the experiment would have amounted to torture and then murder. While the experimenter has some right to direct the conduct of his experiment, no one would say he has the right to order torture and murder.

What would you have done if you had been a subject in the experiment? Milgram described the experiment to students, psychiatrists, and ordinary adults and asked them to predict both how they themselves would behave if they were in the experiment and how most other people would behave. Of 110 respondents, every one said that they would defy the experimenter at some point, explaining their reasons in terms of compassion, empathy, and principles of justice. Most thought they would refuse to continue beyond the 150-volt shock (when the learner first demands to be released), and no one saw themselves going beyond 300 volts (when the learner refuses to answer). Their predictions of others' behavior were only slightly less optimistic: respondents expected that only a pathological fringe of 1-2 percent of the population would proceed all the way to 450 volts. The psychiatrists Milgram surveyed thought that only one experimental subject in a thousand would proceed to the end of the shock board.

6.2.3 Results

Milgram's experiment shows something surprising, not only about our dispositions to obey but also about our self-understanding. The predictions of psychiatrists, students, and lay people fell shockingly far from reality. In the actual experiment, 65 percent of subjects complied fully, eventually administering the 450-volt shock three times to a silent and apparently lifeless victim. Most subjects protested and showed obvious signs of anxiety and reluctance - but ultimately, they did what they were told.

Milgram followed up the experiment with mailed surveys to participants. Despite the stress involved in the experiment, virtually no one regretted participating. Those who hear of the experimental design, without having participated, usually think, "People will not do it", and then, "If they do it they will not be able to live with themselves afterward." But in fact, Milgram reports, obedient subjects have no trouble living with themselves afterward, because these subjects by and large rationalize their behavior, after the fact, in the same way they rationalized it in the course of the experiment: they were just following orders.

6.2.4 The dangers of obedience

What lessons can we draw from Milgram's results? One important lesson, the one most prominently advanced by Milgram himself, is that of the danger inherent in institutions of authority. Because most individuals are willing to go frighteningly far in satisfaction of the demands of authority figures, institutions that set up recognized authority figures have the potential to become engines of evil. Milgram draws the parallel to Nazi Germany. Adolf Hitler, working alone, could perhaps have murdered a few dozen or even a few hundred people. What enabled him to become one of history's greatest murderers was the socially recognized position of authority into which he maneuvered himself and the unquestioning obedience rendered him by millions of German subjects. Just as none of Milgram's subjects would have decided on their own to go out and electrocute anyone, very few Germans would have decided, on their own, to go out murdering Jews. Respect for authority was Hitler's key weapon. The same is true of all of the greatest man-made evils. No one has ever managed, working alone, to kill over a million people. Nor has anyone ever arranged such an evil by appealing to the profit motive, pure self-interest, or moral suasion to secure the cooperation of others. Only through institutions of political authority have such crimes been carried out - and many have been, accounting for tens of millions of deaths, along with many more ruined lives.

It is possible that such institutions also serve crucial social functions and forestall other enormous evils. Even so, in light of the empirical facts, we must ask whether humans have too strong a disposition to obey authority figures. This brings us to a closely related lesson suggested by Milgram's results: most people's disposition to obey authorities is far stronger than one would have thought at first glance - and far stronger than one could possibly think justified.

6.2.5 The unreliability of opinions about authority

Another interesting lesson is this: the experience of being subjected to an authority has a distorting influence on one's moral perceptions. Everyone who hears about the experiment correctly perceives the moral imperative, at some point, to reject the experimenter's demands to continue shocking the victim. No rational person would think that complete obedience to the experimenter was appropriate. But once a person is in the situation, he begins to feel the force of the experimenter's demands. When Milgram asked one obedient subject why he did not break off the experiment, the subject replied, "I tried to, but he [indicating the experimenter] wouldn't let me." The experimenter in fact exercised no force to compel subjects to continue - yet subjects felt compelled. By what? By the sheer authority of the experimenter. Once a person has been subjected to this authority and has obeyed, the distortion of ethical perception often continues. The subject continues to find his actions justifiable or excusable on the grounds that he was just following orders - even though no one outside the experiment would agree.

The parallel to Nazi Germany again asserts itself. While almost all outside observers condemn the actions of the Nazis (and not just those of Adolf Hitler, who gave the ultimate orders), Nazi officers famously defended themselves with the appeal to superior orders. Was this simply an insincere ploy to escape punishment? Probably not; like Milgram's subjects, the officers probably felt that they had to obey orders. In Hannah Arendt's memorable description of the case, Adolf Eichmann thought he was doing his duty by obeying the law of Germany, which was inextricably tied to the will of the Führer; he would have felt guilty had he not followed both the letter and the spirit of Hitler's orders. Even more clearly, average soldiers in the German army cannot be supposed to have been so much more evil than typical non-Germans that they would independently have wanted to participate in a genocide. While anti-Semitism was rampant in Germany, it did not issue in widespread murder until the government ordered the killings. Only then did ordinary soldiers feel the killings to be justified or required.

History records many similar cases. During the Vietnam War, an American army unit carried out the massacre of hundreds of civilians at My Lai. In one of the most notorious war crimes in the nation's history, defenseless women, children, and old men were gathered together and shot en masse. Again, the soldiers involved pleaded that they were only following orders. One soldier reportedly cried during the massacre yet continued firing.

The widespread acceptance of political authority has been cited as evidence of the existence of (legitimate) political authority. The psychological and historical evidence undermines this appeal. The Nazis, the American soldiers at My Lai, and Milgram's subjects were clearly under no obligation of obedience - quite the contrary - and the orders they were given were clearly illegitimate. From outside these situations, we can see that. Yet when actually confronted by the demands of the authority figures, the individuals in these situations felt the need to obey. This tendency is very widespread among human beings. Now suppose, hypothetically, that all governments were illegitimate and that no one was obligated to obey their commands (except where the commands line up with preexisting moral requirements). The psychological and historical evidence cannot show whether this radical ethical hypothesis is true. But what the evidence does suggest is that if that hypothesis were true, it is quite likely that we would still by and large feel bound to obey our governments. That is likely, because even people who are subjected to the clearest examples of illegitimate power still typically feel bound to obey. And if we felt this requirement to obey, it is likely that this would lead us to think and say that we were obliged to obey and then - in the case of the more philosophically minded among us - to devise theories to explain why we have this obligation. Thus, the widespread belief in political authority does not provide strong evidence for the reality of political authority, since that belief can be explained as the product of systematic bias.