RESEARCH THROUGH DECEPTION

This is a digitized version of an article from The Times’s print archive, before the start of online publication in 1996.
To preserve these articles as they originally appeared, The Times does not alter, edit or update them.

September 12, 1982, The New York Times Archives

Morton Hunt is a freelance writer with a special interest in the behavioral sciences. His most recent book, ''The Universe Within: A New Science Explores the Human Mind,'' was published by Simon & Schuster.

On a spring evening two years ago, Steve Kaufman, a wiry 18-year-old whose plain-featured intensity reminds one of Dustin Hoffman, hurried across the Stanford University campus to what he thought would be an interesting and enjoyable experience. He was headed for Jordan Hall, where the department of psychology is housed and where he had been receiving training as a hypnotic subject, preparatory to an experiment scheduled for that night.

The experiment was the core of a research project being carried out by Prof. Philip Zimbardo. A social psychologist with a flair for imaginative experimentation, Zimbardo was a campus celebrity and well known in his profession. Steve, a freshman, felt privileged to be working directly with such a notable, but to add to his motivation Zimbardo was rewarding him, and other subjects in the project, with a few dollars and with training in the use of self-hypnosis to increase concentration while studying, fight off fatigue, and control various other mental and physical states.

This night, Steve's final session, would first be devoted to pain control; this interested him because, being a fencer, he thought he might perform better if he could master the pain of physical effort. Later in the evening, he would take part, as a subject, in the experiment, which Zimbardo had said concerned the effect of hypnotism on problem-solving ability.

What Steve expected to take place that evening would take place - but so would much more, for he was wholly unaware of the real but covert goal of the research project. This was to train Steve and other unsuspecting volunteers to become good hypnotic subjects until, in the final session, Zimbardo could tell them, while they were in a trance, that afterward they would be partially deaf but would not remember that this had been hypnotically induced. Steve would be a ''naive'' subject - and would therefore react to his deafness as if it were genuine.

The morality of research in which human beings may be subjected to unpleasant and possibly harmful experiences that they have not agreed to undergo has long been hotly debated by social psychologists, university officials, ethicists at centers such as the Institute of Society, Ethics and the Life Sciences at Hastings-on-Hudson, N.Y., and administrators of Government agencies funding such work. The debate finally resulted in Federal regulations, put forth in 1971 and 1974, and revised last year; these regulations, setting boundaries on the permissible use of human beings in social-psychological research projects funded by major agencies, have throttled research into many aspects of human psychology without quite killing it off, and limited - but not eliminated - the chances for the stressful use of naive subjects. This compromise between the freedom to inquire and the rights of human beings pleases almost no one, and the debate continues in scientific and governmental circles and in the meetings and publications of institutes of ethics. Many psychologists, disheartened, have given up exploring problems they consider important; others, undeterred, continue to seek knowledge by means that critics label morally unacceptable no matter what the scientific rewards may be.

Zimbardo, convinced that his goal justified what he was doing to his volunteers, had painstakingly worked out a script for the final session, specifying every action and word of those taking part except, of course, the naive member. The entire sting had been thoroughly rehearsed by Zimbardo and his two assistants, and by two confederates - undergraduates who would appear to be Steve's fellow subjects.

All this, Zimbardo felt, was necessary to put to the test his hypothesis that the paranoia - the demented suspiciousness - of many old people (a common cause of commitment to mental hospitals) was the result not of mental disorder but of a gradual, undetected loss of hearing. Such a change, if unnoticed by the individual, might lead him to wonder why others were whispering or making faces he could not interpret, and to imagine that their behavior was malicious and involved some sort of plot against him.

If Zimbardo was right, paranoia in many elderly persons might be remedied simply by the use of hearing aids rather than treated as an intractable psychosis. Zimbardo hoped, therefore, that when his naive subjects inexplicably had trouble understanding the conversation of the confederates they would attribute it to some dark design on their part. He hoped, in short, to see his subjects suffer an attack of paranoia.

With the possible exception of his small dark goatee, there is nothing about Zimbardo to suggest a Svengali. A rumpled, paunchy man of 49 who grew up in the brawling southeast Bronx, he is genial, talkative and concerned - albeit somewhat sarcastically - about what others think of his research methods. As he said to me when I visited him not long ago: ''Imagine someone innocently going into an experiment, perfectly normal, and then this terrible thing happens - they become paranoid! Because essentially I'm making normal people crazy. Now, isn't that an awful thing to do? Who am I to presume to exercise that power over another human being?''

Hostile reactions to his work are nothing new to Zimbardo, for he has a penchant for flamboyant experiments with disturbing consequences. In 1971, for instance, he persuaded 18 Stanford undergraduate men to play the roles of guards and prisoners in a prolonged realistic simulation of prison life. In less than a week, most of the guards had spontaneously become sadistic, while some prisoners had become dangerously hostile and others seriously disturbed. Zimbardo concluded that much of the vile behavior of prisoners and guards stems from the social roles they are cast in. While he regarded this as a valuable finding that should lead to reforms, he was widely criticized for having inflicted suffering on his subjects in the name of science.

Zimbardo planned to limit the deafness-induced paranoia, if any, to no more than half an hour and then to dispel it promptly. But even if the experience was genuinely upsetting to his subjects, he considered the stakes high enough to warrant imposing it on them.

''A fundamental assumption in psychiatry and clinical psychology,'' he told me, ''is that where you see mental illness, there must have been a premorbid personality. But that's a completely unnecessary assumption, based on the study of people who are already disturbed. I hold that that's too late to look for good evidence about how the pathological process began. We should look at normal people and see whether there aren't many kinds of experiences that can transform them, distorting the way they think.

''I hope that the hearing-loss study will be just the first of a series. All sorts of mundane circumstances, I believe, may lead to pathological mental processes of many kinds - phobias, hypochondrias, conversion reactions and so on. We can investigate all these phenomena by basically the same procedure as in the hearing-loss study - that is, if the work is not prevented.'' He paused dramatically. ''What could prevent it?'' he asked, and answered: ''A sufficient outcry about the use of deception.''

Social psychology, the study of how the individual's mental processes are influenced by interactions with other people, used to be a ''soft science,'' consisting largely of efforts to make wise interpretations of everyday social behavior. But by the 1950's it became scientifically respectable as a result of a shift to the use of laboratory experiments in which social interactions could be controlled and measured.

Unlike laboratory rats, however, human subjects can understand what is going on, and their understanding will often alter and distort their normal responses to the stimulus. If, for instance, they know that the researcher is studying how and when people help strangers in distress, they are quite likely to behave in an admirable - but perhaps not characteristic - fashion.

Accordingly, social psychologists soon saw that they would often have to disguise their real aims. In 1951, for example, in a classic pioneering study of conformity, the social psychologist Solomon E. Asch asked students to participate in ''an experiment in perceptual judgment.'' Several persons took part at a time; they were asked to say which one of three lines on cards in front of them was the same length as a standard line, also on display. In each case, one student - the naive subject - heard the others (who were confederates) choose a line plainly either shorter or longer than the standard. Faced with unanimity among his or her peers, the naive subject would squirm, sweat, feel upset - and, about a third of the time, go along with the majority vote. Deception, here, was obviously crucial; as ''The Handbook of Social Psychology,'' a standard reference work, comments: ''One cannot imagine an experimenter studying the effects of group pressure ... by announcing his intentions in advance.''

Deception rapidly became the method of choice in investigating many such issues. By the late 1960's and early 1970's, according to several estimates, somewhere between 38 percent and 44 percent of all social-psychological research used deceptive methodology.

From the beginning, deceptive research yielded rich scientific rewards. Deceptive experiments performed in the 1960's, for instance, established the fact that violence seen in a movie increases the likelihood that the viewers will afterward react to provocations with violence.

After the Kitty Genovese murder of 1964, when 38 residents of Kew Gardens, Queens, watched but did nothing as her attacker strangled and stabbed her repeatedly for half an hour, many social critics spoke glumly of the brutish New York character, alienation in contemporary America and so on. The social psychologists Bibb Latane and John M. Darley conducted a series of experiments in which subjects were fooled into thinking that a stranger was in distress, and found that if the naive subject was alone, he or she would rush to the aid of the stranger, but that if others, especially persons unknown to the subject, were also present, he or she was much less likely to act. (The presence of the others apparently ''diffused'' the naive subject's sense of responsibility.) The Kitty Genovese incident, Latane and Darley concluded, revealed not a loss of human decency in America but the inhibiting effect on helping behavior of the social interactions typical of big-city life.

Such results gave ingenious deceptive research an aura of glamour and prestige. But some of those who regard it as immoral and who have been fighting its use say that deceptive research came to be prized more for its ingenuity than for the scientific value of its findings. Social psychologists themselves sometimes speak admiringly of the cleverest and trickiest deceptions as ''cute.'' Nonetheless, ingenuity was often required: Human subjects, it turned out, are good at sensing the true goal of an experiment when it is only thinly disguised.

At Jordan Hall, Steve Kaufman went directly to a small room where he had attended previous training sessions. Zimbardo, seated in a chair, greeted him warmly; Steve thought again how lucky he was to be working intimately with a famous scientist. He sank down on a couple of the large pillows scattered around the floor. Another student, whom he did not know, was already sprawled out; he and Steve exchanged hellos. In a moment, a third student, also unknown to Steve, came in, greeted everyone, and then said to the other student, ''Say, you were in my Econ One section, weren't you?'' The other, nodding, said, ''Oh, yeah, right.'' This interchange established a basis for a spirited later conversation between the two that Steve would misinterpret, if Zimbardo's hypothesis about deafness was correct.

''Today,'' said Zimbardo, ''I'm going to use the first part of this session to help you with pain control. In part two, I'll deepen your hypnotic state even more than before by having you listen to some exotic Oriental music. Then, in part three, I'll ask you to participate, as planned, in our experiment on the effect of hypnosis on creative problem solving.''

For half an hour, as the three students several times put themselves into a hypnotic state and came back out of it, Zimbardo taught them how to self-administer suggestions that would counter pain. Then Zimbardo said it was time for the next phase, during which they would listen, separately, to different musical tapes. He handed each a set of earphones and signaled a woman assistant, watching from a control room, to start the tapes. Steve heard Zimbardo's voice slowly and soothingly explaining that by means of the following music he was to induce a deep trance in himself. He then heard a prolonged series of flutelike arpeggios slowly rippling upward. After a few minutes, when Steve was in a profound hypnotic state, Zimbardo's voice returned, saying, ''Later on in this experiment, when you see the word 'focus' on the screen in the laboratory, sounds will become very low and difficult to hear. You will not remember having heard this instruction until we put a hand on your shoulder and say, 'That's all for now,' and then your hearing will return and you'll remember everything.''

Although Steve's tape was still playing, those of the others seemed to have come to an end. Steve was aware that Susan M. Andersen, Zimbardo's chief assistant, had come in and taken the other students away. As his own tape concluded, she returned and said, ''Hi, Steve, are you finished?'' Steve said he was. As always, he was entranced by the sight of her. Susan Andersen, a graduate student in her late 20's, was, in Steve's opinion, quite beautiful. She was warm and outgoing, and Steve could easily imagine falling in love with her.

Susan led him to a large laboratory next door. The other two students were seated side by side at a table; nearby, a projector was aimed at a screen, and in one wall was a one-way mirror, beyond which was the observation and control room. As Susan and Steve entered, the other students were talking about what they intended to major in; again, this was meant to set the stage for their later conversations.

Steve took the only available seat, at the end of the table. (This kept the other two together, as planned.) Susan now briefed them: ''As you know,'' she said, ''we're interested in the effects of hypnosis on the way people solve problems. We expect that hypnosis may improve these skills.'' The problems would be a series of slides that they were to interpret in writing. (The slides came from the Thematic Apperception Test, a set of drawings of people in ambiguous situations; the stories an individual makes up about them reveal a lot about that person's emotional condition.) Susan added that they were to work by themselves on the first slide but could work together from then on, if they liked, since groups solve certain kinds of problems better than individuals. All other instructions, she said, would appear on the screen.

She turned to the projector and switched it on; apologetically, she said: ''It'll take me a few minutes to set this up. We've been having a little trouble with the controls lately. I'll be operating this from the other room.'' She pushed a button and a test slide came on; the word ''focus'' appeared on the screen, and she left.

The confederates now talked to each other about whether to work together after the first slide; they decided to do so. One of them then asked Steve, ''How about you?'' and when Steve made no reply, said, ''O.K., fine.'' Then, with a smile, he said to his partner, ''I know where I've seen you since that Econ class. You were at Bob Elder's party, right? - where that guy started heaving all over the television and then looked at everyone like -'' and he made a hangdog face and groaned, ''Duuuh!'' They carried on about the mythical party; laughing and shaking their heads, they glanced at Steve, who, stony-faced, stared at the questionnaires and forms that Susan had left at his place.

He was having a poor time of it. From the moment the word ''focus'' appeared on the screen, he had been unable to understand what the other two students were saying. He wasn't sure whether he couldn't hear them or couldn't understand them. He had no idea what was wrong, and was fighting to control a wave of panic. When one of the others had spoken to him in what seemed to be either a mumble or doubletalk, Steve had ignored him. ''I was getting very upset,'' Steve recalls, ''especially when he and the other guy started talking and laughing and making faces. I was completely in the dark about why they were saying things in a way I couldn't understand. I was very agitated.''

At that moment, the projector clicked, and on the screen appeared this message: ''As you view the next slide, please write a brief description of what the characters portrayed in the slide are doing, who they are, and what the mood of the scene is.'' On came a picture of a man standing by a bed, looking down at what seemed to be the figure of a woman. The other two students, still talking, started to write, and Steve, his hands trembling and sweat running down his sides, got to work.

A Supreme Court decision of 1914 established the right of medical patients not to be subjected to treatment without their permission. But since gaining the consent of patients to try new or experimental procedures for research purposes is difficult, many biomedical researchers continued not to ask their patients' permission or to get it through various ruses and deceptions.

To be sure, the aims of these researchers were generally lofty. But in the 1940's the world learned that it could not count on the benevolence of medical researchers: At the Nuremberg trials, 20 Nazi doctors were proved to have conducted atrocious and sadistic experiments on concentration-camp prisoners. As the basis for convicting the doctors of crimes against humanity, the Nuremberg Tribunal formulated a code of ethics governing medical research, the fundamental clause of which read: ''The voluntary consent of the human subject is absolutely essential. ... (He) should have sufficient knowledge and comprehension ... of the subject matter ... to make an understanding and enlightened decision.'' This has been known in recent years as the doctrine of ''informed consent.''

After the Nuremberg trials, world medical opinion officially held that biomedical researchers were morally obliged to obtain the informed consent of their subjects before using them in research; nonetheless, many researchers, unconstrained by law or administrative regulations, continued to take the easier course. In this country, biomedical experiments in which the subjects were ignorant of what was being done to them came to light during the 1950's and 1960's; some proved to be morally repugnant in the extreme, the most notable case being the Tuskegee study in which syphilitic black men were deliberately not treated in order to study the course of the disease. As a result of such disclosures, pressure built up within the Public Health Service (P.H.S.) and in Congress to create controls. During the 1960's, the P.H.S. gradually adopted various rules - including the requirement of informed consent - governing the behavior of biomedical researchers; the rules were binding only on persons supported by P.H.S. grants, but this affected the bulk of those doing such work in America.

Meanwhile, an analogous movement got under way in social research. A number of social psychologists, ethicists and Government administrators began to attack deceptive methodology as being incompatible with informed consent. They found such research particularly obnoxious when the subjects were made to undergo anxiety, embarrassment and other stresses.

The experiment that drew the sharpest fire of these critics was conducted by Stanley Milgram at Yale in 1963. In the guise of research into the learning process, Milgram told his naive subjects to administer progressively stronger electrical shocks to an unseen confederate, heard on an intercom, whenever he made a mistake. As the shocks grew stronger (rising to a supposedly dangerous level), the confederate - his voice was actually tape-recorded - would grunt, then cry out, and finally scream in agony. When Milgram's subjects wanted to break off, he would say that the experiment required that they go on. A number of them, he reported, ''were observed to sweat, tremble, stutter, bite their lips, groan and dig their fingernails into their flesh'' - yet most of them obeyed his orders. Milgram concluded that, when someone else is in authority, perfectly normal people are capable of obediently acting brutally toward others, a finding that might cast light on the behavior of many of the ordinary citizens of Nazi Germany. Milgram's work won the 1964 award for social-psychological research of the American Association for the Advancement of Science.

The opponents of deceptive research, citing such experiments, argue that behavioral research should be subjected to tightened controls. But both the P.H.S., in its rules governing grantees, and the American Psychological Association, in its code of ethics, continued to tolerate deceptive methods in social research - with some reservations - in order not to interfere with the freedom to pursue knowledge. The dilemma, as outlined in ''The Handbook of Social Psychology,'' stemmed from the coexistence of two incompatible American ideals: ''A belief in the value of free scientific inquiry and a belief in the dignity of man and his right to privacy.'' Researchers could explore certain areas of psychology only by robbing subjects of their right to informed consent; society could protect that right only by depriving researchers of the chance to cast light on important issues through deceptive experimental methods.

While waiting for the second slide to appear, the two confederates chatted animatedly about the chances of getting summer jobs in Palo Alto. One of them, turning to Steve, asked, ''Are you staying in Palo Alto in the summer?'' Steve, who had understood nothing, stared at him sullenly; the other smiled and said something else, then turned away.

The next slide said they could now work together. Then on came a sketch of a young man, seated, and an older man standing near him, holding a piece of paper in one hand. The confederates, scribbling, showing their papers to each other and, chuckling and talking, decided the scene showed a son, upset at his grades, being comforted by his father. Steve took it to show a Soviet police inspector, seated, to whom a colleague was presenting a report on a civilian suspect. With some difficulty he managed to write this down; the incomprehensible talk and carrying on of the other students was flooding him with anxiety and anger. As he recalls it, ''They were talking to each other and laughing and making faces, and I couldn't understand any of it. I kept asking myself, 'Who are these two clowns sitting here and laughing at me? Why are they mumbling or talking in some strange language? How can Susan, who I thought was my friend, be letting this happen to me?' And I was extremely upset that Professor Zimbardo, whom I really admired, was in on it.''

After a while the series of slides came to an end and on the screen appeared instructions to fill out the forms and questionnaires in front of each of the students. One questionnaire, listing 12 traits and moods, asked the subject to rate his present status as to each on a scale of 0 to 100. Steve filled in the blanks as follows: relaxed, 0; agitated, 100; happy, 0; irritated, 100; friendly, 0; hostile, 100; intellectually sharp, 20; confused, 100; vision, 0; hearing, 0; creative, 0; suspicious, 100. The projector clicked again, directing one of the confederates - and, in a moment, the other - to leave. Steve, filling out a final form, jabbed angrily at the paper and broke his pencil, threw it away, and got up and paced back and forth like a caged animal. He had never in his life felt so peculiar or so agitated.

In 1971 the Department of Health, Education and Welfare tightened the Public Health Service regulations governing biomedical research with human subjects and made them binding on research funded by its other agencies, and in 1974 toughened the policies further. Among other things, informed consent was very strictly construed. For good measure, the regulations covered behavioral research as well. Even minor deceptive methods in social-psychological experiments were ruled out unless given special approval by an institutional review board (I.R.B.) - a watchdog committee designated within every university or research foundation to see that research proposals using human subjects obeyed the Federal regulations. Any institution that failed to comply might have all its H.E.W. subsidies cut off.

Social psychologists planning research now had to write detailed explanations, defend their proposals before suspicious I.R.B.'s and dilute the impact of their experimental situations; even so, their proposals were often turned down. The explicit motive of the I.R.B.'s was to protect human subjects; the implicit one was to safeguard the institutions' grant money. A survey made in 1974 and 1975 for the National Commission for the Protection of Human Subjects (a congressionally created advisory group) asked a large sample of behavioral scientists whether the review procedure had impeded research at their institutions; more than half said it had. As a result of this survey and of many complaints from social scientists, the Department of Health and Human Services (successor to the Department of Health, Education and Welfare) eased the requirements a little in July 1981; among other things, incompletely informed consent was now held acceptable if, in the view of the I.R.B., it involved ''minimum risk to the subject'' and if the research ''could not practicably be carried out'' otherwise.

Even in their slightly loosened form, the regulations have made a major change in social-psychological research. Deception is still in common use, but mostly in quite limited form and productive of only trivial experimental stresses. As a result, the scope of research has been significantly narrowed. ''You don't even consider experiments that would run into resistance,'' Prof. Edward E. Jones of Princeton University told me. ''Whole lines of research have been nipped in the bud.'' Obedience experiments like Milgram's are now quite beyond the pale; most helping experiments are considered unacceptably hard on some subjects; and so are research designs which would even briefly frighten or embarrass subjects, or teach them unpleasant truths about themselves.

The case of Prof. Stanley Schachter of Columbia University is instructive. Schachter, for 30 years a leading and innovative researcher, told me that recently he gave up doing experiments. ''I simply won't go through all that,'' he said. ''It's a bloody bore and a waste of time.''

Nor, if present conditions had prevailed in 1962, would he have made the most seminal discovery of his career. He and Dr. Jerome Singer asked naive subjects for permission to inject them with a vitamin preparation to test its effects on vision. They told their subjects that for some minutes it might make their hearts pound, their faces flush, and their hands shake. The injection was, in fact, adrenalin, which is produced in the body by a variety of emotions. Schachter and Singer sought to show that human beings cannot identify their emotions from bodily sensations alone; they need external cues as to what they are feeling. Accordingly, the researchers had their naive subjects, while feeling the effects of the adrenalin, fill out a form in the company of a confederate (also supposedly injected with vitamins); in some cases, the confederate acted giddy, silly and happy, while in other cases, the confederate was angered by the questions in the form, eventually tore it up and then stormed out. Naive subjects who had been with euphoric confederates said that they, too, had felt euphoric; those with angry confederates said that they, too, had felt angry. Q.E.D. ''That experiment had major consequences,'' Schachter said, ''but I'd hesitate to do it again today. In fact, I couldn't do it.''

Susan Andersen came in. ''She approached me,'' Steve recalls, ''saying something I couldn't understand, and I dodged away from her and sat down. She came over and put her hand on my shoulder, and I was shaking, not knowing whether to cry or scream or what, but then she said something'' - (''That's all for now'') - ''and I could understand her and she was saying, 'Are you all right?' I was amazed, and all I could say was, 'Would you please tell me what is going on here?'

''She had me relax in a reclining chair and sat next to me and said, 'Do you remember the headphones?' and I did. 'The music?' and I did. And so on, until bit by bit I could remember everything, and it came to me why I had been unable to hear. I was in a state of shock, of total disbelief - but of course total belief, too. I was very bewildered - overjoyed, and yet still extremely upset.

''Then Professor Zimbardo came in with a big smile on his face and said, 'That worked wonderfully! You were terrific!' He sat down and slowly and carefully explained the real purpose of the experiment to me and what he was hoping to show by his research -and had shown in my case. I was absolutely fascinated. He was right -when you're deaf, it seems like they're all plotting against you. Even though I still felt shaken, I was really proud that I had played an important part in the work.''

Zimbardo asked Steve to come back in a week for a checkup. (When he did, he took a test for paranoia - a retest of one he had taken weeks earlier, which showed him to be normal. The retest showed no residual effects of the experiment.) All in all, the debriefing lasted 45 minutes - longer than the experiment; defenders of deceptive research lay great stress on thorough, restorative debriefing. By the time Steve left, he felt in control of himself and calm. But all night his mind was a kaleidoscopic jumble of images and thoughts, and he continued to feel a succession of recalled moods in an emotional reverberation that took days to die away.

Defenders of deceptive research argue that it is morally justified if the risks or costs to the subject - often minor or nil - are outweighed by the benefits to humanity. But how to weigh costs against benefits is far from clear. In what units can one measure embarrassment or anxiety, and how can one compare this to the worth of some piece of new knowledge?

Critics of deceptive research assert that, most of the time, the costs greatly outweigh the benefits. Their evidence, however, is largely anecdotal, consisting of a number of stories of subjects who, despite debriefing, have remained disillusioned, suspicious of authority, or permanently ashamed of something they have learned about themselves, as in the case of people who failed to help a stranger in distress. Diana Baumrind, a psychologist at the University of California at Berkeley, a forceful polemicist against deceptive research, calls the last outcome ''inflicted insight,'' and argues that foisting unsought self-knowledge on another is morally indefensible.

On the other hand, most social psychologists maintain - again on the basis of subjective appraisals - that little, if any, harm is done by deceptive research. And the few follow-up surveys that exist support this sanguine view: Most subjects, looking back on their experiences, say the deception imposed on them, even if stressful, was justified by the scientific knowledge gained. In Milgram's obedience experiment, 84 percent of his subjects said afterward that they were glad they had been in it; only 1.3 percent regretted the experience, and the rest were neutral.

But some of the most articulate enemies of deceptive research reject any justification based on the weighing of costs against benefits. Diana Baumrind, for one, says this is a utilitarian ethic; she rejects it on the grounds that concern for the rights of the human being is morally superior to the freedom to seek knowledge and should take precedence over it. Thomas Murray, a social psychologist at the Hastings Center, says that even if deception does no harm, it does wrong to the person it deprives of free choice; in his view, cost-benefit calculations treat the subjects as useful tools, not as moral agents.

Some critics of deceptive research have said that it is not necessary, that alternative techniques will serve. They have suggested the use of role-playing, for instance, in which subjects are asked to imagine how they would behave in a given experiment if they were naive as to its real purposes. In many cases, they give themselves better marks than when the experiment is run with truly naive subjects. As Dr. Joel W. Goldstein, executive secretary of the Research Scientist Development Review Committee of the Department of Health and Human Services, comments, ''We know that what people say or do in role-playing experiments is qualitatively different from what they really do.'' Much the same, disappointingly, is true of various other suggested alternatives.

Accordingly, both the Federal regulations and the code of ethics of the American Psychological Association reluctantly continue to acknowledge the necessity of deceptive research and to use the cost-benefit ratio as the operative ethical standard by which to judge it. Typically American in its pragmatism, it judges the morality of any case of deceptive research by its results.

By the end of the spring semester, Zimbardo and his assistants had completed the experimental work. Six subjects had been hypnotically made hard of hearing and unaware of why they were; a control group knew why they were experiencing deafness; and another control group was not deafened. Statistical analysis of the test scores plus observations and self-reports showed that the naive deafened subjects had been far more paranoid during the experiment than the control groups; the differences between the groups were ''highly significant'' - which, in this case, meant that there was only a one-in-a-thousand chance that the results were due to pure accident.

Steve, meanwhile, was feeling disagreeable aftereffects, but of a kind unrelated to temporary paranoia. He had expected to continue to play a part in Zimbardo's research; indeed, Zimbardo, as part of the cover story, had said as much. ''But it didn't turn out that way,'' Steve says. ''I ran into him and said, 'I'd really like to work with you some more,' and he said, 'Sure,' but I never heard from him. I left messages but he never called me back. I felt no resentment about what was done to me in the experiment, but what I did - and still do - resent was being dropped like a cold jellyfish after such a substantial emotional event in my life. That's what made me feel like 'just another subject.' Yet I still greatly admire Professor Zimbardo for the research he's doing and his reasons for doing it. To this day, I feel honored that I was able to work with him and contribute to an important study.''

Zimbardo, who is regretful about Steve's feelings but says there is nothing a researcher can do about that, wrote up his results in a journal article in which he concluded that the findings have obvious bearing on the connection between deafness and paranoia among the middle-aged and elderly. The article appeared in the June 26, 1981, issue of the journal Science. Not surprisingly, it was greeted with both approval and criticism - the latter because of its ethics. Zimbardo defended his work, specifying the precautions taken, the need for deception, the lack of harm done and the value of the findings. He ended by reiterating the cost-benefit ethic: ''The risks to the subjects appear to be quite transient while the potential benefits hold substantial promise.''

Some social psychologists - the braver ones, or perhaps those in particularly secure positions - continue to use deception, albeit more conservatively than before, to investigate major issues. John M. Darley, now chairman of the psychology department at Princeton, told me why he feels obliged to do so: ''Psychologists have an ethical responsibility to do research about processes that are socially important, and to discover clearly what is going on - which means that sometimes they have to keep their subjects in the dark about certain aspects of the study.'' In a study he recently conducted with a colleague, Princeton undergraduates were asked to help validate a new method by which teachers could judge the potential of pupils. On videotape, some subjects saw a 9-year-old girl playing in shabby, lower-class surroundings; others saw the same girl playing in a manicured suburban setting; each group then saw her giving the same answers aloud as she took an achievement test. Both groups were asked to evaluate her school potential: Those who had seen her in a lower-class milieu rated it below grade level, while those who had seen her in an upper-middle-class milieu rated her well above it. Evidently, knowledge of a person's socio-economic status can distort perception of his or her performance.

But many other researchers are now turning to topics which can be investigated without deception or, at most, by merely withholding a little information. Some, perhaps many, of those topics are worthy of serious investigation - but so are the topics that require major deception and that are no longer researched except by the few who are able to run the risk. No doubt, the present compromise between freedom of inquiry and the rights of individuals has corrected the occasional flagrant misuses of human subjects, but perhaps the correction has somewhat overshot the mark. Prof. Leon Festinger of the New School for Social Research, a key figure in the history of social psychology, recently deplored the caution that has been forced upon its practitioners. ''One can stay far away from ethical questions,'' he said, ''but it seems to me that steering clear of these difficulties keeps the field away from problems that are important and distinctive to social psychology.''

Steve Kaufman is two years older, two inches taller, and rather more realistic than he used to be as a freshman. When he suffered the disappointment of being no longer wanted for further experimental work, its attraction for him dwindled. He has decided to major in German and urban studies. But he still looks back on the period of his participation in Zimbardo's research as one of the high points of his life thus far.

That has not, however, led him to take a simplistic stand on the ethical issues; like all those who find the cost-benefit ethic sensible but difficult to apply, he sees both sides of the issue. ''Before I was in the experiment,'' he says, ''I was very much against deceptive research. But then I heard Professor Zimbardo lecture about it, and I could see that you can't get certain kinds of information without deceptive methods. I agree with the people who say it's not right to deceive human beings; it's not right to treat people as if they were mice. But I agree with Professor Zimbardo that he couldn't do his work on deafness and paranoia without deceiving his subjects, because if they knew what was going on, they wouldn't react the same as if they didn't.

''I can see both sides. That's my dilemma, and I don't think there's any simple answer to it, only complicated ones.''

A version of this article appears in print on September 12, 1982, on Page 006066 of the National edition with the headline: RESEARCH THROUGH DECEPTION.