A famous 1960s experiment was recently replicated, revealing what it takes for us to question and resist those in positions of authority.
By Liliana Segura. Posted February 12, 2009.
Courtesy of AlterNet
In the early 1960s, Dr. Stanley Milgram conducted a series of experiments that would become one of the most famous social psychology studies of the 20th century. His focus was how average people respond to authority, and what he revealed stunned and disturbed people the world over.
Under the pretense of an experiment on "learning" and "memory," Milgram placed test subjects in a lab rigged with fake gadgetry, where a man in a lab coat instructed them to administer electrical shocks to a fellow test subject (actually an actor) seated in another room in "a kind of miniature electric chair."
Participants were told they were the "teachers" in the scenario and given a list of questions with which to quiz their counterparts (the "learners"). If the learner answered a question incorrectly, he got an electric shock as punishment.
The shocks were light at first -- 15 volts -- and became incrementally stronger, until they reached 450 volts -- a level labeled "Danger: Severe Shock." The actors were never actually shocked, but they pretended they were. They groaned, shouted and, as the supposed current grew stronger, begged for relief. Meanwhile, the man in the lab coat coolly told the test subjects to keep going.
To people's horror, Milgram discovered that a solid majority of his subjects -- roughly two-thirds -- were willing to administer the highest levels of shock to their counterparts. This was as true of the first set of his test subjects (Yale undergrads) as of the subsequent "ordinary" participants Milgram described ("professionals, white-collar workers, unemployed persons and industrial workers"), and of test subjects abroad, from Munich to South Africa. It was also as true for women as it was for men (although female subjects reported a higher degree of anxiety afterward).
For people who learned of the study, this became devastating proof, not only of human beings' slavish compliance in the face of authority, but of our willingness to do horrible things to other people. The study has been used to explain everything from Nazi Germany to the torture at Abu Ghraib.
But what if Milgram's obedience studies tell us something else, something just as essential -- not about our obedience to authority, but about what it takes for people to resist it? Now, for the first time in decades, a psychologist has replicated Milgram's famous study (with some critical changes).
The bad news: His results are statistically identical to Milgram's. The good news: Contrary to popular perception, the lesson it teaches us is not that human beings are a breed of latent torturers. "Actually," says Dr. Jerry Burger, the psychologist who led the exercise, "what I think is that the real lesson of the demonstration is quite the opposite."
Replicating Milgram: 'I Can't Tell You Why I Listened to Him and Kept Going'
Burger works at Santa Clara University in Santa Clara, Calif. Like many in his field, he has long been interested in Milgram.
"Everybody who works in my area has his or her own ideas about why Milgram's participants did what they did," he says. And many have ideas about what they would change if they did the study themselves. "I have kind of had ideas like that forever … but it's pretty much been considered to be out of bounds for research. I think we all kind of assumed no one was every going to be able to do this study again."
Indeed, Milgram's obedience study was deeply controversial in its time. His deceptive methodology would later be criticized as unethical, and stiffer regulations concerning the psychological well-being of participants in such studies would follow. Thus, despite its enduring role in the popular imagination -- and relevance to the events of the day -- Milgram's study would remain firmly entrenched in its time and place.
Then, in 2004, the Abu Ghraib scandal broke. In the analysis that followed, many pointed to Milgram's findings as a way to understand what could have led otherwise-average soldiers to act so cruelly. At ABC News, producers decided they wanted to do an investigative report on this question.
"I think what they had in mind at first was some sort of journalistic stunt," Burger recalls "… to set up the Milgram study themselves." But ABC was advised not undertake such a project lightly. "Someone told them, 'If you want to do some sort of exploration of obedience, you need to talk to someone who works in the field,' " says Burger. "Somehow my name surfaced in this conversation."
When ABC called him, "I told them, 'No, you can't replicate Milgram,' but I thought it was great that they wanted to explore these questions. … I was not interested in helping them put on some kind of stunt (but) it was something that I always wanted to do. And if ABC would foot the bill …"
It took months to set up the project -- recruiting and vetting participants, getting insurance, consulting lawyers, etc. When it came time to conduct the experiment, Burger had made significant changes to Milgram's original design. One crucial adjustment was to establish a threshold that did not exist in Milgram's version. Burger calls it the "150-volt solution."
"You can't put people through what Milgram did," says Burger. Revisiting descriptions of his subjects, he says, "you see that people were suffering tremendously." They believed they were torturing people, that people were "presumably even dying on the other side of the wall."
Thus, based on Milgram's original data, which showed that the majority of participants who administered 150-volt shocks were willing to go all the way to the highest levels, Burger decided to stop participants at the 150-volt mark, "the point of no return."
When the ABC special aired in January 2007, it took a predictably sensationalist approach. "A Touch of Evil" was the title, and foreboding music provided a dark backdrop.
The segment showed men and women of various ages, ethnicities and professions doing the same thing -- administering what they believed were electric shocks to a person in another room.
Often the participants would be startled by the shouts behind the wall, turning to look to the man in the lab coat with nervous expressions. But at his behest, they continued, even amid protests from the actor. ("Get me out of here, I told you I had heart trouble. My heart's starting to bother me now.")
In the end, 70 percent of the subjects reached the 150-volt mark -- a statistic basically identical to Milgram's. Unlike in Milgram's experiment, however, Burger told his subjects immediately after their time was up that the whole thing had been staged.
"I can't tell you why I listened to him and kept going," one participant told his ABC interviewer. "I should have just said no."
In the media and the blogosphere, the response to Burger's study has played into the notion that Milgram's findings, as true now as they were a generation ago, point to some intrinsic capacity for evil in human beings. It was more or less summed up by one blog's headline, which Burger noted, chuckling: "This Just In: We Still Suck."
'Under the Right Circumstances, People Will Act In Surprising and Unsettling Ways'
Although Milgram's research is understood mainly through the lens of "obedience," Burger believes that authority is actually not the definitive factor in the situation.
Just as important, if not more so, is the combination of factors that makes up the scenario and leaves subjects so dependent on authority. For example, despite being shown the "learner" strapped in before the experiment begins, participants are operating on relatively little information.
"They want to be a good participator, they don't know, 'should I stop, should I not,' " says Burger, "… Except there's a person in the room that's an expert, who knows all about the study, the equipment, etc … and he's acting like, well, this is nothing unusual … If the only information you have is telling you that this is the right thing to do -- of course you do it."
Participants are also absolved of any real sense of personal responsibility. "I was doing my job" is a common refrain. "When people don't feel responsible," Burger notes, "that can lead to some very unsettling behaviors." And then there's the high pressure created by the limited window of time participants have to decide whether to shock their "learner."
"Imagine if Milgram had allowed those people to take 30 minutes and think about it," says Burger. "They don't have time, and the experimenter doesn't allow them time. In fact, if the person pauses, the experimenter steps in and says, 'Please continue.' "
But perhaps the most important enabling factor is that the voltage goes up little by little.
"Milgram set this up so that people responded in small increments," says Burger. "They didn't start with 150 volts, they started with 15 and worked their way up … That is a very powerful way to change attitudes and behaviors." Most people, after all, don't start with extreme behaviors right off the bat.
"People didn't start by drinking Jim Jones' poison Kool-Aid," Burger says. "They probably started by donating money, or going to a meeting … you probably see that in most examples where you're scratching you head and saying, 'How can they do that?' "
In Burger's opinion, the significance of Milgram's findings is widely misunderstood. "The point is not 'look how bad people are.' … What we fail to recognize is the power of the situation and [that] under the right circumstances, people will act in surprising and sometimes unsettling ways."
Indeed, what these factors demonstrate is not how easily people will harm another person, but how quickly people will cede their own authority to another person when they feel isolated, pressured and powerless. The more controlled an environment, the more vulnerable a person is.
What Does It Take to Resist Authority?
Long before his most famous experiment, Stanley Milgram was interested in phenomena showing that people placed in the right situation will often do the wrong thing.
In 1964, writing in The Nation about a case in which a New York woman named Kitty Genovese was killed within earshot of 38 neighbors, none of whom intervened, Milgram observed, "We are all certain that we would have done better." But, he argued, it is a mistake to "infer ethical values from the actual behavior of people in concrete situations."
"…We must ask, did the witnesses remain passive because they thought it was the right thing to do, or did they refrain from action despite what they thought or felt they should do? We cannot take it for granted that people always do what they consider right. It would be more fruitful to inquire why, in general and in this particular case, there is so marked a discrepancy between values and behavior."
One lens through which to understand this is politics, a profession notorious for its moral corrosiveness. In his book Conservatives Without Conscience, John Dean, Richard Nixon's former White House counsel, drew on the Milgram experiment to explore how members of the Bush administration could be so complicit in the immoral policies of the so-called war on terror.
In a 2006 interview with Thom Hartmann, Dean explained:
"I looked at this because I was trying to understand, how do people who work at the CIA and know that they're part of a system that is torturing people in the Eastern European secret prisons -- and they're supporting that system, they're providing information or bringing it out of it -- how they do that every day?
"How do the people who work at NSA who were turning that huge electronic apparatus of surveillance on their neighbors and their friends, where's their conscience?
"And then I realized that this is a perfect example of the Milgram experiment at work. They're under authority figures. What they are doing is, they're haven't lost their conscience -- they have given their conscience to another agent, and so they feel very comfortable in doing it."
If Milgram's experiment showed a sort of moral death by a thousand cuts, then the otherwise unfathomable decisions, compromises and rationalizations that politicians make on a daily basis from their Washington offices seem easier to explain, if not to justify. After all, unlike the participants in Milgram's original study, who were paid $4.50 for their time (and notoriety), politicians in the White House or on Capitol Hill build their careers on decisions that can destroy human beings. Whether in Iraq or at Guantanamo, the suffering on the other side of those walls is real.
But Milgram has much to teach us, too, about what it takes to resist powerful governments and their destructive policies. It's not easy, and the stakes can be high.
Writing about war resisters in The Nation in 1970, Milgram noted, "Americans who are unwilling to kill for their country are thrown into jail. And our generation learns, as every generation has, that society rewards and punishes its members not in the degree to which each fulfills the dictates of individual conscience but in the degree to which the actions are perceived by authority to serve the needs of the larger social system. It has always been so."
But while Milgram so effectively demonstrated the challenge of defying authority, he also showed that subjects were far more likely to do it when they saw other people doing it. He wrote in "The Perils of Obedience," "The rebellious action of others severely undermines authority."
"In one variation, three teachers (two actors and a real subject) administered a test and shocks. When the two actors disobeyed the experimenter and refused to go beyond a certain shock level, 36 of 40 subjects joined their disobedient peers and refused as well."
Put in a political context, this is perhaps the most important lesson Milgram has to teach us. The best hope people have of resisting an oppressive system is to validate their experiences alongside other people. There is no more basic antidote to authoritarianism than support, solidarity and community.
Milgram wrote, "When an individual wishes to stand in opposition to authority, he does best to find support for his position from others in his group. The mutual support provided by men for each other is the strongest bulwark we have against the excesses of authority."