I'm back from a two-week holiday in Ischia (Italy). I like to take a book or two with me because I rarely find the leisure (and the concentration) to read anything complex during my normal life. This time, it was The Lucifer Effect by Philip Zimbardo. If you haven't heard of him, you may have heard of the Stanford Prison Experiment (SPE). If you haven't heard of that either, here is a short summary: take 20 random, normal people (i.e., people like you, me, and your neighbor), put them into a situation where they can get away with anything, and watch them turn into human nightmares within a few days.
The book explains in great detail how normal people turn evil. Evil in the sense that they stop acting kindly and instead start to humiliate, torture, even kill other people. The hard part of the book is understanding that these people aren't evil to begin with. They are just ordinary people. For example, most suicide bombers are well educated and come from intact families. Still, it's exactly these people who are most vulnerable to the mechanisms which turn a nice guy into Darth Vader.
In this post, I'll point out a few of the passages in the book which impressed me most. On pages 167f, Jerry-5486 (one of the prisoners in the SPE) explains how he rationalized his own evil actions against a fellow prisoner by thinking, "It's only a game, and I know it and I can endure it easy enough, and they can't bother my mind, so I'll go through the actions." Doing this, he could completely ignore how much his actions hurt the people around him, even people he thought of as friends. Instead of supporting those he liked, he submitted to the cruel commands of the guards. Because of how our brains work in these circumstances, he only noticed what he had done afterwards.
On page 205, there is a great quote:
Life is the art of being well-deceived; and in order that the
deception may succeed it must be habitual and uninterrupted.
-- William Hazlitt, "On Pedantry", The Round Table, 1817
as is this one on page 208:
Wherever anyone is against his will, that is to him a prison
-- Epictetus, Discourses, second century A.D.
On page 210ff, Zimbardo explains the different factors which led to the humiliation and torture in the SPE: the situation, role expectations, responsibility (or the lack thereof), anonymity and deindividuation, cognitive dissonance, and social approval. On page 220, he explains how cognitive dissonance forces us to rationalize our actions when we hurt others:
Oddly enough, the dissonance effect becomes greater as the justification for such behavior decreases, for instance, when a repugnant action is carried out for little money, without threat, and with only minimal sufficient justification or inadequate rationale provided for the action. Dissonance mounts, and the attempts to reduce it are greatest, when the person has a sense of free will or when she or he does not notice or fully appreciate the situational pressures urging enactment of the discrepant action.
Instead of using our brain to stop what we do, we use it to explain the pain away. The less reason someone has to be cruel, the stronger the desire to give the cruel actions a reason that makes them "look good". Ever wondered why people come up with such unreal excuses when they are caught? That's why.
In our daily lives, we feel that our reason is in control when in fact it's the situation that shapes how we see the world and thus forces our hand without our reason even noticing. This is what happens when the government calls the population to arms because of a "threat to national security". Ironically, it has always been those who yelled who were the actual threat (not those they accused). Unfortunately, situational forces, like our shared desire for social approval, drive us to believe the yellers even when five minutes of rational thought would show how silly their words are and how false their motives must be (p. 226ff). This is what led to the slaughter of the Jews in World War II, the Challenger disaster, the genocide in Rwanda, and the abuses at Abu Ghraib (which was one of the more harmless incidents in US prisons in the area). It's always the same mechanisms which lead to these results.
The basics of these mechanisms are listed on p. 230: our need "for consistency and rationality", our need "to know and to understand our environment and our relationship to it", and "our need for stimulation". How we sense time also has a great influence on our behavior (p. 243): Do we believe in the past? The future? Do we have reason to plan, or do we sit back in depression because we feel all our plans and hopes are in vain?
These same mechanisms are used by the military to turn normal people (who still have an inhibition against killing) into people who can kill on command. A visitor to the SPE website explains how the drill instructors used the usual old, irrational lie to punish the group for a mistake: "if you guys would have moved faster, we wouldn't be doing this for hours". Of course, the only reason the group is doing "this" at all is the drill instructor (p. 256).
Tests show that group pressure is a great tool for creating obedience. Imagine you had to judge the length of a line on a card against three reference cards. Doing the test alone, you'd get it right more than 99% of the time. But what if you were in a group with several people who chose the wrong answer on purpose? Could you resist the urge to comply with the group, or would you submit? When the experiment was done, 70% of 123 test persons caved and agreed with the group that two obviously different lines had the same length (p. 263). Modern technology allows us to look into the brain and helps us understand what happens. If you deviate from the group's consensus, the images of your brain will show that "resistance creates an emotional burden for those who maintain their independence -- autonomy comes at a psychic cost" (p. 265). The images also show that often, you don't even notice the conflict. The brain adjusts its perception of the world to what the group wants to believe before reason is even asked for its opinion.
All of that leads to Milgram's experiment, where unsuspecting test persons were led by the experimenter to apply deadly electrical shocks to another person. Of course, the other person was a confederate and the shocks were never actually applied, but the test persons didn't know that. 66% of the test persons would go all the way, even when the "victim" would scream in pain and beg for mercy. Most would even continue after the "victim" stopped responding. Because the results were so shocking, the experiment was repeated with different people in different countries. By varying the setting, Milgram could move the compliance rate (the share of people who would go through the whole hell until the bitter end) anywhere from 10% to 90%. It didn't matter who was tested but how. From the results of his various experiments, Milgram compiled a list of ten methods to turn anyone into a torturer (p. 273f):
- Create a sense of obligation (for example, make a contract or say "you have to protect your country/family/whatever").
- Give the participant a positive role (teacher, protector, soldier).
- Present a set of rules that seem to make sense at first glance. It doesn't matter if they are really consistent since you can later change them arbitrarily.
- Make the job sound harmless. Don't say "hurt the prisoner", say "protect the country".
- Make sure no one really feels responsible for what happens, or that everyone can point at someone else to blame when something bad happens. In Milgram's experiment, a lot of people refused to continue until the experimenter told them, "I'll accept full responsibility if the other person is harmed".
- Start with an act that points in the general direction but doesn't seem too bad ("We'll just lock these people up to protect society").
- Work with small, seemingly insignificant steps ("that's not much different from what I did yesterday"). Humans are great at detecting big changes, but they suck at small ones.
- Gradually change the nature of the authority figure (leader, boss) from "just and reasonable" to "unjust, demanding, even irrational". If you do this in small steps, the subject won't notice and will still believe in the "first impression" even after the transformation is complete. At that stage, the leader can ask for anything and (s)he will get it.
- Make exiting hard. Allow people to complain, but insist that they still follow the orders to the letter ("Do you want to ruin everything we achieved?").
- Offer an ideology, an explanation or a big lie to justify the use of any means to achieve the seemingly desirable ends.
If you get everything right, 90% of all people will follow you to hell and beyond. With these rules in mind, you can easily find parallels to the many dictatorships around the planet. That's how and why they work, and why it is so hard to break them down from within.
There are also references to anonymity experiments with schoolchildren during Halloween (p. 301) which show what happens when people can be aggressive anonymously: they act more aggressively than when they can be identified, even when their aggressive behavior is not in their best interest.
Being part of a group apparently causes us to reduce our self-awareness (a process called deindividuation, p. 305) and to suppress our reason (our "good" side) in favor of our "Dionysian nature", where instincts, lust, and desire rule. This is what creates the strong group cohesion, or "peer pressure", which allows us to build a civilization despite our egos. But if you don't know about these effects, they can get out of hand, driving the group ever deeper into chaos or evil.
These effects are amplified when we see other people as inferior, as non-human, as animals or objects. In the end, this leads to a state of moral disengagement (p. 310). In this state, the members of the group can still understand that their actions are wrong, but they no longer care. This state allows people to kill, torture, rape.
We don't even have to go that far. People are less likely to help when they are in a hurry (p. 316). The same effect is at work when a large group of bystanders watches someone suffer and no one intervenes to help. Or when a sane person is taken to a mental hospital (p. 321): staff members of a hospital played the role of patients for a short period of time and were very surprised by how they were treated by their colleagues on duty. In the end, lawyers had to demand their release, or they would still be in the ward.
Or Abu Ghraib. In chapter 14, Zimbardo dissects the situation and the people behind the media hype we all know from the images in the news shows. An apparently normal, mentally healthy soldier is thrown into a situation where a life doesn't count for much, there is no support from the outside or from superiors, the place is under constant attack by the locals, the local forces who are supposed to help instead smuggle weapons into the prisoners' cells for money, and your superiors order you to torture to get intelligence from the prisoners. The icing on the cake: you get the night shift, which means sleep deprivation. How long could you withstand such stress? Staff Sergeant Chip Frederick (the guy who got all the blame for the incident) made it more than two months until he caved (p. 343).
From the official news, you might get the idea that he was convicted for torturing the prisoners. I find that hard to believe. Others have killed prisoners and are still walking free (see "The Iceman"). From my point of view, he was convicted for taking trophy photos. Other teams with similar records were instructed to destroy the evidence before returning home. The kind reader is welcome to draw his or her own conclusions.
This pattern of taking trophy photos comes up again and again when people are abused. It happened when black men and women were lynched in the US; German soldiers took them after committing atrocities against Polish Jews and Russians; the list is endless. But there is a difference between these photos and real trophy photos (taken when a big-game hunter had made his shot). In the original photos, the hunters don't laugh and cheer. They appear "rather grim, rarely are they smiling: they are victors in a battle against formidable adversaries." (p. 364)
The author goes on to explain how the Bush administration actively helped to produce the environment which led to abuses like those in Abu Ghraib and in many other places in Afghanistan and Iraq, in Guantanamo, and probably elsewhere, too. The last figure I know of is 600 similar incidents. An interesting, if controversial, read.
Chapter 16 contains the most important part of the book: how to fight evil. Not on a big scale but on a personal one; how to notice and withstand the forces which try to corrupt you. On page 451ff, you can find a list of ten steps to "resist unwanted influences". I especially like "Reject simple solutions as quick fixes for complex personal or social problems." Print that out and read it twice every morning.
Even if you don't read the rest of the book, look at these few pages. They alone are worth the buy.
In the last part of the book, Zimbardo takes a look at heroism. Contrary to popular belief, heroes and heroines are ordinary people who just did the right thing at the right time, at great personal risk, because they believed they had to do it (p. 457). Or as the author puts it (p. 466):
Heroism can be defined as having four key features: (a) it must be engaged in voluntarily; (b) it must involve a risk or potential sacrifice, such as the threat of death, an immediate threat to physical integrity, a long-term threat to health, or the potential for serious degradation of one's quality of life; (c) it must be conducted in service to one or more other people or the community as a whole; and (d) it must be without secondary, extrinsic gain anticipated at the time of the act.
Recommendation: Must Have.