A Mentalist Explains The Milgram Experiment and Social Compliance
I recently rewatched Derren Brown’s The Push on Netflix.
Derren is a master at using suggestion to get people to comply with what he wants them to do. He is also a master at putting people (and the audience watching them) in situations whose moral is that we don't have to be controlled by the suggestions we face every day.
And in The Push, Derren Brown uses social engineering to see if an average, everyday person would commit murder by pushing someone off a building.
But this is far from the first time someone has wondered how far people will go to comply with the social pressures around them. And in this article, I will tell you about four incredible examples of how social compliance can manipulate your decisions and what you can learn from them.
1. The Power Of A Uniform
Let’s say you’re walking down the street. A man is standing on the corner. There’s nothing overly striking about him: a T-shirt, jeans, maybe a baseball cap. As people walk by, he gives them orders.
“Hey, pick up that trash.”
“You! Go stand over there.”
“Give that guy a dime.”
“Move away from that bus stop.”
How many of you would do what he says? Or even pay attention? Not many, I’d wager. But let’s change the game a little bit.
What if the man on the street is in a security officer’s uniform rather than regular, run-of-the-mill street clothes? What then? How many people do you think would obey his commands?
In a 1974 field experiment, psychologist Leonard Bickman discovered that roughly twice as many people complied with arbitrary orders when the person giving them wore an authoritative uniform (in his study, a security guard's). People kept complying even after the fake guard walked away. Later studies found that even wearing a nice suit increased compliance.
There are two lessons here. The first is to dress the part. The more authoritative you look, the more likely people will listen to you.
The second is from the other side: just because someone looks like they're in a position of authority doesn't mean they're as legitimate as they seem. Check credentials, experience, or anything else that signals ACTUAL credibility.
2. The Lines Of Conformity
Let’s try a different situation. You’re standing in a room with a small group of strangers, and you’re shown this:
The group is asked to match the lone line on the left card with the line of the same length on the right card. Obviously, the answer is C, right? Everyone agrees, shakes hands, and heads to the pub.
Now imagine this instead: you’re standing in a small group of strangers, looking at the two images. Clearly, the match is line C. But then one of the group members pipes up and says it’s B. And then another agrees with him. And another, and another. What do you say then? You stand up for yourself, right?
Nope. Well, at least not all of you. In Solomon Asch's 1951 experiment, participants went along with the group (actually the experimenter's confederates) on roughly a third of the trials where the group knowingly gave the wrong answer. A third. A third of the time, people agreed to be wrong because of the social pressure of the group.
I’m sure some of you think, “Alright, magic-man, that’s just some guy yelling random orders or lines on a card. That’s not a big deal. It wouldn’t be the same when it really matters.” Let’s up the ante in these little mind games and see what happens.
3. A Room Full Of Smoke
You’re sitting in a room by yourself, filling out a questionnaire. Smoke starts seeping into the room from under a door. What do you do?
The obvious answer is you say something. And statistically, you do. In studies examining the bystander effect and diffusion of responsibility, 75% of people who were alone in the room reported the smoke before the experiment ended, on average within around two minutes.
But let’s change the game. What happens if the room is filled with people (accomplices with the experimenter) who say and do nothing about the smoke as if it’s completely normal?
Nine out of ten subjects (90%) continued to fill out the questionnaire while rubbing their eyes and waving the smoke away from their faces. Nine out of ten people ignored a surefire (pun intended) sign that they could be in danger, all because the group around them ignored it. If you don't believe it, you can watch it for yourself.
4. The Milgram Experiment
You might think, “Jeff, this is very interesting, but these experiments are based on relatively inconsequential stuff. How bad could it be when it really counts?”
Enter the Milgram Experiment.
Imagine you and another participant sitting in a waiting room after volunteering to participate in a psychological study about learning.
You chat for a while. You tell them about your job, how you just went grocery shopping, and how you love coffee. They tell you about their family, what they had for lunch, and how they just got back from the doctor after a check-up on their weak heart. You know, friendly conversation. Eventually, an experimenter dressed smartly in a white lab coat and carrying a clipboard walks through the door and randomly assigns both of you a role.
You are labelled a “Teacher,” and your newfound friend is the “Learner.” You see him taken into a room, strapped into a chair, and fitted with what looks like electrodes. The door is closed, and you’re seated in front of this:
Above each of the switches was a label indicating voltage, from 15 Volts (labelled "Slight Shock") through 375 Volts (labelled "Danger: Severe Shock") up to 450 Volts (simply labelled "XXX").
Your job as the "Teacher" is to ask the "Learner" questions regarding a memorized list of paired words. If the "Learner" gets the question right, great. If not, the first 15-Volt switch gets thrown. Every time a mistake is made, the next switch up the line gets flipped.
The “Learner” gets a lot of questions wrong. You can hear them scream in pain as they get larger and larger shocks. You remember them saying they have a heart condition. You remember their family. You look to the experimenter, who has been standing in the corner recording the experiment the entire time, and voice your concerns. He responds simply and calmly with:
“Please continue.”
Or “The experiment requires you to continue.”
Or “It is absolutely essential that you continue.”
Or “You have no other choice but to continue.”
Eventually, the screams from the "Learner" stop. They don't even answer the questions. Silence. Since silence counts as an incorrect answer, the next switch must be flipped. And the next. And the next. All urged on by the man in the lab coat.
Sounds horrifying, right? You might be thinking, “There’s no way I’d continue. I’d leave. I couldn’t do that to someone.”
As you probably already figured out, the “Learner” was an actor. There was no shock. There was no interest in learning. All the researchers wanted to know was how far a person would go before they stopped flipping the switches. What they discovered was completely unexpected.
65%. Roughly two-thirds. Two-thirds of the participants flipped every single switch. In some conditions, this number was even higher. This experiment, performed by Stanley Milgram in the 1960s, has become legendary in social psychology.
Again, you might think, “This is in a lab…how does this translate to real life? Does it even translate to the real world at all?”
Well, Stanley Milgram got the idea for the experiment while he (and the rest of the world) was watching the trial of Otto Adolf Eichmann. Eichmann was being tried (and later convicted and executed) for war crimes he committed as a Nazi officer during the Second World War. His defence? He was just following the orders of his superiors.
I'll admit it. This has been a depressing blog. You could read all of this as confirmation that humanity just sucks. That we're sheep whose choices can be twisted and turned. That we, as humans, are capable of horrible things, all under the excuse of "someone told me to do it." But here's the thing: you, reading this, now know that. After seeing all this, you know that you always have a choice, that you can take accountability for your actions and make your own decisions, regardless of what anyone else says. And knowing that should make you feel pretty good.