Question: Why do people do horrible things? Slave owners, and Nazis, any of the perpetrators
of history's atrocities. How do they so successfully dehumanize other people for so long? At a
smaller scale, how do bullies in the lunchroom manage to treat other kids with such cruelty
and then go home and pet their dog and call their grandma and say "happy birthday?" Most of what we've been studying so far has
focused on the individual. We've covered sub-fields of psychology like cognitive, personality,
and clinical psychology, which tend to address the phenomena contained within a single person's
mind. But there's also social psychology, which focuses on the power of the situation.
It examines how we think about, influence, and relate to one another in certain conditions.
And it's better equipped to answer this question about people doing horrible things. Social psychology can not only give us some
of the tools we need to understand why people behave brutally, it can also help us understand
why we sometimes act heroically. Like why did Jean Valjean reveal his true
identity to save some stranger from being tried in his place? And why did Nazi Oskar
Schindler risk his own hide to save over a thousand Jewish people? What made Darth Vader
throw the Emperor down that hole, even as he was being electrocuted? I can't say there are any easy answers about
humanity's greatness or its horribleness. Certainly, there aren't any that we can find
in the next ten minutes. But we can point ourselves in the right direction, and it starts
with social thinking. When we're trying to understand why people
act like villains or heroes, one of the things we're really asking is, "Did they do what
they did because of their personality? Or their situation?" Austrian psychologist Fritz
Heider began plumbing the depths of this question in the 1920s when he was developing what's
now known as attribution theory. This theory simply suggests that we can explain
someone's behavior by crediting either their stable, enduring traits - also known as their
disposition - or the situation at hand. And we tend to attribute people's behavior to
either one or the other. Sounds pretty simple, but it can be surprisingly hard to tell whether
someone's behavior is dispositional or situational. Say you see Bruno at a party and he's acting
like a wallflower all night. You might assume that he just has a shy personality. But maybe
he doesn't; maybe he'd ordinarily be re-enacting all the moves from Footloose at this party
but on this night, he had a twisted ankle or a headache or he'd just seen his ex with
somebody new - those are all situational explanations. Overestimating the forces of personality while
underestimating the power of the situation is called the Fundamental Attribution Error.
And as you can imagine, making this kind of error can really end up warping your opinion
of another person and lead to false snap judgments. This might not be such a big deal when it
comes to Bruno and his awesome dance moves, but according to one study of college students,
7 in 10 women report that men have misread their polite friendliness - which would be appropriate
for the situation - as a sexual come-on. We choose how we explain other people's behavior
every day, and what we choose to believe can have big consequences. For example, our political
views will likely be strongly influenced by whether we decide to attribute poverty or
homelessness to personal dispositions, like being lazy and looking for a hand-out, or social
circumstances like lack of education and opportunity. And these attitudes can, in turn, affect our
actions. Activists and politicians know this well and they can use it to their advantage
to persuade people in different ways. In the late 1970s and 80s, psychologist Richard
Petty and John Cacioppo developed a dual process theory of understanding how persuasion works.
The first part of their model is known as Central Route Persuasion, and it involves calling on
basic thinking and reasoning to convince people. This is what's at work when interested people
focus on the evidence and arguments at hand, and are persuaded by the actual content of
the message. So when you're watching a political debate, you might be persuaded by a candidate's
particular policies, positions or voting history. That is, the stuff they're actually sayin'. But we all know that persuasion involves more
than that. There is also Peripheral Route Persuasion at work. This influences people
by way of incidental cues, like a speaker's physical attractiveness or personal relatability. There's not a lot of hard thinking going on here,
it's more of a gut reaction. So you might decide to vote for a particular candidate because you think
they're cute or they're from your home town. Peripheral Route Persuasion happens more readily
when you're not paying a ton of attention, which is why billboards and television ads
can be scarily effective. So that's how politicians and advertisers and maybe bosses and teachers
and pushy friends try to change our behavior by changing our attitudes. But, it turns out that the reverse is true
too. Our attitudes can be affected by our behaviors. You might have heard the
phrase, "Fake it till you make it" - meaning, if you smile when you're actually sad, the
act of smiling may carry you through an attitude change until you actually feel better. Sometimes we can manipulate ourselves this
way, but it's also an incredibly effective method people use to persuade each other.
It generally works best in increments, through what psychologists call the foot-in-the-door
phenomenon. People tend to more readily comply with a big request after they've first agreed
to smaller, more innocuous requests. Like Darth Vader didn't just go from "Go get
'em Anakin," to Dark Lord overnight. He was slowly enticed to the dark side, by a series
of escalating actions and attitude changes. Do this favor for me, now run this errand,
now kill these Padawans. Now blow up a planet! What started as small actions went on to
become big ones, gradually transforming Vader's beliefs about himself and others. There's plenty of experimental evidence that
moral action really does strengthen moral convictions, just as immoral action strengthens
immoral attitudes. And there is perhaps no better example of this than the Stanford Prison
Experiment. Back in 1971 Stanford psych professor Philip
Zimbardo and his team put an ad in the local paper looking for volunteers to participate
in a 14 day experiment. After screening around 70 applicants, 24 male college students were
deemed physically and mentally fit enough to participate in the study. For their troubles
they'd each be given $15 a day. The participants didn't know the exact nature
of the experiment, just that it involved a fake prison situation. And with a coin flip,
half were randomly deemed prisoners and the other half guards. The guards were told that
it was the prisoners' behavior that was being studied. The prisoners weren't told much of
anything, aside from the fact that they had been arrested and taken to prison. Other than that, neither
group had many specific instructions. Zimbardo wanted to observe how each party
adapted to their roles, and so, on a quiet Sunday summer morning in Palo Alto, real cops
swooped in and arrested the prisoners in their homes under charges of robbery. They were
frisked, handcuffed, and read their rights. Back at the station, they were formally booked
and then blindfolded in a holding cell wearing only hospital gowns. The researchers had taken
great care to make sure that the setting was extremely realistic, which is one reason they
used real cops in the arrest before handing the prisoners over to the fake guards. And
it took no time at all for this role-playing to become really, really real. The initial trauma of the humiliation of the
arrest, the booking, strip-searching and waiting, immediately kicked off a loss of identity
in the prisoners. A few prisoners only made it through the first night before they became
too emotionally distressed and had to be released. Things only went downhill from there. Though
the guards could act any way they wanted as long as they didn't physically hurt anyone,
encounters quickly became cruel, hostile, and dehumanizing. Guards hurled insults and
commands, referred to the prisoners only by number, and put some of them in solitary confinement.
Prisoners started breaking down, others rebelled, and still others became passively resigned
as if they deserved to be treated so badly. Things got bad enough that the experiment
ended after only six days, causing relief in the fake prisoners, while interestingly
leaving some fake guards feeling angry. Luckily, everyone involved bounced back to
normal once out of the prison setting. All of those negative moods and abusive behaviors
were situational, and that fact reinforced the important concept that the power of a
given situation can easily override individual differences in personality. Although it would
never fly by today's ethical standards, Zimbardo's famous study remains influential today because
it sheds such a harsh light on the nature of power and corruption. And yet, people differ. Many people succumb
and become compliant in terrible situations, but not everyone does. Lots of people risked
their lives to hide Jewish people in World War II, help runaway slaves along the Underground
Railroad, keep Tutsi refugees safe during the Rwandan genocide, or generally refuse
to comply or participate in actions they didn't believe in. Some people can, and do, resist
turning to the dark side, even when it seems like everyone around them is going mad. But
the fact is, these people tend to be in the minority. So why? Why does it seem so easy to rationalize
a negative action or attitude and so hard to muster the positive ones? One partial explanation
comes from American social psychologist Leon Festinger's theory of cognitive dissonance. It's one of
the most important concepts in psychology. Festinger's theory begins with the notion
that we experience discomfort - or dissonance - when our thoughts, beliefs, or behaviors
are inconsistent with each other. Basically, we don't like to confuse ourselves. For example, if Bruno is generally considered
a peaceful person but finds himself suddenly punching his friend over a fender-bender, he's likely
experiencing some level of cognitive dissonance. So, by Festinger's thinking, Bruno might relieve
this tension by actually modifying his beliefs in order to match the actions he's already
committed, like telling himself, "Turns out, I'm not such a nice guy after all, maybe I'm
actually a bully." On the other hand, he might resolve his internal tension by changing how
he thinks about the situation. He might still think of himself as a peaceful person, but
realize that an unusual situation led to an unusual action, like, he'd had a bad day and
it was his mom's new car, or his friend was just really askin' for it. So, he can keep
being the ordinarily peaceful guy he was before. It's kind of an inverted fundamental attribution
error, if you think about it: attributing a person's actions mainly to the situation,
instead of his personality. The point is that this mismatch between what we do and who we
think we are induces tension - cognitive dissonance - and that we tend to want to resolve that
tension. That's part of what turns an Anakin into a Darth Vader, and then, if we're lucky,
back into an Anakin. Today you learned that social psychology studies
how people relate to each other. We discussed Fritz Heider's attribution theory, and fundamental
attribution error. You also learned how attitudes can affect actions, like through the dual-process
theory of persuasion, and also how behavior can change attitudes, like through the foot-in-the-door
phenomenon. The Stanford prison experiment illustrated how a situation can override individual
differences in personality, while Leon Festinger's theory of cognitive dissonance explained how we ease
the tension between conflicting thoughts and actions. Thank you for watching, especially to all of our
Subbable subscribers, who make Crash Course possible not just for themselves, but for everyone else. To find out how you can become a supporter, just go to subbable.com. This episode was written by Kathleen Yale,
edited by Blake de Pastino, and our consultant is Dr. Ranjit Bhagwat. Our director and editor
is Nicholas Jenkins, the script supervisor and sound designer is Michael Aranda, and
the graphics team is Thought Cafe.