The Lucifer Effect Summary and Review

by Philip Zimbardo

Has The Lucifer Effect by Philip Zimbardo been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

Do you know the biblical story of Lucifer? Lucifer, once God's favorite angel, challenged God's authority and was punished by being cast into hell with a cadre of other fallen angels. There, he turned into Satan, the personification of all things evil.

This is known as the Lucifer Effect; even angels can turn bad under the right – or wrong – circumstances.

It’s not only the Bible that tells stories of good gone bad. Almost every day, in war zones as well as tight-knit communities, we read about normal people doing evil things.

So how does this work? Does it just happen? This book summary plumbs the depths of the human psyche, searching for the mechanisms, situations and conditions that cause the Lucifer Effect. You’ll see that it doesn’t apply only to God's angels; mere mortals experience it as well.

In this summary of The Lucifer Effect by Philip Zimbardo, you’ll learn

  • what turned one US staff sergeant into an infamous sadist;
  • the secret behind the Jonestown cult massacre; and
  • how to resist evil and be a hero.

The Lucifer Effect Key Idea #1: Anybody and everybody can turn into a perpetrator of evil.

Look back on your life. Have you ever taken something that wasn’t yours when no one was watching? Most people have. Though not the greatest of evils, this kind of petty theft nonetheless says something about our willingness to do things we wouldn't normally do if the context or situation allows for it.

And yet, we still cling to the notion that some people are just born evil, while others are born saints. The truth, however, is that the line separating good from evil is exceedingly permeable.

Take, for example, the case of Ivan “Chip” Frederick, a former staff sergeant in the US Army. He was one of the guards at Abu Ghraib prison, which gained worldwide attention in 2003 for the abuse and torture of Iraqi prisoners held there.

Was Frederick a bad person before his tenure at Abu Ghraib? No – quite the contrary. He was a surprisingly normal, patriotic, baseball-loving young man from Virginia, whose psychological assessments yielded an average IQ and no signs of psychopathological traits whatsoever. But in the Abu Ghraib prison, he transformed into a cruel sadist.

What could cause this drastic change in behavior?

When people commit evil deeds, we often assume that those people are evil-natured. So when something like Abu Ghraib happens, we tend to point the finger at individuals. Traditional psychiatry takes the same view.

Psychiatrists and psychologists like to focus on what's called dispositional causes, i.e., inborn traits that cause our behavior. Genetics, character, pathologies – it’s believed that we carry these attributes with us.

In Frederick’s case, people cited dispositional causes – he was born a sadistic monster – to explain his actions. In reality, however, situational causes were far more responsible for his behavior than any character traits he was born with.

As you’ll discover in our next book summary, Frederick wasn’t born good or evil. The real causes of evil behavior lie somewhere else entirely.

The Lucifer Effect Key Idea #2: Our personalities aren’t consistent; they change depending on the situation.

Along with the common misperception that some people are born good and others evil, most people think that personalities are static – that they never change.

This view of human behavior is easily put to rest. For example, think about how you behave with your closest friends. Now consider how you behave around young children. Is it the same? Probably not. But why?

Human character is not fixed. Who you are and how you behave depends on the social contexts and circumstances in which you find yourself. This view is called the situational approach to understanding human behavior. It holds that what you do and who you are depend on the situation you’re in.

According to the situational approach, you are, literally, one person when you’re with your beloved and someone else when you’re talking to your boss.

This was demonstrated by the famous Milgram experiment, in which participants were told they were partaking in a memory-improvement study.

The subjects played the role of a “teacher” who was supposed to help the "learner" – played by an actor in another, concealed room – memorize word pairs. Each time the learner made a mistake, the teacher could punish them with an electric shock.

For every mistake, the intensity of the shock was increased, from mild pain at 15 volts to a life-threatening jolt of 450 volts. As the voltage and (apparent) physical pain increased, the learner started making more mistakes, screaming and refusing to answer.

Despite the learners’ seeming distress, most teachers continued increasing the shock level. If they were reluctant to do so, a third person, the “experimenter,” advised them to continue, telling the teachers that it was part of the rules and that he took responsibility for the test.

Of all the participants, 65 percent gave the learner the maximum (and life-threatening) 450 volts!

Were these people all monsters? No. But under the right conditions, they could be made to do monstrous things.

The Lucifer Effect Key Idea #3: The Stanford prison experiment turned ordinary people into cruel sadists.

The Milgram experiment is special in that the subjects couldn’t see their ostensible torture victims. But what if the people we harm aren’t hidden, but right in front of us?

In August 1971, the author conducted an experiment in which he put young male students in a mock prison at Stanford University and randomly assigned them to play the roles of guards and prisoners.

To rule out dispositional explanations, the experiment used only people with a history of unremarkable behavior. The 24 subjects were predominantly white and middle-class, had no psychological or medical impairments and no record of criminal behavior. Later personality tests likewise showed no significant deviations in personality traits.

Participants assigned to be guards were equipped with wooden batons, uniforms and mirrored sunglasses. The prisoners, who had been arrested by real police officers, were stripped naked upon arrival at the prison, deloused and given numbers that replaced their real names. Each prisoner was then confined to a tiny cell.

It didn’t take long for the experiment to get out of hand. First, a prisoner refused to follow the guards’ instructions, and the guards retaliated by attacking him with a fire extinguisher. From there, the guards invented ever more ways to degrade and punish the prisoners, such as forcing them to urinate and defecate in buckets in their cells, which the guards then refused to empty.

They removed the prisoners’ clothes and mattresses, forcing them to sleep on the cold floor. One prisoner on a hunger strike was confined to a dark closet and regularly shouted at.

As the days went on, the participants became far more violent and cruel than anyone had expected. After only six days, the author aborted the experiment.

Under normal circumstances, you wouldn't expect these students to suddenly transform into sadistic and abusive monsters. So what caused this drastic change in behavior? Our following book summary sets out to answer that very question.

The Lucifer Effect Key Idea #4: Obedience to authority and evil deeds are always linked.

We’ve seen how extraordinary circumstances can turn normal people into callous monsters. Now we’ll look at the specific factors that turn good people evil.

One critical aspect of this transformation is obedience to authority – be it embodied by people, institutions or sets of rules.

Consider the Milgram experiment. The person in a lab coat who was ostensibly leading the experiment was an authority figure; he seemed to be trustworthy and genuinely interested in understanding how a person’s memory can be improved.

The participants also signed a contract holding them to the rules. When participants hesitated to give further shocks to the learners, this authority figure would remind them that they had signed a contract, and thus had to continue with the experiment.

As the experiment showed, the majority chose obedience over empathy.

Authority figures, too, can start out good and then gradually turn evil. Followers, while sometimes confused by this change, rarely disobey.

One of the most tragic examples of this is the Jonestown Massacre. Along with his more than 900 followers, the charismatic religious leader Jim Jones founded his very own utopia in the jungles of Guyana to escape consumerism and practice solidarity and tolerance.

But Jones gradually transformed from a caring father figure into a tyrannical egomaniac, who instituted forced labor and kept armed guards.

When Congressman Leo Ryan and a group of reporters came to inspect the compound, Ryan and several members of his party were shot and killed. This tragic event then led to another that shocked the world.

Jones, probably thinking this would be the end of his utopia, gave a long speech that convinced the majority of his followers to poison themselves and their children. They blindly obeyed, ending it all in mass suicide.

The Lucifer Effect Key Idea #5: If we aren't held responsible for our actions, then we’re easily seduced by the temptations of evil.

Evil deeds aren’t only caused by a deference to authority. Sometimes, a lack of personal responsibility can create the potential for evil.

Again, we can look to the Milgram experiment for some insight. The participants who shocked the learner in the other room were told that they wouldn’t be held liable for their actions. Instead, the scientist leading the experiment would take full responsibility for whatever happened.

In cases like these, antisocial behavior is magnified by something called deindividuation, whereby the person carrying out evil actions becomes effectively anonymous. In essence, people are more likely to give in to the temptation to be evil when they believe that no one will recognize them.

There are two ways to create the feeling of deindividuation, the first being to disguise one's outer appearance.

Think back to the Stanford prison experiment. The guards were provided with uniforms and mirrored sunglasses that would prevent eye contact; this decreased their sense of personal accountability.

The other way is to act in a setting where the risk of being recognized is slim.

This is clearly demonstrated by a field experiment in which the author placed an abandoned car in a neglected area of the Bronx in New York City. Within a few hours, vandals emerged and began stripping the car, emptying the trunk and stealing the battery. Once every valuable item had been taken, people began to demolish the car.

The author placed another abandoned car in a Palo Alto, California, neighborhood with a strong sense of community. There, no one touched the car; in fact, three concerned neighbors called the police.

So why the drastic difference between these two field experiments? Well, the Bronx provided a setting anonymous enough for deindividuation to occur; the Palo Alto neighborhood did not.

The Lucifer Effect Key Idea #6: We are more liable to do evil to others if we think of them as less than human.

Most people in the world consider themselves to be morally upright. And yet, history abounds with examples of people inflicting cruel inhumanities upon their fellow human beings. How is this possible?

One crucial justification for cruel behavior toward other human beings is called dehumanization, i.e., the process of ceasing to consider someone fully human.

This psychological process was revealed in a study by Stanford psychologist Albert Bandura. In the study, volunteer students were told to supervise another group of students and punish them according to the quality of the decisions that group made: the worse the decision, the harsher the punishment.

Before the students began judging the other groups' decisions, they were made to overhear a conversation among the researchers about the groups of decision-makers. One group was described as "an animalistic, rotten bunch," and the other as "perceptive" and "understanding."

Punishments varied depending on how the groups were described by the researchers. The students punished the "animalistic" – i.e., dehumanized – group much more severely than the other, more humanized group.

Understanding dehumanization is critical in understanding the mechanisms of racism, prejudice and discrimination. When others are stigmatized as tainted and inhuman, they are considered unworthy of moral considerations and thus become targets of cruelty.

We see this in numerous historical examples, such as the “Rape of Nanking,” during the Japanese invasion of China. During the invasion, Japanese soldiers massacred Chinese civilians. One Japanese general explained that they did so because they thought of the Chinese as things, not people like themselves.

The Lucifer Effect Key Idea #7: Euphemistic language and powerful ideology give us a way to justify our evil actions.

We’ve seen how obedience, deindividuation and dehumanization can lead us to commit evil acts. But there is yet another enabler of evil – namely, our ability to wrap evil actions in words that make them sound as if we’re actually doing a good deed.

In social psychology experiments where human behavior is tested, there is always a kind of cover story, that is, a desirable goal that seemingly justifies cruel and immoral action.

For example, in the Milgram experiment, participants were told, and indeed believed, that they were helping to advance science – even as they seemed to harm or even kill fellow human beings.

The cover story in the Milgram experiment was that the shocks would help scientists to gather important information about how to improve a person’s memory. This gave participants an easy avenue to justify their actions.

What is called the “cover story” in social psychology could be called ideology in real life. With the right ideological frame, perpetrators of evil can view their deeds with a lens through which their actions appear good or even honorable.

Consider the US invasion of Iraq and the subsequent torture committed in the Abu Ghraib prison. The big ideological cover story was that there was a serious threat to national security that made the War on Terror necessary. The apparent imminence of this threat allowed the Bush administration to reclassify torture techniques in a way that would make them legal, all to provide the military with the information they supposedly needed to preserve national security.

So, what first appeared to be the mistakes of a few bad apples was actually part of an ideology – that torture would make the United States more secure. With ideology as their justification, soldiers felt safe and right in torturing prisoners.

With so many enablers of evil out there, you might be wondering what it takes to be a good person. Our final book summary gives some guidance.

The Lucifer Effect Key Idea #8: You can still resist evil and act morally and heroically.

As you now know, the situations we find ourselves in influence our capacity for evil. But no one wants to be evil. So what do you do when you want to resist the forces of evil?

First and foremost, you should always consider yourself responsible for your own decisions and combat all the possible excuses that pop into your mind. Just because someone can’t see through your mirrored sunglasses doesn’t mean that you are suddenly less responsible for your actions.

Second, if you ever feel that you are obeying an unjust authority, you should simply stop doing so. Looking back to the Milgram experiment, not all participants allowed themselves to engage in such cruelty. Some defied authority and stopped the experiment. You are capable of defiance, too.  

Finally, question the stories and ideologies that justify evil actions. For example, if the invasion of a foreign country is framed as “bringing freedom and democracy to the people” or as a “War on Terrorism,” take a moment to consider whether the actions actually advance the stated mission.

It’s one thing to resist evil, but it’s another to actively work for good, i.e., to be a hero. But what makes a hero?

There are two main features that define heroes. First, they take action while everyone else is passive. Second, they put others before themselves.

Take Wesley Autrey, New York’s aptly named “Subway Hero,” as inspiration. When he saw a young man having a seizure fall onto the subway tracks and realized there was no time to lift him back up, Autrey jumped down and held him in the space between the rails, saving his life. While others simply looked on, he took action, risking his own life in the process.

Ultimately, we all have the potential to become either a monster or a hero. So focus on being a hero in waiting, and when you’re confronted with a difficult situation, you’ll act accordingly.

Final summary

The key message in this book:

People aren’t born evil. Rather, they’re turned evil by the situations they find themselves in. Indeed, our strict moral standards are much more corruptible than we think. Ultimately, it’s up to us to decide whether we’ll engage in evil or, instead, act like heroes.

Suggested further reading: Snakes in Suits by Paul Babiak and Robert D. Hare

Snakes in Suits examines what happens when a psychopath doesn’t wind up in jail, but instead puts on a suit and gets a job. The book outlines the tactics these predators use, how they damage companies and how you can protect yourself.