You Are Not So Smart Summary and Review

by David McRaney

Has You Are Not So Smart by David McRaney been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

A common misconception is that we humans are rational, logical beings who see the world as it really is. But, in fact, a lot of research suggests that we have no idea why we act or think the way we do.

This book summary is a celebration of self-delusion and irrational thinking, and it will help you better understand yourself and everyone around you.

Using the latest psychology research and plenty of engrossing anecdotes, the book explains what strategies we use to deceive ourselves, and the consequences our self-deception has on us and others.

Think you’re an open-minded person? In this book summary, you’re likely to discover that you’re not, and that you’re actually only open to confirmations of what you already believe.

You’ll also discover that probably everyone you know thinks they’re more popular than you.

You’ll find out why, counterintuitively, you’re actually better off if your car breaks down on a quiet country road than on a busy city street.

And, perhaps most importantly, you’ll learn why you should never strip your clothes off for someone who calls you on the phone claiming to be a police officer.

You Are Not So Smart Key Idea #1: We delude ourselves that random situations have meaning or that we can control them.

It would be nice to believe that we humans are rational beings who see the world as it really is. But, in fact, nothing could be further from the truth.

Rather than being mere objective observers of the world around us, we constantly delude ourselves in order to make sense of coincidences and other random happenings.

We do this by applying order to the random events and chaotic coincidences that occur around us.

For ancient man, the ability to recognize patterns was essential to his survival: it enabled him to find food, and to distinguish friends from enemies and predators from potential prey.

As a result, we’ve evolved into beings always on the lookout for patterns in the “noise” around us. We simply aren’t capable of switching off our pattern-recognition ability – which explains why we often see patterns where none exist.

Have you ever, say, marveled at how a particular number – for example, seven – keeps popping up during your day? Or what if you find out that your blind date’s mother shares a name with your mother – does this make you think, even briefly, that you’re meant for each other?

The fact is, the number seven is as common as any other number, and countless other mothers share your mother’s name. These are merely coincidences, but we see what we want to see, and we want to see meaning.

What’s more, we not only find meaning in random situations, but we also trick ourselves into believing we can control them.

For example, although the numbers that come up when a die is rolled are completely random, studies have shown that the more powerful a person feels, the more they believe they can predict the next roll of the die.

Also, research indicates that most people engage in at least some “magical thinking,” like crossing their fingers when they wish for something particular to happen. Here, too, we believe we can control the uncontrollable.

You Are Not So Smart Key Idea #2: Without realizing it, we often make up stories to explain our decisions and feelings.

Do you have a favorite song or movie? Think about it for a moment and try to explain why you like it so much.

Even though it’s difficult to do, odds are you still came up with some kind of explanation.

The only problem is that your explanation is probably totally fictional. As research has shown, when we think we’re explaining our emotions and actions, we’re often instead fabricating reasons for our feelings and decisions.

That’s because we aren’t actually aware of our thought processes: we get, at best, the slightest glimpse of them at work.

In the same way that our eyes have a blind spot that our brain automatically compensates for, our mind fills in all the gaps of our reasoning and our memory, too.

If you recall an event from your past, you’ll actually remember only some parts and details of it. But, very quickly, the brain begins to provide details, most of them fictional, to fill in the rest of the picture just so we can obtain a sense of continuity in our memories. This explains why, if we tell a story again and again, it will differ from telling to telling, sometimes even contradicting earlier versions.

We don’t recognize these fictitious additions precisely because we’re unaware of our thought processes.

For example, in one study set in a department store, nylon stockings were arranged in a row and subjects were asked to rate their quality just by looking at them. Most of the subjects chose the stocking positioned farthest to the right – even though the stockings were identical.

When asked to explain their choice, these subjects commented on the perceived texture of the stocking, but no one mentioned its position.

Moreover, even when asked outright whether the position had any effect on their decision, the subjects responded that they were certain it didn’t.

As this demonstrates, being unaware of how we come to our decisions doesn’t seem to cause us any problems. We simply get creative: we make up justifications and move on.

You Are Not So Smart Key Idea #3: We seek confirmation for our beliefs and ignore whatever might challenge them.

There’s a good chance that most of us think our opinions have developed over years of rational analysis.

But, in fact, our opinions are not objective or rational at all, because we only pay attention to information that confirms what we already believe. In short, we’re confirmation biased.

Studies show that people usually spend more time reading an essay if its argument matches their own opinions, suggesting that, often, we don’t really read to learn new information but to validate our existing views.

Also, an analysis of Amazon customers’ buying trends during the 2008 US presidential election revealed that those who purchased books portraying Barack Obama positively were already staunch supporters of his campaign. In other words, people were seeking confirmation, not information.

Even our memory is confirmation biased: we recall those events which support our beliefs and conveniently forget any which contradict them.

Consider the following study: two groups were presented with a day-in-the-life story about a fictional character called Jane. In the story, Jane acts in ways which demonstrate she could be both introverted and extroverted.

After a few days’ break, one group was asked if they thought Jane would make a good librarian; the other, if she would make a good real estate agent.

The first group remembered Jane as an introvert and assured the researcher that Jane would make an excellent librarian. The other group remembered her as an extrovert, and guaranteed that she would be a great real estate agent.

Here, confirmation bias led the subjects to remember only those parts of the story that supported the trait they’d been asked about, while forgetting the potentially contradictory parts that didn’t help them arrive at their answer.

What’s more, after coming to this conclusion, they stuck to their initial belief and, when asked, insisted that Jane wasn’t suited for the other job.

Without being aware of it, we’re always seeking out information which confirms our existing beliefs, while conveniently avoiding any evidence or opinions that contradict them.

You Are Not So Smart Key Idea #4: We go to great lengths and use different strategies to maintain our self-esteem.

Without our self-esteem, having confidence in ourselves and getting through the day would be extremely difficult. For that reason, we tend to nurture it every chance we get.

One strategy we employ to maintain the resilience of our self-esteem is to accept sole credit for our successes and blame external factors for our failures. Research has shown that this is the case in a variety of situations, ranging from board games to final exams.

Another strategy is to pay close attention to the successes and failures of others in order to judge our own worth and boost our self-esteem.

For example, studies have shown that, on average, every person you know thinks they’re more popular than you, and vice versa. Also, most of us believe that we’re better at our jobs than those we work with, that we act more ethically than our friends, that we’re more intelligent than our peers, and so on.

The final strategy to maintain our self-esteem is called self-handicapping. It involves coming up with excuses for an imagined future failure to avoid the risk of feeling bad about ourselves when that failure becomes a reality. Indeed, we even create the very conditions for ourselves to fail.

This phenomenon was the focus of a study in which subjects were given a difficult test and told afterward that they’d scored perfectly, whether it was true or not.

The twist was that the subjects were then offered the chance to take either a performance-inhibiting or a performance-enhancing drug before a second test.

Most people chose the drug they thought was performance-inhibiting (actually, a placebo), demonstrating that most people wanted to protect their newly gained self-esteem by creating conditions ahead of time that could excuse a potential failure.

It appears that, to maintain our self-esteem, we find ways to inflate what we like most about ourselves and, on top of that, create future conditions for failure so it won’t be our fault if we don’t succeed.

You Are Not So Smart Key Idea #5: Our unconscious mind is a powerful force, yet we are unaware of its effect on us.

For most of us, the unconscious mind is a strange and primal aspect of the human experience, essential to things like breathing, swallowing and blinking. But there’s actually a lot more to it.

Indeed, our unconscious mind affects us constantly: it receives input from our surroundings all the time and causes us to think and behave in certain ways.

For example, consider a study in which subjects were asked to recall a time when they’d done something they considered sinful and to describe how it made them feel.

Half the participants were then given the opportunity to wash their hands. Finally, all of them were asked if they would be willing to help out a graduate student by participating in another study – for no pay.

While those who’d washed their hands agreed to help 41 percent of the time, the participants who hadn’t washed their hands agreed 74 percent of the time. The researchers concluded that those who’d washed their hands had unconsciously also washed away their guilt. In other words, their unconscious minds had connected their hand-washing with notions like purity, so they didn’t feel the need to further atone for their “sins” by helping the graduate student.

As this suggests, we aren’t aware of the powerful influence of our unconscious mind.

In another study, people were instructed to play a game in which they could earn money. Prior to this, some participants were exposed to business-related images, others to more “neutral” pictures. During the game, it turned out the “business” group was more likely to try to keep as much money as possible, while the neutral group divided the money more evenly among their fellow players.

Afterward, the participants talked at length about what was fair and unfair behavior in the game, their impressions of the other players and how these had influenced the decisions they made. But not one of them mentioned the images they were shown beforehand. They simply weren’t aware of how their unconscious mind had affected their behavior.

You Are Not So Smart Key Idea #6: We think we’re more capable, more special and more attention-grabbing than we really are.

At some point, most of us have thought of ourselves as especially skilled at something, or as particularly special people. In fact, we tend to see ourselves as more special and more skilled than we actually are.

As research has repeatedly shown, if we succeed in achieving something, we’ll tell everyone, but if we fail, we do our best to forget it. And when we compare our accomplishments and skills with those of others, we have a tendency to highlight the positive and downplay the negative.

Furthermore, most of us don’t see ourselves as “average people,” though we do seem to think of others in that way. For instance, we view the ordinary events of our daily lives as more significant than the ordinary events of other people’s lives.

When it comes to evaluating our abilities, this egocentric thinking makes it hard for us to see ourselves as just average. Indeed, the very idea of being average is a huge challenge to our self-esteem, so we search tirelessly for ways to affirm our uniqueness and then end up wildly overestimating how special we are.

We also delude ourselves by thinking that we draw more attention to ourselves than we actually do.

Consider the following study: participants played a competitive video game and were then asked to rate, first, how much attention their teammates and opponents had paid to their performance and, second, how much attention they themselves had paid to the others.

All the participants paid far more attention to their own performance than to that of the others. Nevertheless, each of them felt that the others were keeping an eye on how well they were playing the game.

Even though it can sometimes be necessary for our self-esteem to think of ourselves as more special or skilled than everyone else, in reality, we’re often neither as smart nor as special as we believe.

You Are Not So Smart Key Idea #7: We aren’t as helpful or fair to the people around us as we think.

Imagine you see a car broken down at the side of the road. Would you pull over and lend a hand or keep on driving, telling yourself that someone else will be along soon enough?

If you’re like everyone else, the more cars or people passing by, the less likely you’ll be to stop and help out. This is called the bystander effect, and it refers to the fact that our inclination to help others diminishes if there are witnesses around.

A tragic illustration of the phenomenon is the story of Kitty Genovese, the victim of a thirty-minute-long knife attack in a New York parking lot outside her apartment, during which 38 witnesses reportedly ignored her cries for help.

Though the story has since been criticized as an instance of media sensationalism, it nevertheless increased psychologists’ interest in the bystander effect. Their subsequent research demonstrated that the more witnesses there are to a person in distress, the lower the chances that any of them will help out. Counterintuitively, this means that if your car breaks down, you’ll be more likely to receive help if it happens on a country road than on a busy city street.

And we’re not only less helpful than we think; we also have a tendency to be judgmental.

As research shows, we base our first impressions of people on prejudices and generalizations, and often leap to conclusions about a person based solely on how closely that person fits a particular stereotype.

Just think of those times when you’ve seen a child crying and screaming in a supermarket while the parent carries on with the grocery shopping, apparently oblivious. If you’re like most people, you’ll jump to the conclusion that the parent is lazy and inattentive – even though you might simply be catching them on a bad day: the parent could be exhausted, or the child uncontrollably hyper. You make this leap even though you lack the information necessary to draw proper conclusions.

You Are Not So Smart Key Idea #8: The say-so of authority figures can influence our actions to an incredible extent.

Most of us like to believe that we’re strong, independent individuals who don’t bow down to authority or cave in to social pressure.

But our desire to conform is actually very strong, particularly in those cases where we act on the orders of an established authority figure.

Take, for example, the very disturbing case of workers employed at a fast-food restaurant chain who were manipulated into shaming and molesting their customers and fellow employees.

Over the course of four years, criminal prankster David Stewart made over 70 phone calls to restaurants, claiming he was a police officer and that one of their employees had committed a crime. His “investigations” involved convincing whoever answered the phone to strip the accused person and to describe and then touch that person’s naked body. Even as Stewart’s requests became increasingly bizarre and sexual, most people were so convinced by his authority that they fully complied.

But, surely you wouldn’t comply so readily?

Actually, it’s quite easy for most of us to become an instrument of authority – as was demonstrated by a famous experiment conducted by Stanley Milgram.

Milgram’s experiment involved having a subject in one room administer an electric shock to a person (an actor) in a separate room whenever that person answered a series of increasingly difficult questions incorrectly. With each incorrect answer, the voltage of the shock also increased.

Because Milgram was testing people’s willingness to obey authority figures, he had someone dressed in a lab coat instruct the subject to send the shock.

At some point, most of the subjects asked to stop the procedure because the screams from the actor supposedly receiving the shocks became increasingly distressing.

The troubling thing is that when the “scientist” urged them to continue regardless, most of the subjects conformed. No less than 65 percent of them administered shocks at a voltage they believed would cause instant death.

It seems that, although we value our individuality and see ourselves as nonconformists, we’re all capable of being highly influenced by authority figures and conforming readily when given orders by them.

Final summary

The key message in this book:

We’re not really as smart as we think we are, but our self-delusions help us ignore that fact. Scientific studies show that we’re constantly deluding ourselves by, for example, finding meaning in random situations, seeking out information we already believe to be true, and making up stories to explain our unconscious preferences. Self-delusions can sometimes keep us sane and help us thrive, but they can also be harmful: they make us judgmental and unhelpful, and they cause us to blindly obey authority figures. To prevent them from causing this kind of harm, it helps to simply be aware of them.

Actionable advice:

Question authority to avoid potentially harmful situations.

Our desire to conform with others is strong and often unconscious. While this desire helps us follow useful social norms – e.g., those that make it easier to work well with other people – it can also lead us into dangerous situations. For instance, our readiness to follow the instructions of authority figures can easily be abused. Therefore, never be afraid to question authority when your actions could end up harming yourself or others. Even in apparently harmless everyday situations, if you’re unsure why you should follow a given procedure, feel free to ask the authority figure to justify it.

Avoid the bystander effect by choosing one person from a crowd to help you.

If you’re ever in a situation where you desperately need someone’s help and find yourself on a crowded street, don’t just yell for help to everyone around you. In such situations, people typically freeze: they look at each other, see that no one else is helping, and assume this means the situation isn’t really serious. Instead, direct your call for help to one particular person, pointing at them. This will likely get that person to help, because everyone else will now expect them to do so, increasing the social pressure on them to act.

Similarly, if you hear someone shouting for help in a crowd, remember people’s natural tendency to ignore them when others are around, and be the one to take action yourself.