Freakonomics Summary and Review

by Steven D. Levitt and Stephen J. Dubner

Has Freakonomics by Steven D. Levitt and Stephen J. Dubner been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

At this very moment, there are probably countless people who wish to affect your behavior: politicians, police, your doctor, your boss, your parents or your spouse, to name just a few. Although the tactics used may vary from threats and bribes to charm and deceit, all attempts have something in common: they rely on incentives.

An incentive is simply a means of urging people to do more of a good thing or less of a bad thing.

Incentives fall into three general categories: economic, social and moral. Most successful incentives – the ones that attain the desired change in behavior – combine all three types.

One area where incentives are crucial is in the field of crime. People regularly have opportunities to cheat, steal and defraud, so it’s interesting to examine what incentives keep them from doing so.

The risk of going to prison and the related loss of employment, house and freedom are all essentially economic in nature, and certainly form a strong incentive against crime.

There is also a strong moral incentive, as people don’t want to do something that they feel is wrong.

And finally there is a strong social incentive, as people do not want to be seen by others as doing something wrong. Often, depending on the crime, this can be a stronger incentive than economic penalties.

It is this combination of all three types of incentive that encourages most people to refrain from crime.

Incentives can affect your wallet, your pride or your conscience.

Freakonomics Key Idea #1: Introducing incentives can often have unintended consequences on people’s behavior.

We are all familiar with attempts to incentivize behavior. Whether it is parents offering small treats to their children for doing schoolwork or companies paying bonuses to employees who hit their sales targets, everyone has had incentives dangled in front of them.

However, influencing behavior by adding incentives is often a more complicated affair than it might first seem. Incentives often operate in an environment where small changes can have a dramatic impact, and not always in the way those initiating the changes would hope.

In a study of day care centers in Haifa, Israel, economists tried to reduce the number of parents arriving late to pick up their children. To accomplish this, they introduced the economic disincentive of a small $3 fine.

But rather than reduce the number of late pick-ups, the change actually doubled them. How could adding this disincentive have backfired?

One problem may have been that the amount was not big enough, signaling to the parents that late pick-ups were not a significant problem.

The main issue, however, was that this small economic disincentive replaced an existing moral disincentive: the guilt parents felt when arriving late. Parents could now effectively buy off their guilt for a few dollars, so they were less worried about being late.

Furthermore, once the signal had been sent, the effect could not be undone. The removal of the fines had no remedial effect on the number of late pick-ups.

As the example shows, setting incentives can be tricky, especially when other forms of incentive are already present. When introducing incentives, think carefully about whether they might displace existing ones.

Introducing incentives can often have unintended consequences on people’s behavior.

Freakonomics Key Idea #2: Incentives are context-dependent.

Have you ever robbed a bank? Probably not, since there are a variety of disincentives (e.g., prison, loss of social stature, a guilty conscience) that keep you from trying. And yet, some people do rob banks even though they face the same disincentives. Why?

Because different people react differently to the same incentives.

This is fairly self-evident. More surprisingly, though, even the same person may respond differently to the same incentives on different occasions.

Consider the data collected by Paul Feldman, who ran a business providing bagels to office snack rooms. With the bagels, he left an unattended cash-box for customers to pay in, and picked up the cash and leftovers at the end of each day. Each customer had the same incentives to pay – the desire to be and look honest – so the variations in payment rates each day and at the different locations revealed some interesting trends about honesty in changing conditions.

The key contributing factor in how honest his customers were seemed to be personal mood, which was in turn affected by other factors:

The weather played a big role, with higher payment rates on unseasonably warm days and lower rates on unseasonably cold days. Stressful holidays like Christmas and Thanksgiving dramatically reduced payment rates, while more relaxed holidays pushed the rates up.

Similarly, office morale played a part, with people in happy offices being more likely to pay. There was also a universal increase in payment rates following 9/11, which the authors attribute to a general surge in empathy.

The lesson is that the incentives that work for some people on some days may not work for the same people on other days, depending on shifts in global, local or personal circumstances that affect their moods.

Incentives are context-dependent: what works when it’s sunny might not when it’s raining.

Freakonomics Key Idea #3: Experts can use their informational advantage to exploit laypeople for economic benefit.

Everyone needs the advice of an expert from time to time. Whether you are having something repaired, making a big purchase or dealing with a legal issue, you rely on someone with specialist knowledge to help you navigate through unfamiliar territory.

Experts have access to a wealth of information that the layperson does not, meaning an information asymmetry exists. Although the experts are usually paid a fee or commission for their expertise, they can also use their informational advantage to cheat laypeople for additional gain.

Consider real estate: For most people, selling a house is one of the biggest financial transactions they will make in their lifetime. It can be a complicated business, which is why you rely on your real estate agent, who has all the relevant information on property prices and market trends and is presumably also motivated to get the best possible price to raise her commission. You feel assured knowing you have this level of expertise on your side.

While reassuring, this thinking is a little too simplistic. A broader view of the incentive story reveals that although the estate agent’s commission is linked to the final sale price, the additional benefit from a higher price is small relative to that of simply closing the deal. The agent’s incentive to close quickly and move on to the next sale outweighs the part of the incentive that is meant to be aligned with the customer’s goal.

A comparison study reveals that when estate agents sell their own houses, they leave them on the market longer and get a higher price than when they are commissioned by clients. So beware: when an estate agent encourages you to take the first decent offer on your house, it is not to maximize your profit but their own.
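The commission arithmetic behind this misalignment can be sketched with assumed but typical figures: a 6% commission split between two agencies and two agents, so the selling agent personally keeps about 1.5% of the sale price.

```python
# Illustrative sketch (all figures assumed): how little of an extra
# sale-price gain actually reaches the selling agent under a typical split.

def agent_take(sale_price, commission_rate=0.06, agent_share=0.25):
    """Agent's personal cut: the total commission is split between two
    agencies and two agents, hence the assumed 25% share of the 6%."""
    return sale_price * commission_rate * agent_share

quick_sale = 300_000      # offer accepted immediately
patient_sale = 310_000    # offer after weeks of extra showings

extra_for_owner = patient_sale - quick_sale
extra_for_agent = agent_take(patient_sale) - agent_take(quick_sale)

print(f"Owner gains an extra ${extra_for_owner:,.0f}")
print(f"Agent gains only an extra ${extra_for_agent:,.0f}")
```

Holding out for an extra $10,000 earns the owner $10,000 but the agent only about $150, which is rarely worth weeks of additional work to the agent.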

Experts can use their informational advantage to exploit laypeople for economic benefit.

Freakonomics Key Idea #4: Experts can use fear and anxiety to cheat laypeople.

The unknown can be pretty scary. In any transaction in an area you have little knowledge or information about, you will likely be worried and anxious. Experts frequently leverage this fear for financial gain.

This can happen in a number of ways: A car salesman can convince you not to buy a cheaper model by instilling the fear that it is unsafe. A real estate agent can play on your fear of missing out on your dream house to get you to put in a higher bid. A stockbroker can tell you that if you don’t invest in a certain stock now, you’ll miss the boat and have to live with the regret for the rest of your life.

Fear undermines our rational decision-making ability, which is why experts use it to scare us into making decisions we may otherwise not have made.

In face-to-face situations, social fears can exacerbate this problem: the expert can exploit our fears of looking stupid, cheap or dishonorable.

Imagine the stressful situation of arranging the funeral of a loved one. The funeral director, knowing you know little about his business, can use your anxieties about giving your loved one a proper burial to steer you to a more expensive casket than you would have otherwise chosen.

Be wary of situations where an expert seems to be playing on your fears, particularly when you’re told you need to make an immediate decision. In such cases, try to have strategies in place that will buy you valuable time and space to consider your choices in peace, such as saying you need to get a second opinion. You can also try to even out the information asymmetry by researching the topic of the transaction in advance.

Experts can use fear and anxiety to cheat laypeople.

Freakonomics Key Idea #5: The Internet has greatly helped reduce the informational advantage of experts.

In the 1990s, the price of life insurance fell dramatically. There was no similar trend in other forms of insurance, or any significant shifts in the life insurance business or customer base itself. So, why this sudden drop in prices?

The answer lies with the emergence of the Internet, or more specifically of price comparison websites. These websites enabled customers to compare insurance prices offered by dozens of different companies in mere seconds. Price information that had been extremely time-consuming to gather just a few years earlier was available at the click of a mouse. As the policies were fairly similar in nature, the more expensive companies had no choice but to lower their prices, driving down the overall price of the policies.

This example demonstrates how important the Internet has been in eroding information asymmetries all over the world. At its core, it is a highly efficient medium for sharing and redistributing information from those who have it to those who do not.

Consumers can now quickly and conveniently gather information about products and prices before dealing with an expert. This gives them a much better idea of what they should pay and what should be included for that price, removing much of the expert’s informational advantage, and hence their unfair financial gain.

If you are buying a house today, for example, you can go online and find out for yourself what a reasonable offer would be rather than relying on the word of your estate agent.

The Internet has greatly helped reduce the informational advantage of experts.

Freakonomics Key Idea #6: When sellers leave out information, customers often penalize them by assuming the worst.

One of the side effects of a culture of information asymmetry is that even a lack of information – real or perceived – can have a powerful effect.

For example, it is commonly understood that once a new car is bought, it instantly loses as much as a quarter of its value. Someone who paid $20,000 for a car yesterday can expect to get only around $15,000 for it today.

Why this absurd drop in value within 24 hours?

The reason lies in information asymmetry. The buyer cannot know the true reason why the seller is parting with a nearly new car, so the buyer logically assumes that something is wrong with it. Even when this is not the case, the buyer suspects the seller of withholding information and fills the gap with his or her own worst-case assumption. Effectively, the seller is punished by the information asymmetry.
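A toy expected-value calculation (all figures assumed for illustration) shows how the buyer's worst-case assumption drags the price down:

```python
# Toy expected-value sketch (assumed figures): not knowing why a nearly
# new car is for sale, a rational buyer prices in the chance it is a lemon.

sticker_price = 20_000   # what the car cost new yesterday
lemon_value = 10_000     # worth if something is secretly wrong with it
good_value = 19_000      # worth if the car is actually fine
p_lemon = 0.5            # buyer's guess, absent any other information

# The buyer's willingness to pay is the probability-weighted value.
willingness_to_pay = p_lemon * lemon_value + (1 - p_lemon) * good_value
print(willingness_to_pay)  # well below the $20,000 paid a day earlier
```

Even though the car itself has barely changed, the buyer's uncertainty alone knocks thousands off the price the seller can get.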

A study of online dating sites provides another example of this effect. Results show that the single worst thing a user can do to lower the amount of interest they generate is to omit their photo. When others see they have done this, they assume the worst.

The lesson is that in any transaction, it is clearly important to not only focus on the information you provide but also consider the information the other party expects you to provide, and what conclusions they are likely to jump to if you omit it.

When sellers leave out information, customers often penalize them by assuming the worst.

Freakonomics Key Idea #7: People worry disproportionately about risks that are particularly prominent or over which they have little control.

When it comes to assessing risks, we are far less rational than we would like to believe.

One factor that disproportionately influences our assessment is how readily we can imagine the risk in question. Although they are in fact quite rare, we can easily imagine plane crashes, gun crime or terrorist attacks occurring due to their excessive coverage in the media. This leads us to over-assess the risk of these threats.

As another example, ask yourself: Would you feel safer if your child was playing at a friend’s house where a gun is kept or playing at a house with a swimming pool?

The thought of a child being shot with a gun is horrifying and creates outrage. Swimming pools do not, so we would probably feel safer about the swimming pool. But actually, the likelihood of a child being killed by gunshot is much smaller than that of being killed in a swimming pool accident.
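A back-of-the-envelope comparison, using figures of the kind the book cites (roughly one child drowning per 11,000 residential pools, versus roughly one child killed per million-plus guns; treat both numbers as illustrative), makes the gap concrete:

```python
# Order-of-magnitude sketch with assumed figures: per-item risk to a child
# from a backyard swimming pool versus a gun kept in the home.

deaths_per_pool = 1 / 11_000      # ~1 child drowning per 11,000 pools
deaths_per_gun = 1 / 1_000_000    # ~1 child shot per 1,000,000 guns

relative_risk = deaths_per_pool / deaths_per_gun
print(f"A pool is ~{relative_risk:.0f}x more likely to kill a child than a gun")
```

On these figures the pool is roughly ninety times more dangerous, yet the gun provokes far more dread, because it is so much easier to imagine.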

A second factor in our evaluation of risk is how in control we feel. This may explain people’s disproportionate fear of flying compared to driving: we feel in control when we’re actually holding the steering wheel of a car, whereas we feel helpless on a plane. But measured per hour of travel, the risk of death in either form of transport is in fact about the same.

Being aware of our biases in these respects is the first step in resisting them. The second is seeking out solid facts about risks to help counter gut reactions and make more rational evaluations.

People worry disproportionately about risks that are particularly prominent or over which they have little control.

Freakonomics Key Idea #8: We often incorrectly assume that just because two things happen simultaneously, one is causing the other.

Despite having similar populations, Washington, DC, has three times as many police officers as Denver, and eight times as many homicides. Would you assume that the additional officers are causing the higher homicide rate?

When we see that an increase in a certain factor, X, corresponds with an increase in another factor, Y, it is tempting to think that the relationship is causal and that the increase in X caused the increase in Y. This is a human tendency: we assume causality when in fact there may only be correlation.

Consider the example of money and politics. Most people would agree that money has a strong influence on the outcome of elections, and in fact data shows that candidates with the most expensive campaigns usually win. We tend to logically infer that money is the cause of the victory. But is this really the case?

People who contribute to political campaigns are generally pragmatic, so they use one of two tactics: they either try to make a difference in a close race or back a clear favorite. A candidate with no real chance of winning is simply not worth their money.

These trends result in successful candidates attracting more money. But did the money contribute to the success or vice versa?

A study of candidates who ran in successive elections found that the amount of money spent actually has hardly any effect on the results. A winning candidate could cut his or her spending in half and lose only 1 percent of the vote, whereas a losing candidate could double the amount spent and expect only a 1 percent increase. It seems money does not win elections after all.
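The finding can be sketched as a rough rule of thumb: roughly one percentage point of vote share per doubling of spending. (The 1-point-per-doubling figure is an assumption fitted to the numbers above, not the study's actual model.)

```python
# Rough sketch of the finding (assumed rule of thumb): doubling or halving
# campaign spending shifts the vote share by only about one point.
import math

def vote_share_change(spending_ratio, points_per_doubling=1.0):
    """Approximate vote-share change (in percentage points) from scaling
    spending by `spending_ratio`, assuming ~1 point per doubling."""
    return points_per_doubling * math.log2(spending_ratio)

print(vote_share_change(0.5))  # halve spending  -> about -1 point
print(vote_share_change(2.0))  # double spending -> about +1 point
```

Against typical multi-point victory margins, a one-point swing for a twofold change in spending is effectively noise, which is the sense in which money does not buy elections.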

We often incorrectly assume that just because two things happen simultaneously, one is causing the other.

Freakonomics Key Idea #9: When attributing causality, we tend to overlook remote causes in favor of more immediate ones.

In addition to jumping to conclusions about causality between two events, we also tend to look for that causality in the most immediate and obvious places, ignoring more distant or indirect causes.

Consider crime: At the end of 1989 in the United States, crime figures seemed to be going through the roof. Violent crime had increased by 80% over the previous 15 years, and experts were predicting the situation would only get worse. So it came as something of a surprise when crime figures suddenly and dramatically dropped in the early 1990s.

The same experts now rushed to explain the drop. The proposed causes included the improving economy, tougher gun control, innovative policing, increases in police numbers and increased reliance on prisons.

Despite the popularity and plausibility of these explanations, later analysis has shown that most of these factors had only a small effect on crime rates. In fact, the factor that had the biggest effect was not even mentioned at the time: the legalization of abortion.

Two of the biggest predictors of a child’s future criminal behavior are growing up in a single-parent household and living in poverty. These happen to coincide with the most common reasons people choose to have an abortion. So when the landmark ruling of Roe v. Wade legalized abortion across the United States in 1973, women in such circumstances were suddenly able to have abortions. This greatly reduced the cohort of likely criminals who would be turning 16 in 1989 or after, hence contributing to the drop in crime from then on.

The lesson here is to be wary of obvious and immediate causes – even experts can be fooled by them.

When attributing causality, we tend to overlook remote causes in favor of more immediate ones.

In Review: Freakonomics Book Summary

The key message in this book:

From raising children to selling a house, our everyday lives are full of seemingly simple decisions and interactions. By challenging conventional wisdom, examining how we’re affected by incentives and analyzing data from the world around us, Freakonomics gets to the heart of interactions from all walks of life to reveal unexpected and often irrational factors at play. Only by acknowledging these hidden aspects can we begin to understand and develop strategies to counteract them.

The questions this book answered:

In this summary of Freakonomics by Steven D. Levitt and Stephen J. Dubner, what should you take into account when assessing incentives and their impact?

  • Incentives can affect your wallet, your pride or your conscience.
  • Introducing incentives can often have unintended consequences on people’s behavior.
  • Incentives are context-dependent: what works when it’s sunny might not when it’s raining.

How does the distribution of information in a transaction affect the parties involved?

  • Experts can use their informational advantage to exploit laypeople for economic benefit.
  • Experts can use fear and anxiety to cheat laypeople.
  • The Internet has greatly helped reduce the informational advantage of experts.
  • When sellers leave out information, customers often penalize them by assuming the worst.

What human biases affect our assessments of risk and causality?

  • People worry disproportionately about risks that are particularly prominent or over which they have little control.
  • We often incorrectly assume that just because two things happen simultaneously, one is causing the other.
  • When attributing causality, we tend to overlook remote causes in favor of more immediate ones.

Suggested further reading: Think Like a Freak by Steven D. Levitt and Stephen J. Dubner

Think Like a Freak is a blueprint for thinking unconventionally and creatively. It demonstrates the benefits of letting go of conventional wisdom and teaches you to dig deeper to find out how things really work. By learning to think like a “freak,” you’ll gain access to an entirely new way of solving problems and making sense of the world.