Thinking in Bets Summary and Review

by Annie Duke

Has Thinking in Bets by Annie Duke been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

There are very few sure things in life, so when we make decisions, we play the odds. Whether it’s what to study, which job to apply for or which house to buy, the outcomes of our decisions rely on many other factors. It’s simply not possible to know every single relevant variable when we make up our minds.

As in poker, the outcomes of life-changing decisions rest largely on luck. But calling it all “luck” is a bit disingenuous – it’s more like a game of probabilities. What’s more, the decisions we make are linked to the way our brains are neurologically wired.

So what can you control? Well, probably more than you think.

With lessons grounded in poker and cognitive psychology, this book summary explains how it all starts with thinking in bets.

In this summary of Thinking in Bets by Annie Duke, you’ll find out

  • why hard-luck stories are a waste of time;
  • why car accidents are always someone else’s fault; and
  • how time travel might be our best tool for making decisions.

Thinking in Bets Key Idea #1: Human minds tend to confuse decisions with their outcomes, which makes it hard to see mistakes clearly.

Super Bowl XLIX ended in controversy. With 26 seconds left in the game, everyone expected Seattle Seahawks coach Pete Carroll to tell his quarterback, Russell Wilson, to hand the ball off. Instead, he told Wilson to pass. The ball was intercepted, the Seahawks lost the Super Bowl, and, by the next day, public opinion about Carroll had turned nasty. The headline in the Seattle Times read: “Seahawks Lost Because of the Worst Call in Super Bowl History”!

But it wasn’t really Carroll’s decision that was being judged. Given the circumstances, it was actually a fairly reasonable call. It was the fact that it didn’t work.

Poker players call this tendency to confuse the quality of a decision with the quality of its outcome “resulting,” and it’s a dangerous one. A bad decision can lead to a good outcome, after all, and a good decision can lead to a bad one. Driving home drunk doesn’t become a good decision just because you happen to make it home without an accident.

In fact, decisions are rarely 100 percent right or wrong. Life isn’t like that. Life is like poker, a game of incomplete information – since you never know what cards the other players are holding – and luck. Our decision-making is like poker players’ bets. We bet on future outcomes based on what we believe is most likely to occur.

So why not look at it this way? If our decisions are bets, we can let go of the idea that we’re 100 percent “right” or “wrong” and start saying, “I’m not sure.” This opens us up to thinking in terms of probability, which is far more useful.

Volunteering at a charity poker tournament, the author once explained to the crowd that player A’s cards would win 76 percent of the time, giving player B a 24 percent chance. When player B won, a spectator yelled out that she’d been wrong.

But, she explained, she’d said that player B’s hand would win 24 percent of the time. She wasn’t wrong. It was just that the actual outcome fell within that 24 percent margin.
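To see how an outcome like that sits comfortably inside the odds, consider a minimal simulation – our own sketch, not the author’s – of a 76/24 matchup like the one she described:

```python
import random

def underdog_win_rate(p_favorite: float = 0.76, trials: int = 100_000) -> float:
    """Simulate repeated showdowns and return how often the 24 percent
    underdog actually wins. (Illustrative sketch; odds from the anecdote.)"""
    wins = sum(random.random() >= p_favorite for _ in range(trials))
    return wins / trials

print(f"Underdog win rate: {underdog_win_rate():.3f}")  # hovers around 0.240
```

Run it a few times and the underdog reliably takes roughly one showdown in four. Losing with the better hand isn’t being “wrong” – it’s just the 24 percent showing up.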

Thinking in Bets Key Idea #2: If we want to seek out truth, we have to work around our hardwired tendency to believe what we hear.

We all want to make good decisions. But saying, “I believe X to be the best option” first requires good-quality beliefs. Good-quality beliefs are ideas about X that are informed and well thought-out. But we can’t expect to form good-quality beliefs with lazy thinking. Instead, we have to be willing to do some work in the form of truth-seeking. That means we have to strive for truth and objectivity, even when something doesn’t align with the beliefs we hold.

Unfortunately, truth-seeking runs contrary to the ways we’re naturally wired. For our evolutionary ancestors, questioning new beliefs could be dangerous, so it was low priority. If you hear a lion rustling in the grass, for example, you’re less likely to stop and analyze the situation objectively, and more likely to just run!

When language developed, we became able to communicate things that our own senses had never experienced, which gave us the ability to form abstract beliefs. But this new ability ran on our old belief-forming machinery: we still formed beliefs first and questioned them only afterward – and infrequently at that.

In 1993, Harvard psychology professor Daniel Gilbert and his colleagues conducted experiments showing that this tendency to believe is still with us. In the experiments, participants read statements color-coded as either true or false. Later, they were asked to remember which statements were true and which were false – but this time they were distracted, increasing their cognitive load and making them more prone to mistakes. In the end, the subjects tended simply to believe that the statements had been true – even those with “false” color-coding.

And while beliefs are formed easily, they’re hard to change. When we believe something, we try to reinforce it with motivated reasoning: we seek out evidence that confirms our belief and ignore or argue against anything contradictory. After all, everyone wants to think well of themselves, and being wrong feels bad. So information that contradicts our beliefs can feel like a threat.

The good news is, we can work around our tendencies with a simple phrase: “Wanna bet?” If we were betting on our beliefs, we’d work a lot harder to confirm their validity. If someone bets you $100 that a statement you made was false, it changes your thinking about the statement right away. It triggers you to look more closely at the belief in question and motivates you to be objectively accurate. This isn’t just about money. Whenever there’s something riding on the accuracy of our beliefs, we’re less likely to make absolute statements and more likely to validate those beliefs.
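There’s simple arithmetic behind why a bet sharpens thinking. Here’s a back-of-the-envelope sketch – our own illustration, not the book’s – of the confidence level a bet implicitly demands:

```python
def break_even_probability(stake: float, payout: float) -> float:
    """Minimum probability of being right at which taking the bet
    has non-negative expected value: p * payout >= (1 - p) * stake."""
    return stake / (stake + payout)

# An even-money $100 bet only makes sense if you're more than
# 50 percent sure your statement is true.
print(break_even_probability(stake=100, payout=100))  # 0.5

# Risking $100 to win $20 demands much stronger confidence.
print(round(break_even_probability(stake=100, payout=20), 3))  # 0.833
```

The moment something is on the line, “I’m right” quietly becomes “How sure am I, exactly?” – which is the whole point of thinking in bets.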

Focusing on accuracy and acknowledging uncertainty is a lot more like truth-seeking, which gets us beyond our resistance to new information and gives us something better on which to bet.

Thinking in Bets Key Idea #3: We can learn a lot from outcomes, but it’s difficult to know which have something to teach us.

The best way to learn is often by reviewing our mistakes. So if we want to improve our future outcomes, we’ll have to do some outcome fielding – examining our outcomes to see what we can learn from them.

Some outcomes we can attribute to luck and forget about – they were out of our control anyway. It’s the outcomes that seem to have resulted primarily from our decisions that we should learn from. After analyzing those decisions, we can refine and update any beliefs that led to our initial bet.

Here’s an example: A poker player who has just lost a hand needs to quickly decide whether it was luck or her own poker-playing skill that was responsible. If it was skill, then she needs to figure out where her decision-making went wrong so she doesn't repeat the mistake.

Most outcomes result from a mix of skill, luck, and unknown information, which is why we often make errors in our fielding – knowing how much of each was involved is tricky. Plus, we’re all subject to self-serving bias: we like to take credit for good outcomes and blame bad ones on something or someone else.

For example, social psychologist and Stanford law professor Robert MacCoun examined accounts of auto accidents. In multiple-vehicle accidents, he found that drivers blamed someone else 91 percent of the time. And 37 percent of the time they still refused responsibility when only a single vehicle was involved.

We can try to circumvent self-serving bias by looking at other people’s outcomes. But in that case, it just operates in reverse: we attribute their successes to luck and their failures to bad decisions.

Chicago Cubs fan Steve Bartman found this out the hard way in 2003, when he accidentally deflected a foul ball that Cubs left fielder Moises Alou was trying to catch. The Cubs lost the game, and Bartman became the subject of angry fans’ harassment and even violence for more than a decade.

But why was Bartman held responsible? He tried to catch the ball, just as lots of other fans did – he simply had the bad luck of actually deflecting it. The world read the other fans’ good outcome, not touching the ball, as the result of a good decision not to intervene, while Bartman’s bad outcome was judged to be entirely his fault.

Thinking in Bets Key Idea #4: To become more objective about outcomes, we need to change our habits.

Phil Ivey is one of the best poker players in the world. He’s admired by his peers and has been incredibly successful in every type of poker. One big reason for this? Phil Ivey has good habits.

Habits work in neurological loops that have three parts: cue, routine and reward. As Pulitzer Prize-winning reporter Charles Duhigg points out in his book The Power of Habit, the key to changing a habit is to work with this structure, leaving the cue and reward alone but changing the routine.

Let’s say you want to minimize your self-serving bias in poker, but your habit is to win a hand (cue), attribute it to your skill (routine) and feed your positive image of yourself (reward). You might try attributing each win to a combination of luck and skill in order to change the habit.

But how do you then get that boost to your self-image? Instead of feeling good about being a winning poker player, you can feel good about being a player who’s good at identifying your mistakes, accurately fielding your outcomes, learning and making decisions.

That’s where Phil Ivey excels. His poker habits are built around truth-seeking and accurate outcome fielding rather than self-serving bias. The author mentions a 2004 poker tournament in which Ivey mopped the floor with his competitors, then spent a celebratory dinner afterward picking apart his play and seeking opinions about what he might have done better.

Unfortunately, most of us don’t have habits as good as Phil Ivey’s, but that doesn’t mean we can’t work with what we’ve got. One way we can improve the way we field outcomes is to think about them in terms of – you guessed it – bets.

Let’s say we got into a car accident on an icy stretch of road. It might be that we were unlucky, that’s all. But would that explanation satisfy you if you had to bet on it? Chances are, you’d start to consider other explanations, just to be sure. Maybe you were driving too fast, or maybe you should have pumped your brakes differently. Once the stakes are raised, we start to look into the causes a little more seriously, which helps us move beyond self-serving bias and become more objective.

As a fringe benefit, this exploration also gives us a little more perspective. We start to see explicitly that outcomes are a mixture of luck and skill. Despite our hardwired tendencies, this forces us to be a little more compassionate when evaluating other people’s outcomes – and our own.

Thinking in Bets Key Idea #5: We can improve our decision-making by being part of a group, but it needs to be the right kind of group.

We’ve all got blind spots, which makes truth-seeking hard. But it’s a little easier when we enlist the help of a group. After all, others can often pick out our errors more easily than we can.

But to be effective, a group dedicated to examining decisions can’t be just any group. It needs a clear focus, a commitment to objectivity and open-mindedness, and a charter that all members understand.

The author was lucky early in her career to be brought into a group like this, made up of experienced poker players who helped each other analyze their play. Early on, poker legend Erik Seidel made the group’s charter clear when, during a break in a poker tournament, the author tried to complain to him about her bad luck in a hand. Seidel shut her down, making it crystal clear that he had no interest. He wasn’t trying to be hurtful, he said, and he was always open to strategy questions. But bad-luck stories were just a pointless rehashing of something out of anyone’s control.

If she wanted to seek the truth with Seidel and his group, she would have to commit to objectivity, not moaning about bad luck.

She did, and over time, this habituated her to working against her own biases, and not just in conversations with the group. Being accountable to committed truth-seekers who challenged each other’s biases made her think differently, even when they weren’t around.

In a decision-examining group committed to objective accuracy, this kind of change is self-reinforcing. Increasing objectivity leads to approval within the group, which then motivates us to strive for ever-greater accuracy by harnessing the deep-seated need for group approval that we all share.

Seeking approval doesn’t mean agreeing on everything, of course. Dissent and diversity are crucial to objective analysis, keeping a group from becoming a mere echo chamber.

Dissent helps us look more closely at our beliefs. That’s why the CIA has “red teams,” groups responsible for finding flaws in analysis and logic and for arguing against the intelligence community’s conventional wisdom. And as NYU professor Jonathan Haidt points out, intellectual and ideological diversity in a group naturally produces high-quality thinking.

Thinking in Bets Key Idea #6: To work together productively, a group needs CUDOS.

Shared commitment and clear guidelines help define a good-quality decision-examining group. But once you’ve got that group, how do you work within it?

You can start by giving each other CUDOS.

CUDOS is the brainchild of the influential sociologist Robert K. Merton – a set of guidelines that he believed should shape the scientific community. They also happen to be an ideal template for groups dedicated to truth-seeking.

The C in CUDOS stands for communism. If a group is going to examine decisions together, it’s important that each member shares all relevant information and strives to be as transparent as possible to get the best analysis. It’s only natural to be tempted to leave out details that make us look bad, but incomplete information only feeds our biases.

U stands for universalism – using the same standards for evaluating all information, no matter where it came from. When she was starting out in poker, the author tended to discount unfamiliar strategies used by players that she’d labeled as “bad.” But she soon suspected that she was missing something and started forcing herself to identify something that every “bad” player did well. This helped her learn valuable new strategies that she might have missed and understand her opponents much more deeply.

D is for disinterestedness, and it’s about avoiding bias. As American physicist Richard Feynman noted, we view a situation differently if we already know the outcome – even a hint of how things end tends to bias our analysis. The author’s poker group taught her to be vigilant about this. When teaching poker seminars for beginners, she would describe specific hands she’d played, omitting the outcome as a matter of habit, and ask students to examine the decision-making. It left students on the edge of their seats – and reminded them that outcomes were beside the point!

“OS” is for organized skepticism, a trait that exemplifies thinking in bets. In a good group, this means collegial, non-confrontational examination of what we really do and don’t know, which keeps everyone focused on improving their reasoning. Centuries ago, the Catholic Church put this into practice by appointing individuals to argue against sainthood during the canonization process – that’s where we get the phrase “devil’s advocate.”

If you know that your group is committed to CUDOS, you’ll be more accountable to these standards in the future. And the future, as we’ll see, can make us a lot smarter about our decisions.

Thinking in Bets Key Idea #7: To make better decisions, we need to spend some time in the future.

Comedian Jerry Seinfeld describes himself as a “Night Guy.” He likes to stay up late and doesn’t worry about getting by on too little sleep – that’s Morning Jerry’s problem, not Night Jerry’s. No wonder Morning Jerry hates Night Jerry so much: Night Jerry always screws him over.

It’s a funny description, but temporal discounting – making decisions that favor our immediate desires at the expense of our future self – is something we all do.

Luckily, there are a few things we can do to take better care of our future selves.

Imagining future outcomes is one. Imagined futures aren’t random. They’re based on memories of the past. That means that when our brains imagine what the future will be like if we stay up too late, they’re also accessing memories of oversleeping and being tired all day long, which might help nudge us into bed.

We can also recruit our future feelings using journalist Suzy Welch’s “10-10-10.” A 10-10-10 brings the future into the present by making us ask ourselves, at a moment of decision, how we’ll feel about it in ten minutes, ten months and ten years. We imagine being accountable for our decision in the future and motivate ourselves to avoid any potential regret we might feel.

And bringing the future to mind can also help us start planning for it.

The best way to do this is to start with the future we’d like to happen and work backward from there. It’s a matter of perspective: the present moment and immediate future are always more vivid to us, so starting our plans from the present tends to make us overemphasize momentary concerns.

We can get around this with backcasting: imagining a future in which everything has worked out and our goals have been achieved, then asking, “How did we get there?” This leads us to imagine the decisions that led to success – and to recognize when our desired outcome requires some unlikely things to happen. If that’s the case, we can either adjust our goals or figure out how to make those things more likely.

Conversely, we can perform premortems on our decisions. In a premortem, we imagine that we’ve failed and ask, “What went wrong?” This helps us identify possibilities that backcasting might have missed. Over more than 20 years of research, NYU psychology professor Gabriele Oettingen has consistently found that people who imagine the obstacles to their goals, rather than just imagining achieving them, are more likely to succeed.

We’ll never be able to control uncertainty, after all. We might as well plan to work with it.

In Review: Thinking in Bets Book Summary

The key message in this book summary:

You might not be a gambler, but that’s no reason not to think in bets. Whether or not there’s money involved, bets make us take a harder look at how much certainty there is in the things we believe, consider alternatives and stay open to changing our minds for the sake of accuracy. So let go of “right” and “wrong” when it’s decision time, accept that things are always somewhat uncertain and make the best bet you can.

Actionable advice:

Try mental contrasting to make positive changes.

If you want to reach a goal, positive visualization will only get you so far. In fact, research shows that mental contrasting – visualizing the obstacles that are keeping you from your goal – will be far more effective. So if you want to lose a few pounds, don’t picture yourself looking good on the beach. Instead, think about all the desserts to which you’ll struggle to say “no” – that’s much more likely to motivate you to do the hard work.