Thinking in Bets by Annie Duke (Book Summary)


We make decisions throughout our lives. Job and relocation decisions are bets. Sales negotiations and contracts are bets. Buying a house is a bet. Ordering the chicken instead of the steak is a bet. Everything is a bet, and most outcomes depend partly on luck. The book explains how our biases systematically impede our decision-making. Often we don’t regret the decision itself at all; we regret the outcome.

Our minds tend to conflate decisions with their outcomes, which makes it difficult to evaluate either one clearly.

Super Bowl XLIX ended in controversy. With 26 seconds left and the ball on the one-yard line, Seattle coach Pete Carroll, instead of calling a running play to win the Super Bowl, called a pass play. It was intercepted, and the Seahawks lost a game they probably should have won. The next day, public opinion about Carroll had turned nasty. The headline in the Seattle Times read: “Seahawks Lost Because of the Worst Call in Super Bowl History”!

But it wasn’t really Carroll’s decision that was being judged; given the circumstances, it was a fairly reasonable call. What was being judged was the fact that it didn’t work.

Here’s a simple thought experiment: what if he had called the pass play, the ball had been caught, and the Seahawks had won the Super Bowl? What would the headlines have looked like then?

Poker players call this tendency to judge the quality of a decision by the quality of its outcome “resulting,” and it’s a dangerous tendency. A bad decision can lead to a good outcome, after all, and a good decision can lead to a bad one. No one who has driven home drunk wakes up the next day and calls it a good decision just because they didn’t get into an accident. There’s too much luck involved in life for that.

In fact, decisions are rarely 100 percent right or wrong. Life doesn’t work that way. Life is like poker: a game of incomplete information – you never know what cards the other players are holding – and luck. Our decisions are like poker players’ bets. We bet on future outcomes based on what we believe is most likely to occur.

So why not look at it this way? If our decisions are bets, we can start to let go of the idea that we’re 100 percent “right” or “wrong,” and start to say, “I’m not sure.” This opens us up to thinking in terms of probability, which is far more appropriate for an uncertain world.

Volunteering at a charity poker tournament, the author once explained to the crowd that player A’s cards would win 76 percent of the time, giving player B a 24 percent chance to win. When player B won, a spectator yelled out that she’d been wrong.

But, she explained, she wasn’t wrong: she’d said that player B’s hand would win 24 percent of the time, and the actual outcome simply fell within that 24 percent. Almost no belief about the future can, by definition, be 100 percent certain.
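Her point can be made concrete with a quick simulation (a hypothetical sketch, not from the book): even against a 76 percent favorite, the underdog still wins roughly one hand in four, so a single underdog win says nothing about whether the 76/24 estimate was wrong.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

def underdog_win_rate(p_favorite: float = 0.76, trials: int = 100_000) -> float:
    """Simulate many independent hands where the favorite wins with
    probability p_favorite; return the fraction won by the underdog."""
    underdog_wins = sum(1 for _ in range(trials) if random.random() >= p_favorite)
    return underdog_wins / trials

rate = underdog_win_rate()
print(f"Underdog won {rate:.1%} of simulated hands")  # close to 24%
```

Over 100,000 simulated hands the underdog’s share settles very near 24 percent, yet any single hand can easily land in that slice.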

If we want to find the truth, we have to contend with our hard-wired tendency to believe what we hear.

We all want to make good decisions and have good outcomes. But saying, “I believe X to be the best option” first requires good-quality beliefs – ideas about X that are informed and well thought-out. We can’t expect to form good-quality beliefs with lazy thinking. Instead, we have to be willing to do some work in the form of truth-seeking. Truth-seeking means combating our innate tendency to distort reality, and it sometimes requires facing painful truths about ourselves. We have to strive for truth and objectivity even when the evidence doesn’t align with the beliefs we already hold.

Unfortunately, truth-seeking runs contrary to the ways we’re naturally wired. For our evolutionary ancestors, questioning new beliefs could be dangerous, so it was a low priority. If you heard a rustle in the grass, it was far safer to think, “Lion! Run!” than to stop and think, “Hm. Is that a lion? Let’s collect some facts.”

When language developed, we could communicate things our own senses had never experienced, which gave us the ability to form abstract beliefs. But this new skill ran on our old belief-forming machinery: we believed first and questioned only afterward, and only infrequently.

Harvard psychology professor Daniel Gilbert and his colleagues conducted experiments in 1993 showing that this tendency to believe is still with us. In the experiments, participants read statements color-coded as either true or false. Later, they were asked to remember which statements were true and which were false – but this time they were distracted, to increase their cognitive load and make them more prone to mistakes. In the end, the subjects tended to simply believe that statements had been true – even those with “false” color-coding.

Beliefs are formed easily, but they’re hard to change. Once we believe something, motivated reasoning takes over: we seek out evidence that confirms our belief and ignore or argue against anything contradictory, trying to eliminate other options. After all, everyone wants to think well of themselves, and being wrong feels bad – so information that contradicts our beliefs can feel like a threat.

The good news is that we can work around these tendencies with a simple phrase: “Wanna bet?” If we were betting on our beliefs, we’d work a lot harder to confirm their validity. If someone bets you $100 that a statement you made is false, it changes your thinking about the statement right away: it triggers you to look more closely at the belief in question and motivates you to be objectively accurate. This isn’t just about money. Whenever something is riding on the accuracy of our beliefs, we’re less likely to make absolute statements and more likely to validate those beliefs.

Focusing on accuracy and acknowledging uncertainty is what truth-seeking looks like; it gets us past our resistance to new information and gives us something better to bet on.

We can learn a lot from outcomes, but it’s hard to tell which ones have something to teach us.

The best way to learn is often by reviewing past outcomes, especially the ones that went badly. If we want better future outcomes, we’ll have to do some outcome fielding: examining outcomes to see what we can learn from them.

Some outcomes we can attribute to luck and forget about – they were out of our control anyway. It’s the outcomes that seem to have resulted primarily from our decisions that we should learn from. After analyzing those decisions, we can refine and update any beliefs that led to our initial bet.

Here’s an example: a poker player who has just lost a hand needs to decide quickly whether luck or her own poker-playing skill was responsible. If it was skill, she needs to figure out where her decision-making went wrong so she doesn’t repeat the mistake. If it was just bad luck, there may be nothing to learn.

Most outcomes result from a mix of skill, luck, and unknown information, which is why we often make errors in our fielding: knowing how much of each was involved is tricky. We’re also subject to self-serving bias: we like to take credit for good outcomes and blame bad outcomes on something or someone else. This self-serving bias is a terrible way to learn.

For example, social psychologist and Stanford law professor Robert MacCoun examined accounts of auto accidents. In multiple-vehicle accidents, he found that drivers blamed someone else 91 percent of the time. And 37 percent of the time they still refused responsibility when only a single vehicle was involved.

We can try to deal with self-serving bias by looking at other people’s outcomes. But in that case, it just operates in reverse: we blame their successes on luck and their failures on bad decisions.

Chicago Cubs fan Steve Bartman found this out the hard way in 2003 when he accidentally deflected a fly ball from Cubs left fielder Moises Alou. The Cubs lost the game and Bartman became the subject of angry fans’ harassment and even violence for more than a decade.

But why was Bartman held responsible? He tried to catch the ball, just as lots of other fans did; he simply had the bad luck of deflecting it. The world saw the other fans’ good outcome – not touching the ball – as the result of their good decision not to intervene, while Bartman’s bad outcome was treated as entirely his fault.

If we want to be more objective about outcomes, we need to change our habits.

Phil Ivey is one of the best poker players in the world. He’s admired by his peers and has been incredibly successful in every type of poker. What is the biggest reason for this? Phil Ivey has good habits.

Habits operate in neurological loops with three parts: cue, routine, and reward. As Pulitzer Prize-winning reporter Charles Duhigg points out in his book The Power of Habit, the key to changing a habit is to work with this structure: leave the cue and reward alone, but change the routine.

Let’s say you want to minimize your self-serving bias in poker, but your habit is to win a hand (cue), attribute it to your skill (routine), and feed your positive self-image (reward). To change the habit, you might try attributing each win to a combination of luck and skill, analyzing which parts of your play actually contributed to the success.

But how do you then get that boost to your self-image? Instead of feeling good about being a winning poker player, you can feel good about being a player who’s good at identifying your mistakes, accurately fielding your outcomes, learning and making decisions.

That’s where Phil Ivey excels. His poker habits are built around truth-seeking and accurate outcome fielding rather than self-serving bias. The author mentions a 2004 poker tournament in which Ivey mopped the floor with his competitors, then spent a celebratory dinner afterward picking apart his play and seeking opinions about what he might have done better.

Unfortunately, most of us don’t have habits as good as Phil Ivey’s, but that doesn’t mean we can’t work with what we’ve got. One way to improve the way we field outcomes is to think about them in terms of – you guessed it – bets.

Let’s say we got into a car accident on an icy stretch of road. It might be that we were simply unlucky. But would that explanation satisfy you if you had to bet on it? Chances are, you’d start to consider other explanations, just to be sure: maybe you were driving too fast, or maybe you should have pumped your brakes differently. Once the stakes are raised, we examine the causes more seriously, which helps us move beyond self-serving bias and become more objective.

As a fringe benefit, this exploration also gives us more perspective. We start to see explicitly that outcomes are a mixture of luck and skill, which, despite our hard-wired tendencies, forces us to be a little more compassionate when evaluating other people’s – and our own – outcomes.

Being part of a group can improve our decision-making, but it has to be the right kind of group.

We’ve all got blind spots, which makes finding the truth hard. But it’s a little easier with help from a group: others can often spot our errors more easily than we can.

But to be effective, a group dedicated to examining decisions can’t be just any group. It needs a clear focus, a commitment to objectivity and open-mindedness, and a charter that all members understand.

The author had a chance early in her career to be brought into a group like this, made up of experienced poker players who helped each other analyze their play. Early on, poker legend Erik Seidel made the group’s charter clear when, during a break in a poker tournament, the author tried to complain to him about her bad luck in a hand. Seidel shut her down, making it crystal clear that he had no interest. He wasn’t trying to be hurtful, he said, and he was always open to strategy questions. But bad-luck stories were just a pointless rehashing of something out of anyone’s control.

If she wanted to seek the truth with Seidel and his group, she would have to commit to objectivity, not complaining about bad luck.

She did, and over time, this habituated her to working against her own biases, and not just in conversations with the group. Being accountable to committed truth-seekers who challenged each other’s biases made her think differently, even when they weren’t around.

In a decision-examining group committed to objective accuracy, this kind of change is self-reinforcing. Increasing objectivity leads to approval within the group, which then motivates us to strive for ever-greater accuracy by harnessing the deep-seated need for group approval that we all share.

Seeking approval doesn’t mean agreeing on everything, of course. Dissent and diversity are crucial to objective analysis, keeping the group from becoming an echo chamber.

Dissent helps us look more closely at our beliefs. That’s why the CIA has “red teams,” groups responsible for finding flaws in analysis and logic and arguing against the intelligence community’s conventional wisdom. And as NYU’s professor Jonathan Haidt points out, intellectual and ideological diversity in a group naturally produces high-quality thinking.

To work together effectively, a group needs CUDOS.

Shared commitment and clear guidelines help define a good-quality decision-examining group. But once you’ve got that group, how do you work within it?

You can start by giving each other CUDOS.

CUDOS is the brainchild of influential sociologist Robert K. Merton: a set of norms that he thought should shape the scientific community. They happen also to be an ideal template for groups dedicated to truth-seeking.

The C in CUDOS stands for communism. If a group is going to question decisions together, each member must share all relevant information and strive to be as transparent as possible to get the best analysis. It’s only natural to be tempted to leave out details that make us look bad, but incomplete information is a tool of our bias.

U stands for universalism – using the same standards for evaluating all information, no matter where it came from. When she was starting out in poker, the author tended to discount unfamiliar strategies used by players that she’d labeled as “bad.” But she soon suspected that she was missing something and started forcing herself to identify something that every “bad” player did well. This helped her learn valuable new strategies that she might have missed and understand her opponents much more deeply.

D is for disinterestedness, and it’s about avoiding bias. As American physicist Richard Feynman noted, we view a situation differently if we already know the outcome; even a hint of how things end tends to bias our analysis. The author’s poker group taught her to be vigilant about this. When teaching poker seminars for beginners, she would describe specific hands she’d played, omitting the outcomes as a matter of habit, and ask students to examine the decision-making. It left students on the edge of their seats – and reminded them that outcomes were beside the point!

“OS” is for organized skepticism, a trait that exemplifies thinking in bets. In a good group, this means collegial, non-confrontational examination of what we really do and don’t know, which keeps everyone focused on improving their reasoning. Centuries ago, the Catholic church put this into practice by hiring individuals to argue against sainthood during the canonization process – that’s where we get the phrase “devil’s advocate.”

If you know that your group is committed to CUDOS, you’ll be more accountable to these standards in the future. And the future, as we’ll see, can make us a lot smarter about our decisions.

To make better decisions, we need to think about the future.

Comedian Jerry Seinfeld describes himself as a “Night Guy.” Night Jerry likes to stay up late and doesn’t worry about getting by on too little sleep – that’s Morning Jerry’s problem, not Night Jerry’s. No wonder Morning Jerry hates Night Jerry so much: Night Jerry always screws him over.

It’s a funny bit, but temporal discounting – making decisions that favor our immediate desires at the expense of our future selves – is something we all do. Taking our future selves into account sometimes means giving up present advantages.

Luckily, there are a few things we can do to take better care of our future selves.

Imagining future outcomes is one. Imagined futures aren’t random. They’re based on memories of the past. That means that when our brains imagine what the future will be like if we stay up too late, they’re also accessing memories of oversleeping and being tired all day long, which might help nudge us into bed.

We can also recruit our future feelings using journalist Suzy Welch’s “10-10-10.” A 10-10-10 brings the future into the present by making us ask ourselves, at a moment of decision, how we’ll feel about it in ten minutes, ten months and ten years. We imagine being accountable for our decision in the future and motivate ourselves to avoid any potential regret we might feel.

And bringing the future to mind can also help us start planning for it.

The best way to do this is to start with the future we’d like to happen and work backward from there. It’s a matter of perspective: the present moment and immediate future are always more attractive to us, so starting our plans from the present tends to make us overemphasize momentary concerns.

We can get around this with backcasting, imagining a future in which everything has worked out, and our goals have been achieved, and then asking, “How did we get there?” This leads to imagining the decisions that have led us to success and also recognizing when our desired outcome requires some unlikely things to happen. If that’s the case, we can either adjust our goals or figure out how to make those things more likely.

Conversely, we can perform premortems on our decisions: working backward from a negative future instead. In a premortem, we imagine that we’ve failed and ask, “What went wrong?” This helps us think of ways things could go wrong and identify possibilities that backcasting might have missed. Over more than 20 years of research, NYU psychology professor Gabriele Oettingen has consistently found that people who imagine the obstacles to their goals, rather than just imagining achieving them, are more likely to succeed.

We’ll never be able to control uncertainty, after all. We might as well plan to work with it.

Book Review

Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them.

You might not be a gambler, but life, like poker, is one long game, and there will be plenty of losses, even after making the best possible bets. Whether or not money is involved, bets make us take a harder look at our decisions. We’ll do better, and be happier, if we start by recognizing that we’ll never be sure of the future – and then navigate the uncertainty by adjusting our beliefs, step by step, toward a more objective representation of the world.


Savaş Ateş

I'm a software engineer. I like reading books and writing summaries. I like to play soccer too :)
