How to Decide: Simple Tools for Making Better Choices by Annie Duke

Why is it so important to have a high-quality decision process? Because there are only two things that determine how your life turns out: luck and the quality of your decisions. You have control over only one of those two things.

The only thing you have control over that can influence the way your life turns out is the quality of your decisions.

Your gut—no matter how much experience or past success you’ve had—is not really a decision tool.

A good decision tool seeks to reduce the role of cognitive bias (such as overconfidence, hindsight bias, or confirmation bias), while a pros and cons list tends to amplify the role of bias.

Any decision is, in essence, a prediction about the future.

Because there are so many possible futures, making the best decision depends on your ability to accurately imagine what the world might look like if you were to choose any of the options you’re considering. That means the ideal decision tool would be a crystal ball.

Determining whether a decision is good or bad means examining the quality of the beliefs informing the decision, the available options, and how the future might turn out given any choice you make.

1. Resulting (Outcomes in the Rearview Mirror May Appear Larger Than They Are)

This feeling that the result of the decision tells you something significant about the quality of the decision process is so powerful that even when the description of the decision is identical (you quit your job and take a new position), your view of that decision changes as the quality of the result changes.

In every domain, the outcome tail is wagging the decision dog. There’s a name for this: Resulting. When people result, they look at whether the result was good or bad to figure out if the decision was good or bad. (Psychologists call this “outcome bias,” but I prefer the more intuitive term “resulting.”)

Resulting is a way to simplify complex assessments of decision quality. The problem? Simple isn’t always better.

A necessary part of becoming a better decision-maker is learning from experience. Experience contains the lessons for improving future decisions. Resulting causes you to learn the wrong lessons.

It’s not easy to be willing to give up the credit that comes from feeling like you made good things happen, but it is worth it in the long run. Small changes in how much you notice the luck that you would otherwise overlook will have a big influence on the way your life turns out. Those small changes act like compounding interest that pays big dividends on your future decision-making.

Experience can teach you a lot about how to improve your decision-making, but only if you listen well. Developing the discipline to separate the quality of the result from the quality of the decision can help you to figure out which decisions are worth repeating and which aren’t.

An insidious cost of resulting is that you don’t question your assessment when decision quality and outcome quality align. When that happens, especially when things worked out, your decisions are more likely to remain unexamined while you just accept your intuition, which tells you, “Nothing to see here.”

Resulting is the tendency to look at whether a result was good or bad to figure out whether a decision was good or bad.

Outcomes cast a shadow over the decision process, leading you to overlook or distort information about the process, making your view of decision quality fit with outcome quality.

When you make a decision, you can rarely guarantee a good outcome (or a bad one). Instead, the goal is to try to choose the option that will lead to the most favorable range of outcomes.

2. As the Old Saying Goes, Hindsight Is Not 20/20

When you make a decision, there is stuff you know and stuff you don’t know. One of the things you definitely don’t know is which of all the possible outcomes that could happen will be the one that actually happens.

Hindsight bias is the tendency to believe that an outcome, after it occurs, was predictable or inevitable.

Hindsight bias distorts the way you process outcomes in two ways: "should have known" and "knew it all along."

Once you know how a decision turns out, you can experience memory creep, where the stuff that reveals itself after the fact creeps into your memory of what you knew or was knowable before the decision.

3. The Decision Multiverse

You can’t fully understand what there is to learn from any outcome without understanding the other things that could have happened. This is the essence of counterfactual thinking.

COUNTERFACTUAL A what-if. A possible outcome of a decision that is not the one that actually occurred. An imagined, hypothetical state of the world.

The might-have-beens and what-ifs put your experiences in their proper context, helping you to: understand how much luck might have been involved in the outcome; compare the outcome you got to the outcomes that might have happened; let go of the feeling of inevitability; and improve the quality of the lessons you take from the experiences of your life.

There’s an asymmetry in our willingness to put outcomes in context: We’d rather do it when we fail than when we succeed.

It might feel good in the moment to accept your success without qualification or examination, but you’re going to lose out on so many learning opportunities by doing so.

The paradox of experience: Experience is necessary for learning, but individual experiences often interfere with learning. This is partly because of the biases that lead us to equate outcome quality with decision quality.

There are many possible futures but only one past. Because of this, the past feels inevitable.

Exploring the other possible outcomes is a form of counterfactual thinking. A counterfactual is something that relates to an outcome that has not happened but could have happened, or an imagined state of the world.

Our willingness to examine outcomes is asymmetrical. We are more eager to put bad outcomes in context than good ones. Becoming a better decision-maker requires us to try (difficult though it may be) to put those good outcomes in perspective.

When evaluating whether the outcome provides a lesson about decision quality, create a simplified decision tree, starting with the following:

  ☐ Identify the decision.

  ☐ Identify the actual outcome.

  ☐ Along with the actual outcome, create a tree with other reasonable outcomes that were possible at the time of the decision.

  ☐ Explore the other possible outcomes to understand better what is to be learned from the actual outcome you got.
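If it helps to make the tree concrete, here is a minimal sketch (my own structure, not anything from the book) of that checklist as a small data structure: one decision, the outcome that actually happened, and the other reasonable outcomes kept alongside it as counterfactuals. The example decision and outcomes are hypothetical.

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class Outcome:
    description: str
    actual: bool = False  # True only for the outcome that actually happened


@dataclass
class DecisionTree:
    decision: str
    outcomes: list[Outcome] = field(default_factory=list)

    def counterfactuals(self) -> list[Outcome]:
        """The reasonable outcomes that could have happened but didn't."""
        return [o for o in self.outcomes if not o.actual]


# Hypothetical example: the "quit your job and take a new position" decision from the text.
tree = DecisionTree(
    decision="Quit current job and take the new position",
    outcomes=[
        Outcome("New role is a great fit and your career accelerates", actual=True),
        Outcome("New role is a poor fit and you leave within a year"),
        Outcome("The company restructures and the role is eliminated"),
    ],
)

for what_if in tree.counterfactuals():
    print("Could also have happened:", what_if.description)
```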

4. The Three Ps: Preferences, Payoffs, and Probabilities

The thing about the past is that you can’t change it. What you can do is apply what you learn from the past to all the new decisions you have yet to make by developing a repeatable process for better decision-making.

Unexpectedness is really hard to evaluate in retrospect. But if you do the work in advance, not only will your decisions get better because you will be laser-focused on how the future might unfold, but you will also be able to tell when you didn't anticipate the way things might turn out because you will actually have a record of what you were thinking at the time you made the decision. That's the path to supercharging your decision-making skills.

SIX STEPS TO BETTER DECISION-MAKING

  Step 1—Identify the reasonable set of possible outcomes.

  Step 2—Identify your preference using the payoff for each outcome—to what degree do you like or dislike each outcome, given your values?

  Step 3—Estimate the likelihood of each outcome unfolding.

  Step 4—Assess the relative likelihood of outcomes you like and dislike for the option under consideration.

  Step 5—Repeat Steps 1–4 for other options under consideration.

  Step 6—Compare the options to one another.
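One way to make these steps tangible, assuming it's fair to summarize Steps 2 through 6 as weighting each outcome's payoff by its estimated probability and then comparing options, is a short sketch like the one below. The options, payoffs, and probabilities are entirely hypothetical.

```python
# Hypothetical options, payoffs, and probabilities; payoffs are on a -10 (dislike) to
# +10 (like) scale, and the probabilities for each option sum to 1.
options = {
    "Take the new job": [
        ("Thrive in the new role", +8, 0.50),
        ("Role is a mediocre fit", -2, 0.35),
        ("Role is eliminated", -7, 0.15),
    ],
    "Stay in current job": [
        ("Steady but slow growth", +3, 0.70),
        ("Stagnate and get bored", -4, 0.30),
    ],
}


def expected_payoff(outcomes):
    """Steps 2-4: weigh how much you like each outcome by how likely you think it is."""
    return sum(payoff * probability for _, payoff, probability in outcomes)


# Steps 5-6: repeat for every option and compare them to one another.
for name, outcomes in options.items():
    print(f"{name}: probability-weighted payoff = {expected_payoff(outcomes):+.2f}")
```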

RISK Your exposure to the downside.

Assessing the quality of a decision involves figuring out whether going for the upside is worth risking the downside.

To figure out whether a decision is good or bad, you need to know not just the things that might reasonably happen and what could be gained or lost, but also the likelihood of each possibility unfolding. That means, to become a better decision-maker, you need to be willing to estimate those probabilities.

This way of thinking, that there is only "right" and "wrong" and nothing in between, is one of the biggest obstacles to good decision-making, because good decision-making requires a willingness to guess.

Don’t overlook the territory in between right and wrong. Don’t overlook the value in being a little less wrong or a little closer to right.

We have a way to distinguish informed guesses from uninformed ones. We call the informed guesses educated guesses. It’s not a matter of whether or not any guess is educated. It’s a matter of to what degree.

Here’s a secret: All guesses are educated guesses because there is almost no estimate you could make about which you literally know nothing.

You can think about your state of knowledge on a continuum from no information to perfect information.

There’s a lot of value in making an educated guess. The more willing you are to guess, the more you’ll think about and apply what you know. In addition, you’ll start thinking about what you can find out that will get you closer to the answer. Whether you’re estimating the weight of a bison or the likelihood that Kingdom Comb will succeed, your job as a decision-maker is to figure out two things: (1) What do I already know that will make my guess more educated? (2) What can I find out that will make my guess more educated?

It’s okay to acknowledge that you’re not usually going to hit the bull’s-eye; the important thing is to take aim. Aiming for that bull’s-eye by making an educated guess gets you closer to a precise hit because it motivates you to assess what you know and what you don’t know. It motivates you to learn more.

Every choice you make rests on an estimate of the likelihood of different outcomes unfolding.

What can I find out that will make my guess more educated?

Your beliefs are part of the foundation of every decision you make. Your beliefs inform what options you think are available and the ways that your decision might turn out. Your beliefs inform how likely you think things are to occur or to be true. They inform what you think the payoffs are, and they even inform your goals and your values.

Your chief weapon to improve your decisions is turning some of the “stuff you don’t know” into “stuff you know.”

What you know is more like the size of a speck of dust that could fit on the head of a pin. What you don’t know is more like the size of the universe.

The stuff we do know is riddled with inaccuracies. A lot of our beliefs are not perfectly true.

Your beliefs inform your decisions and you have the ability to improve the quality of those beliefs.

Risk is your exposure to the downside.

Probabilities express how likely something is to occur.

The willingness to guess is essential to improving decisions. If you don’t make yourself guess, you’ll be less likely to ask “What do I know?” and “What don’t I know?”

You can start expressing probabilities by using common terms. That gets you thinking about how often outcomes will occur, presents a view of relative likelihood, and gives you a snapshot of the overall likelihood of the best and worst outcomes.
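As a rough illustration of what those common terms might translate to, here is a small sketch that maps everyday likelihood words onto probability ranges. The ranges are my own placeholder numbers, not the book's; the point is only that writing them down forces you to notice how vague the words are and gives a quick view of relative likelihood.

```python
# Illustrative ranges only -- these are rough placeholder numbers, not the book's.
COMMON_TERMS = {
    "almost certain": (0.90, 0.99),
    "likely": (0.60, 0.85),
    "toss-up": (0.40, 0.60),
    "unlikely": (0.15, 0.40),
    "almost no chance": (0.01, 0.10),
}


def as_probability_range(term: str) -> str:
    low, high = COMMON_TERMS[term.lower()]
    return f"'{term}' roughly means a {low:.0%} to {high:.0%} chance"


print(as_probability_range("likely"))
print(as_probability_range("almost no chance"))
```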

5. Taking Dead Aim at the Future: The Power of Precision

The tendency toward overconfidence vexes decision-making. In general, we don’t question our own beliefs enough. We have too much confidence in what we think we know and we don’t have a realistic view of what we don’t know. Whether it’s about the things we believe to be true, our opinions, or how we think the future might unfold, we could all use a healthy dose of skepticism. Making it a habit to ask yourself, “If I were wrong, why would that be?” helps get you to approach your own beliefs with more skepticism, disciplining your naturally overly optimistic view of what you know and getting you more focused on what you don’t know.

Doing the advance work of thinking about the things that might change your mind increases the chances both that you'll be on the lookout for that corrective information in the future and that you'll be open-minded to it when you come across it.

6. Turning Decisions Outside In

We’re pretty bad at figuring out when our beliefs are inaccurate. We have too much confidence in what we think we know.

Confirmation bias—Our tendency to notice, interpret, and seek out information that confirms or strengthens our existing beliefs.

Disconfirmation bias—Confirmation bias’s sibling. Our tendency to apply a higher, more critical standard to information that contradicts our beliefs than to information that confirms them.

Overconfidence—Overestimating our skills, intellect, or talent, which interferes with our ability to make decisions that depend on those estimates.

Availability bias—The tendency to overestimate the frequency of events that are easy to recall because they are vivid or because we’ve experienced them a lot.

Recency bias—Believing that recent events are more likely to recur than they actually are.

Illusion of control—Overestimating our ability to control events; in other words, underestimating the influence of luck.

More than 90% of professors rate themselves as better-than-average teachers. About 90% of Americans rate their driving ability as better than average. Only 1% of students think their social skills are below average.

Accuracy lives in the intersection of the outside view and the inside view.

Research across a variety of settings has shown that being smart makes you better at motivated reasoning, the tendency to reason about information to confirm your prior beliefs and arrive at the conclusion you desire. And just to be clear, in this case “better” is not a good thing.

Being smart doesn’t protect you from your blind spot. It makes it worse.

When people try to solve logic problems about subjects involving their political beliefs, they tend to reach conclusions consistent with those beliefs even when the correct answer contradicts them. Counterintuitively, prior experience or training in logic makes this error more likely, not less.

Be thankful when people disagree with you in good faith because they are being kind when they do.

The more you can interact with the world in a way that invites people around you to give you the outside view, the more accurate your model of the world will become.

When it comes to success or failure, it can be painful to explore the outside view, especially when the inside view feels so good. But it is worth the discomfort. You can choose to swat away the skill in your bad outcomes and the luck in your good ones to keep the fabric of your identity intact in the moment. Or you can choose to embrace the outside view and strengthen that fabric so that the input into the decisions you make in the future contains less junk. That’s the trade you should take.

7. Breaking Free from Analysis Paralysis (How to Spend Your Decision-Making Time More Wisely)

The time the average person spends deciding what to eat, watch, and wear adds up to 250 to 275 hours per year.

THE TIME-ACCURACY TRADE-OFF Increasing accuracy costs time. Saving time costs accuracy.

The smaller the penalty, the faster you can go. The bigger the penalty, the more time you should take on a decision. The smaller the impact of a poor outcome, the faster you can go. The bigger the impact, the more time you should take.

THE HAPPINESS TEST Ask yourself if the outcome of your decision, good or bad, will likely have a significant effect on your happiness in a year. If the answer is no, the decision passes the test, which means you can speed up. Repeat for a month and a week. The shorter the time period for which your answer is “no, it won’t much affect my happiness,” the more you can trade off accuracy in favor of saving time.
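Here is a minimal sketch of the Happiness Test treated as a simple rule of thumb. The week, month, and year horizons come from the test itself; how the answers map to "speed up" versus "slow down" is my own rough encoding, not a prescription from the book.

```python
def happiness_test(affects_in_week: bool, affects_in_month: bool, affects_in_year: bool) -> str:
    """The shorter the horizon at which the answer is 'no', the faster you can go."""
    if affects_in_year:
        return "Slow down: this outcome could still matter a year from now."
    if affects_in_month:
        return "Passes at a year: speed up somewhat."
    if affects_in_week:
        return "Passes at a month: speed up more."
    return "Passes even at a week: go fast and trade accuracy for saved time."


# Hypothetical example: deciding what to watch tonight.
print(happiness_test(affects_in_week=False, affects_in_month=False, affects_in_year=False))
```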

FREEROLL A situation where there is an asymmetry between the upside and downside because the potential losses are insignificant.

OPPORTUNITY COST When you pick an option, you lose the potential gains associated with the options you don’t pick.

DECISION STACKING Finding ways to make low-impact, easy-to-quit decisions in advance of a high-impact, harder-to-quit decision.

Recognizing when decisions are low impact also maximizes opportunities to poke at the world, which increases your knowledge and helps you learn more about your preferences, improving the quality of all future decisions.

You can identify low-impact decisions with the Happiness Test, asking yourself if how your decision turns out will likely have an effect on your happiness in a week, a month, or a year. If the type of thing you are deciding about passes the Happiness Test, you can go fast.

When you’re facing a decision with a high or prohibitive cost of changing your mind, try decision stacking, making two-way-door decisions ahead of the one-way-door decision.

8. The Power of Negative Thinking

MENTAL CONTRASTING Imagining what you want to accomplish and confronting the obstacles that might stand in the way of accomplishing it.

But the mental contrasting research tells us that the temporary discomfort from imagining failure is worth it, because embracing that discomfort makes it more likely that you’ll actually experience success. Mental pain leads to real-world gains.

PROSPECTIVE HINDSIGHT Imagining yourself at some time in the future, having succeeded or failed at a goal, and looking back at how you arrived at that destination.

STATUS QUO BIAS Our tendency to believe that the way things are today will remain the same in the future.

Premortems and Backcasting: Whether you deserve an autopsy or a parade, you should know why in advance

STEPS FOR A PREMORTEM

  1. Identify the goal you’re trying to achieve or a specific decision you’re considering.

  2. Figure out a reasonable time period for achieving the goal or for the decision to play out.

  3. Imagine it’s the day after that period of time and you didn’t achieve the goal, or the decision worked out poorly. Looking back from that imagined point in the future, list up to five reasons why you failed due to your own decisions and actions or those of your team.

  4. List up to five reasons why you failed due to things outside your control.

  5. If you’re doing this as a team exercise, have each member do steps (3) and (4) independently, prior to a group discussion of reasons.

Broadly speaking, there are two categories of stuff that can interfere with achieving a goal:

  Stuff within your control—your own decisions and actions or, as is often the case in a business setting, the decisions and actions of your team.

  Stuff outside of your control—in addition to luck, the decisions and actions of people you have no influence over.

Teams naturally bend toward groupthink. Members confirm one another’s beliefs. Once there is a sense that consensus is being reached, team members will (usually unintentionally) often refrain from sharing what’s in their head if it diverges from what the group thinks.

(The irreverent investor often acts solo)

Premortems reveal and reward the squeak. If you want to peer into the universe of stuff you don’t know to see the stuff that disagrees with your beliefs, premortems are a way to do that.

BACKCASTING Imagining yourself at some time in the future, having succeeded at achieving a goal, and looking back at how you arrived at that destination.

STEPS FOR A BACKCAST

  1. Identify the goal you’re trying to achieve or a specific decision you’re considering.

  2. Figure out a reasonable time period for achieving the goal or for the decision to play out.

  3. Imagine it’s the day after that period of time and you achieved the goal, or the decision worked out well. Looking back from that imagined point in the future, list up to five reasons why you succeeded due to your own decisions and actions or those of your team.

  4. List up to five reasons why you succeeded due to things outside of your control.

  5. If you’re doing this as a team exercise, have each member do steps (3) and (4) independently, prior to a group discussion of reasons.
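Since the premortem and backcast steps mirror each other, one convenient way to capture both for the same goal is a single record that keeps the "within my control" and "outside my control" reasons separate. The structure and the product-launch example below are my own sketch, not the book's.

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class FutureLookback:
    goal: str
    time_period: str
    # Premortem: up to five reasons for failure in each bucket.
    failure_within_my_control: list[str] = field(default_factory=list)
    failure_outside_my_control: list[str] = field(default_factory=list)
    # Backcast: up to five reasons for success in each bucket.
    success_within_my_control: list[str] = field(default_factory=list)
    success_outside_my_control: list[str] = field(default_factory=list)


# Hypothetical example for a product launch with a six-month horizon.
lookback = FutureLookback(
    goal="Launch the new product",
    time_period="6 months",
    failure_within_my_control=["We underscoped testing", "We hired too slowly"],
    failure_outside_my_control=["A competitor launched first"],
    success_within_my_control=["We cut low-value features early"],
    success_outside_my_control=["A key partner promoted the launch"],
)
print(f"{lookback.goal} ({lookback.time_period}):",
      len(lookback.failure_within_my_control), "failure reasons within my control")
```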

Just as accuracy lies in the intersection between the outside view and the inside view, the more accurate view of the future lies in the intersection between a premortem and a backcast.

PRECOMMITMENT CONTRACT An agreement that commits you in advance to take or refrain from certain actions, or to raise or lower barriers to those actions. Such agreements can be with others (for group decisions or to create accountability to another person) or with yourself.

Even small increases in the quality of your decisions will accumulate over time, making it much more likely you’ll reach your destination.

In the Dr. Evil game (adapted from Dan Egan), you imagine that Dr. Evil has a mind-control device that he’s using to get you to make decisions that guarantee failure. Dr. Evil, being an evil genius, knows he can’t make you fail unless he avoids detection. If you make obviously bad decisions, he’ll get caught. You and the people around you will notice that you’re making bad decisions and his evil plans will be thwarted. Dr. Evil’s diabolical plan has to make you fail and avoid detection. His solution is to have you make losing decisions that are easy to explain away for any given instance of that type of decision, but that guarantee failure if you repeat them over time.

STEPS TO PLAY THE DR. EVIL GAME

  1. Imagine a positive goal.

  2. Imagine that Dr. Evil has control of your brain, causing you to make decisions that will guarantee failure.

  3. Any given instance of that type of decision must have a good enough rationale that it won’t be noticed by you or others examining that decision.

  4. Write down those decisions.

Ask yourself questions such as “How often have I been making exceptions recently?” or “Will I feel these exceptions were worth it in a week or a month?” This added deliberation provides you with a moment to stop and think, as well as a chance to do a bit of time travel to get in touch with your future self.

CATEGORY DECISION When you identify a category of poor decisions that will be hard to spot except in the aggregate, you can decide in advance what options you can and cannot choose that fall within that category.

A common practice of successful professional investors is to make category decisions to avoid investments outside their circle of competence. In facing an opportunity outside of their realm of expertise, particularly one that promises juicy returns, investors run the risk of fooling themselves into thinking that they can make a winning decision. The temptation to wander outside the circle of competence is especially strong if those boundaries are not in place. On the other hand, if they say, “I’m a seed investor” or “I invest only in assets of REITs that are in restructuring or bankruptcy,” they are less likely to consider anything else that comes along. When you make a category decision, you are making a onetime, advance choice about what options you can and cannot choose. This shields you from a series of decisions that are all vulnerable to your worst impulses in the moment.

Dr. Evil doesn’t get you with a guillotine blade chopping off your head. Instead, it’s a death by a thousand cuts. In any particular instance, your decision is easy to justify. He gives you a good reason to make a choice that causes you to lose a little bit on the path to reaching your goal. Then he piles up a lot of those decisions, killing your plans slowly, without allowing you to be aware that you are taking yourself down.

TILT When a bad outcome causes you to be in an emotionally hot state that compromises the quality of your decision-making.

Two common forms of tilt are the what-the-hell effect (abandoning a goal entirely after a setback) and the sunk cost fallacy (sticking with a losing course of action because of what you have already invested).

In the wake of positive investing results, you overrate your ability to choose stocks or believe that you no longer need the safety net of diversification.

HEDGING Paying for something that you hope you’ll never use to mitigate the impact of a downside event.

We are pretty good at setting positive goals for ourselves. Where we fall short is in executing the things we need to do to achieve them. The gap between the things we know we should do and the decisions we later make is known as the behavior gap.

Thinking about how things can go wrong is known as mental contrasting. You imagine what you want to accomplish and confront the barriers in the way of accomplishing it.

Picturing yourself in the future having failed to achieve a goal, and then looking back at what got you to that outcome, is a form of mental time travel.

Looking back from an imagined future at the route that got you there is called prospective hindsight.

A premortem combines prospective hindsight with mental contrasting. To do a premortem, you place yourself in the future and imagine that you have failed to achieve your goal. You then consider the potential reasons things worked out poorly.

Backcasting works backward from a positive future to figure out why you succeeded.

Tilt is a common reaction that occurs in the wake of a bad result. The what-the-hell effect and the sunk cost fallacy are examples of tilt. Planning for your reaction allows you to create precommitments, establish criteria for changing course, and dampen your emotional reaction in the wake of a setback.

The Dr. Evil game helps identify and address additional ways your behavior in the future might undermine your success. In the game, you note the ways that Dr. Evil would control your mind to make you fail through decisions that are justifiable as one-offs but unjustifiable over time.

9. Decision Hygiene (If You Want to Know What Someone Thinks, Stop Infecting Them with What You Think)

One of the best tools for improving your decision-making is to get other people’s perspectives. But you can only do that if you get their actual perspective instead of your perspective parroted back to you.

Where the maps diverge and your opinion and somebody else’s are far apart, three things might be true, and they are all good for improving the quality of your decisions:

  1. The objective truth lies somewhere between the two beliefs. When two people are equally well informed and they hold opposite opinions, the truth most likely lies between the two. When that’s the case, it’s obvious why both people benefit from having discovered the divergence. Both people get the opportunity to moderate their beliefs and get closer to the objective truth.

  2. You could be wrong, and the other person could be right. If you hold an inaccurate belief, the quality of any decision informed by that belief will suffer. A rational person would welcome the chance to change an inaccurate belief, but we know people like learning they’re wrong about as much as those doctors liked Semmelweis telling them they were killing patients by not washing their hands. As painful as it might be to find out something you believe is incorrect, the opportunity to change that belief will improve the quality of every single subsequent decision informed in any way by that belief. That seems like a fair trade: a little bit of pain in exchange for higher-quality decisions for the rest of your life.

  3. You could be right, and the other person could be wrong. When this is the case, you might think that only the person who is wrong benefits by getting the chance to reverse an inaccurate belief, because your belief was right and will remain unchanged. But actually, you benefit from the exchange as well, because the act of explaining your belief and conveying it to someone else will improve how well you understand it. The better you understand why you believe the things you do, the higher in quality those beliefs become.

John Stuart Mill said, “He who knows only his own side of the case, knows little of that.”

The opportunity to moderate, change, or better understand your belief depends on your ability to access the map of someone else’s knowledge and see where their map diverges from your own. Because you’re not a mind reader, the primary way to do that is for them to tell you what they believe. But if you infect them with your beliefs before you allow them to give you their own, you’re not going to get a representative sample of their knowledge.

The only way somebody can know that they’re disagreeing with you is if they know what you think first. Keeping that to yourself when you elicit feedback makes it more likely that what they say is actually what they believe.

To get high-quality feedback, it’s important to put the other person as closely as possible into the same state of knowledge that you were in at the time that you made your decision.

FRAMING EFFECT A cognitive bias in which the way that information is presented influences the way that the listener makes decisions about the information.

HALO EFFECT A cognitive bias in which a positive impression of a person in one area causes you to have a positive view of that person in other, unrelated, areas.

A good group process encourages feedback that includes giving people the space to express a lack of understanding. The group as a whole benefits from that because it affords the experts the opportunity to better understand why they believe what they do, and also affords them the opportunity to transfer their knowledge to the other members of the group. And sometimes it gives them an opportunity to repair inaccuracies in the things that they believe.

Partial information doesn’t get you “partially good” feedback.

The decisions you make are like a portfolio of investments. Your goal is to make sure that the portfolio as a whole advances you toward your goals, even though any individual decision in that portfolio might win or lose.

Your future self is depending on you to make quality decisions and keep improving them. Real self-compassion is about not letting that person—all the future versions of yourself—down.

One of the best ways to improve the quality of your beliefs is to get other people’s perspectives. When their beliefs diverge from yours, it improves your decision-making by exposing you to corrective information and the stuff you don’t know.

Beliefs are contagious. Informing somebody of your belief before they give their feedback significantly increases the likelihood that they will express the same belief back to you.

Exercise decision hygiene to stem the infection of beliefs.

Keep your opinions to yourself when you elicit feedback.

The frame you choose can signal whether you have a positive or negative view about what you’re trying to get feedback on. Stay in neutral as much as possible.

The word “disagree” has very negative connotations. Using “divergence” or “dispersion” of opinion instead of “disagreement” is a more neutral way of talking about places where people’s opinions differ.

Outcomes can also infect the quality of feedback. Quarantine others from the way things turned out while eliciting their feedback.

Groups can better fulfill their decision-making potential by exercising group decision hygiene, soliciting initial opinions and rationales independently before sharing with the group.

For lower-impact, easier-to-reverse decisions, the group can still contain the contagion through a quick-and-dirty version of this process, where group members write down their opinions and someone reads them aloud or writes them on a whiteboard before discussion, or where members read their own opinions aloud in reverse order of seniority.

Access the outside view by asking yourself, “If someone came to me asking my opinion about this kind of decision, what would I need to know to give good advice?”

Build a checklist of relevant details for repeating decisions and make that checklist before you’re in the midst of a decision. Such a list should focus on the applicable goals, values, and resources, along with the details of the situation.
