Tag: Confirmation bias

The Narratives of History: Applying Lessons from the Past

“History is written by the winners” is the popular view. But your winner may not be my winner. A lot depends on the narrative you are trying to build.

History is rewritten all the time.

Sometimes it is rewritten because new information has come to light, perhaps from an archeological find or previously classified documents. When this happens, it is exciting. We joyfully anticipate that more information will deepen our understanding.

But rewriting frequently happens in the service of building a cultural or national narrative. We highlight bits of the past that support our perceived identities and willfully ignore the details that don’t fit. We like our history uncomplicated. It’s hard for us to understand our groups or our countries, and by extension ourselves, as both good and not-good at the same time.

Culture is collective memory. It’s the interwoven stories that we use to explain who we are as nations, organizations, or just loosely formed groups.

Many of us belong to multiple cultural groups at once. Margaret MacMillan, in The Uses and Abuses of History, explains that “Collective memory is more about the present than the past because it is integral to how the group sees itself.” And “while collective memory is usually grounded in fact, it need not be.”

We have seen how people justify all kinds of mistakes to preserve the personal narratives they are invested in, and groups also engage in this behavior. Countries rewrite their histories, from the textbook up, to support how they see themselves now. Instinctively we may recoil from this idea, believing that it’s better to turn over all the rocks and confront what is lurking underneath. However, as MacMillan writes, “It can be dangerous to question the stories people tell about themselves because so much of our identity is both shaped by and bound up with our history. That is why dealing with the past, in deciding on which version we want, or on what we want to remember and to forget, can become so politically charged.”

For example, when Canada’s new war museum opened, controversy immediately ensued because part of the World War II exhibit called attention “to the continuing debate over both the efficacy and the morality of the strategy of the Royal Air Force’s bomber command, which sought to destroy Germany’s capacity to fight on by massive bombing of German industrial and civilian targets.” RAF veterans were outraged that their actions were considered morally ambiguous. Critics of the exhibit charged that the veterans should have the final say because, after all, “they were there.”

We can see that this rationale makes no sense. Being there does not guarantee objectivity; the pilots who flew the bombing campaigns may well be among the least able to judge those events dispassionately. And the ends don’t always justify the means. It is possible to do bad things in the pursuit of morally justified outcomes.

MacMillan warns that the danger of abusing history is that it “flattens out the complexity of human experience and leaves no room for different interpretations of the past.”

Which leaves us asking, What do we want from history? Do we want to learn from it, with the hopes that in doing so we will avoid mistakes by understanding the experiences of others? Or do we want to practice self-justification on a national level, reinforcing what we already believe about ourselves in order to justify what we did and what we are doing? After all, “you could almost always find a basis for your claims in the past if you looked hard enough.”

As with medicine, there is a certain fallibility to history. Our propensity to fool ourselves with self-justified narratives is hard to overcome. If we selectively use the past only to reinforce our claims in the present, then the situation becomes precarious when there is pressure to change. Instead of looking as objectively as possible at history, welcoming historians who challenge us, we succumb to confirmation bias, allowing only those interpretations that are consistent with the narrative we are invested in.

Consider what MacMillan writes about nationalism, which “is a very late development indeed in terms of human history.”

It all started so quietly in the nineteenth century. Scholars worked on languages, classifying them into different families and trying to determine how far back into history they went. They discovered rules to explain changes in language and were able to establish, at least to their own satisfaction, that texts centuries old were written in early forms of, for example, German or French. Ethnographers like the Grimm brothers collected German folk tales as a way of showing that there was something called the German nation in the Middle Ages. Historians worked assiduously to recover old stories and pieced together the history of what they chose to call their nation as though it had an unbroken existence since antiquity. Archaeologists claimed to have found evidence that showed where such nations had once lived, and where they had moved to during the great waves of migrations.

The cumulative result was to create an unreal yet influential version of how nations formed. While it could not be denied that different peoples, from Goths to Slavs, had moved into and across Europe, mingling as they did so with peoples already there, such a view assumed that at some point, generally in the Middle Ages, the music had stopped. The dancing pieces had fallen into their chairs, one for the French, another for the Germans and yet another for the Poles. And there history had fixed them as “nations.” German historians, for example, could depict an ancient German nation whose ancestors had lived happily in their forests from before the time of the Roman Empire and which at some time, probably in the first century A.D., had become recognisably “German.” So — and this was the dangerous question — what was properly the German nation’s land? Or the land of any other “nation”? Was it where the people now lived, where they had lived at the time of their emergence in history, or both?

Would the scholars have gone on with their speculations if they could have seen what they were preparing the way for? The bloody wars that created Italy and Germany? The passions and hatred that tore apart the old multinational Austria-Hungary? The claims, on historical grounds, by new and old nations after World War I for the same pieces of territory? The hideous regimes of Hitler and Mussolini with their elevation of the nation and the race to the supreme good and their breathtaking demands for the lands of others?

When we selectively reach back into the past to justify claims in the present, we reduce the complexity of history and of humanity. This puts us in an awkward position because the situations we are confronted with are inherently complex. If we cut ourselves off from the full scope of history because it makes us uncomfortable, or doesn’t fit with the cultural narrative in which we live, we reduce our ability to learn from the past and apply those lessons to the situations we are facing today.

MacMillan says, “There are also many lessons and much advice offered by history, and it is easy to pick and choose what you want. The past can be used for almost anything you want to do in the present. We abuse it when we create lies about the past or write histories that show only one perspective. We can draw our lessons carefully or badly. That does not mean we should not look to history for understanding, support and help; it does mean that we should do so with care.”

We need to accept that people can do great things while still having flaws. Our heroes don’t have to be perfect, and we can learn just as much from their imperfections as from their achievements.

We have to allow that there are at least two sides to every story, and we have to be willing to listen to both. There are no conflicts in which one side doesn’t feel morally justified in their actions; that’s why your terrorist can be my freedom fighter. History can be an important part of bridging this divide only if we are willing to lift up all the rocks and shine our lights on what is lurking underneath.

Confirmation Bias: Why You Should Seek Out Disconfirming Evidence

The Basics

Confirmation bias is our tendency to cherry-pick information that confirms our existing beliefs or ideas. This is also known as myside bias or confirmatory bias. Two people with opposing views on a topic can see the same evidence and come away feeling validated by it. Confirmation bias is pronounced in the case of ingrained, ideological, or emotionally charged views.

Failing to interpret information in an unbiased way can lead to serious misjudgments. By understanding this, we can learn to identify it in ourselves and others. We can be cautious of data that seems to immediately support our views.

“What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.”

— Warren Buffett

When we feel as if others “cannot see sense,” a grasp of how confirmation bias works can enable us to understand why. Willard V. Quine and J.S. Ullian described this bias in The Web of Belief as such:

The desire to be right and the desire to have been right are two desires, and the sooner we separate them the better off we are. The desire to be right is the thirst for truth. On all counts, both practical and theoretical, there is nothing but good to be said for it. The desire to have been right, on the other hand, is the pride that goeth before a fall. It stands in the way of our seeing we were wrong, and thus blocks the progress of our knowledge.

Experimentation beginning in the 1960s revealed our tendency to confirm existing beliefs rather than question them or seek out new ones. Later research has shown how single-mindedly we defend ideas once we have formed them.

Like many mental models, confirmation bias was observed long before it had a name. In the History of the Peloponnesian War, Thucydides described the tendency this way:

For it is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.

Our use of this cognitive shortcut is understandable. Evaluating evidence (especially when it is complicated or unclear) requires a great deal of mental energy, so our brains prefer shortcuts that speed up decisions, especially under pressure. As many evolutionary scientists have pointed out, our minds did not evolve to handle the modern world. For most of human history, people encountered very little new information during their lifetimes, and decisions tended to be survival based. Now we are bombarded with new information and have to make numerous complex choices each day. To avoid being overwhelmed, we fall back on shortcuts.

In “The Case for Motivated Reasoning,” Ziva Kunda wrote, “we give special weight to information that allows us to come to the conclusion we want to reach.” Accepting information that confirms our beliefs is easy and requires little mental energy. Contradicting information causes us to shy away, grasping for a reason to discard it.

In The Little Book of Stupidity, Sia Mohajer wrote:

The confirmation bias is so fundamental to your development and your reality that you might not even realize it is happening. We look for evidence that supports our beliefs and opinions about the world but excludes those that run contrary to our own… In an attempt to simplify the world and make it conform to our expectations, we have been blessed with the gift of cognitive biases.

“The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects.”

— Francis Bacon

How Confirmation Bias Clouds Our Judgment

The complexity of confirmation bias arises partly from the fact that it is impossible to overcome it without an awareness of the concept. Even when shown evidence to contradict a biased view, we may still interpret it in a manner that reinforces our current perspective.

In one Stanford study, half of the participants were in favor of capital punishment, and the other half were opposed to it. Both groups read details of the same two fictional studies. Half of the participants were told that one study supported the deterrent effect of capital punishment and the other study opposed it. The other participants read the inverse information. At the conclusion of the study, the majority of participants stuck to their original views, pointing to the data that supported it and discarding that which did not.

Confirmation bias clouds our judgment. It gives us a skewed view of information, even when that information consists only of numerical figures. Understanding this changes not only how we see the world but how we see our own perspective on it. Lewis Carroll stated, “we are what we believe we are,” but it seems that the world is also what we believe it to be.

A poem by Shannon L. Alder illustrates this concept:

Read it with sorrow and you will feel hate.
Read it with anger and you will feel vengeful.
Read it with paranoia and you will feel confusion.
Read it with empathy and you will feel compassion.
Read it with love and you will feel flattery.
Read it with hope and you will feel positive.
Read it with humor and you will feel joy.
Read it without bias and you will feel peace.
Do not read it at all and you will not feel a thing.

Confirmation bias is somewhat linked to our memories (similar to availability bias). We have a penchant for recalling evidence that backs up our beliefs. However neutral the original information was, we fall prey to selective recall. As Leo Tolstoy wrote:

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.

“Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the destruction of their original evidential bases.”

— Lee Ross and Craig Anderson

Why We Ignore Contradicting Evidence

Why is it that we struggle to even acknowledge information that contradicts our views? When first learning about the existence of confirmation bias, many people deny that they are affected. After all, most of us see ourselves as intelligent, rational people. So, how can our beliefs persevere even in the face of clear empirical evidence? Even when something is proven untrue, many entirely sane people continue to find ways to mitigate the subsequent cognitive dissonance.

Much of this is the result of our need for cognitive consistency. We are bombarded by information. It comes from other people, the media, our experience, and various other sources. Our minds must find means of encoding, storing, and retrieving the data we are exposed to. One way we do this is by developing cognitive shortcuts and models. These can be either useful or unhelpful.

Confirmation bias is one of the less helpful heuristics that results. How we interpret new information is shaped by what we already believe, and information that fits our beliefs is easier to recall. As a consequence, we tend to see more and more evidence that enforces our worldview. Confirmatory data is taken seriously, while disconfirming data is treated with skepticism. Our general assimilation of information is subject to deep bias. Constantly re-evaluating our worldview would be exhausting, so we prefer to strengthen it. It can also be difficult to consider multiple ideas at once, making it simpler to focus on just one.

We ignore contradictory evidence because it is so unpalatable for our brains. According to research by Jennifer Lerner and Philip Tetlock, we are motivated to think in a critical manner only when held accountable by others. If we are expected to justify our beliefs, feelings, and behaviors to others, we are less likely to be biased towards confirmatory evidence. This is less out of a desire to be accurate, and more the result of wanting to avoid negative consequences or derision for being illogical. Ignoring evidence can be beneficial, such as when we side with the beliefs of others to avoid social alienation.

Examples of Confirmation Bias in Action

Creationists vs. Evolutionary Biologists

A prime example of confirmation bias can be seen in the clashes between creationists and evolutionary biologists. The latter use scientific evidence and experimentation to reveal the process of biological evolution over millions of years. The former see the Bible as being true in the literal sense and think the world is only a few thousand years old. Creationists are skilled at mitigating the cognitive dissonance caused by factual evidence that disproves their ideas. Many consider the non-empirical “evidence” for their beliefs (such as spiritual experiences and the existence of scripture) to be of greater value than the empirical evidence for evolution.

Evolutionary biologists point to fossil records as evidence that evolution has occurred over millions of years. Meanwhile, some creationists view the same fossils as having been planted by a god to test our beliefs, while others claim they are proof of the global flood described in the Bible. Evidence that contradicts these ideas is ignored or reinterpreted to confirm what they already think.

Doomsayers

Take a walk through London on a busy day and you are pretty much guaranteed to see a doomsayer on a street corner, ranting about the upcoming apocalypse. Return a while later and you will find them still there, announcing that the end has been postponed.

In When Prophecy Fails, Leon Festinger explained the phenomenon this way:

Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting people to his view.

Music

Confirmation bias in music is interesting because it is actually part of why we enjoy it so much. According to Daniel Levitin, author of This Is Your Brain on Music:

As music unfolds, the brain constantly updates its estimates of when new beats will occur, and takes satisfaction in matching a mental beat with a real-in-the-world one.

Witness the way a group of teenagers will act when someone puts on “Wonderwall” by Oasis or “Creep” by Radiohead. Or how their parents react to “Starman” by Bowie or “Alone” by Heart. Or even their grandparents to “The Way You Look Tonight” by Sinatra or “Non, Je ne Regrette Rien” by Edith Piaf. The ability to predict each successive beat or syllable is intrinsically pleasurable. This is a case of confirmation bias serving us well. We learn to understand musical patterns and conventions, enjoying seeing them play out.

Homeopathy

The multibillion-dollar homeopathy industry is an example of mass confirmation bias.

Homeopathy itself dates back to Samuel Hahnemann in the late eighteenth century, but its claims received an apparent boost from Jacques Benveniste, a French researcher studying histamines. Benveniste became convinced that as a solution of histamines was diluted, its effectiveness increased, due to what he termed “water memory.” His tests were performed without blinding, leaving the results open to experimenter and placebo effects. Benveniste was so certain of his hypothesis that he found data to confirm it and ignored that which did not. Other researchers repeated his experiments with appropriate blinding and could not reproduce his results. Many of the people who worked with him withdrew from science as a result.

Yet homeopathy’s supporters have only grown in number, clinging to any evidence that seems to support it while ignoring the evidence that does not.

Scientific Experiments

“One of the biggest problems with the world today is that we have large groups of people who will accept whatever they hear on the grapevine, just because it suits their worldview—not because it is actually true or because they have evidence to support it. The striking thing is that it would not take much effort to establish validity in most of these cases… but people prefer reassurance to research.”

— Neil deGrasse Tyson

In good scientific experiments, researchers should seek to falsify their hypotheses, not to confirm them. Unfortunately, this is not always the case (as shown by homeopathy). There are many cases of scientists interpreting data in a biased manner, or repeating experiments until they achieve the desired result. Confirmation bias also comes into play when scientists peer-review studies. They tend to give positive reviews of studies that confirm their views and of studies accepted by the scientific community.

This is problematic. Inadequate research programs can continue past the point where evidence points to a false hypothesis. Confirmation bias wastes a huge amount of time and funding. We must not take science at face value and must be aware of the role of biased reporting.

“The eye sees only what the mind is prepared to comprehend.”

— Robertson Davies

Conclusion

This article can provide an opportunity for you to assess how confirmation bias affects you. Consider looking back over the previous paragraphs and asking:

  • Which parts did I automatically agree with?
  • Which parts did I ignore or skim over without realizing?
  • How did I react to the points which I agreed or disagreed with?
  • Did this post confirm any ideas I already had? Why?
  • What if I thought the opposite of those ideas?

Being cognizant of confirmation bias is not easy, but with practice, it is possible to recognize the role it plays in the way we interpret information. You need to search out disconfirming evidence.

As Rebecca Goldstein wrote in Incompleteness: The Proof and Paradox of Kurt Gödel:

All truths — even those that had seemed so certain as to be immune to the very possibility of revision — are essentially manufactured. Indeed, the very notion of the objectively true is a socially constructed myth. Our knowing minds are not embedded in truth. Rather, the entire notion of truth is embedded in our minds, which are themselves the unwitting lackeys of organizational forms of influence.

To learn more about confirmation bias, read The Little Book of Stupidity or The Black Swan. Be sure to check out our entire latticework of mental models.

Nassim Taleb: How to Not be a Sucker From the Past

"History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative." — Nassim Taleb
“History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative.” — Nassim Taleb

The fact that new information about the past keeps coming to light means that our map of history is always incomplete. There is a necessary fallibility to it, if you will.

In The Black Swan, Nassim Taleb writes:

History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative. One should learn under severe caution. History is certainly not a place to theorize or derive general knowledge, nor is it meant to help in the future, without some caution. We can get negative confirmation from history, which is invaluable, but we get plenty of illusions of knowledge along with it.

While I don't entirely hold Taleb's view, I think it's worth reflecting on. As a friend put it to me recently, “when people are looking into the rear view mirror of the past, they can take facts and like a string of pearls draw lines of causal relationships that facilitate their argument while ignoring disconfirming facts that detract from their central argument or point of view.”

Taleb advises us to adopt the empirical skeptic approach of Menodotus which was to “know history without theorizing from it,” and to not draw any large theoretical or scientific claims.

We can learn from history but our desire for causality can easily lead us down a dangerous rabbit hole when new facts come to light disavowing what we held to be true. In trying to reduce the cognitive dissonance, our confirmation bias leads us to reinterpret past events in a way that fits our current beliefs.

History is not stagnant — we only know what we know currently and what we do know is subject to change. The accepted beliefs about how events played out may change in light of new information and then the new accepted beliefs may change over time as well.

Falsification: How to Destroy Incorrect Ideas

“The human mind is a lot like the human egg,
and the human egg has a shut-off device.
When one sperm gets in, it shuts down so the next one can’t get in.”

— Charlie Munger

***

Sir Karl Popper wrote that the nature of scientific thought is that we could never be sure of anything. The only way to test the validity of any theory was to prove it wrong, a process he labeled falsification. And it turns out we're quite bad at falsification.

When it comes to testing a theory, we don't instinctively try to find evidence that we're wrong. It's much easier and more mentally satisfying to find information that confirms our intuition. This is known as confirmation bias.

In his book How Children Succeed: Grit, Curiosity, and the Hidden Power of Character, Paul Tough tells the story of English psychologist Peter Cathcart Wason, who came up with an “ingenious experiment to demonstrate our natural tendency to confirm rather than disprove our own ideas.”

Subjects were told that they would be given a series of three numbers that followed a certain rule known only to the experimenter. Their assignment was to figure out what the rule was, which they could do by offering the experimenter other strings of three numbers and asking him whether or not these new strings met the rule.

The string of numbers the subjects were given was quite simple:

2-4-6

Try it: What’s your first instinct about the rule governing these numbers? And what’s another string you might test with the experimenter in order to find out if your guess is right? If you’re like most people, your first instinct is that the rule is “ascending even numbers” or “numbers increasing by two.” And so you guess something like:

8-10-12

And the experimenter says, “Yes! That string of numbers also meets the rule.” And your confidence rises. To confirm your brilliance, you test one more possibility, just as due diligence, something like:

20-22-24

“Yes!” says the experimenter. Another surge of dopamine. And you proudly make your guess: “The rule is: even numbers, ascending in twos.” “No!” says the experimenter. It turns out that the rule is “any ascending numbers.” So 8-10-12 does fit the rule, it’s true, but so does 1-2-3. Or 4-23-512. The only way to win the game is to guess strings of numbers that would prove your beloved hypothesis wrong—and that is something each of us is constitutionally driven to avoid.

In the study, only about one in five people was able to guess the correct rule.

And the reason we’re all so bad at games like this is the tendency toward confirmation bias: It feels much better to find evidence that confirms what you believe to be true than to find evidence that falsifies what you believe to be true. Why go out in search of disappointment?
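
To make the asymmetry concrete, here is a minimal sketch of the 2-4-6 game in Python. This is my own illustration rather than anything from Tough’s book; the function names and sample triples are invented for the example.

    # A toy version of Wason's 2-4-6 task. Both rules come from the
    # description above; the triples are illustrative test cases.

    def hidden_rule(triple):
        # The experimenter's secret rule: any three strictly ascending numbers.
        a, b, c = triple
        return a < b < c

    def my_hypothesis(triple):
        # The typical first guess: even numbers, ascending in twos.
        a, b, c = triple
        return a % 2 == 0 and b == a + 2 and c == b + 2

    confirming = [(8, 10, 12), (20, 22, 24)]           # chosen because they fit the guess
    falsifying = [(1, 2, 3), (4, 23, 512), (6, 4, 2)]  # chosen to break the guess

    for triple in confirming + falsifying:
        print(triple, "fits my guess:", my_hypothesis(triple),
              "| experimenter says:", hidden_rule(triple))

Only the triples chosen to break the guess carry any information: 1-2-3 and 4-23-512 violate “even numbers, ascending in twos” yet still satisfy the hidden rule, which is exactly what exposes the guess as too narrow.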

There is also a video explaining Wason's work.

Michael Mauboussin: Three Steps to Effective Decisions


Making an important decision is never easy, but making the right decision is even more challenging.

Effective decision-making isn't just about accumulating information and going with what seems to make the most sense. Sometimes, internal biases can impact the way we seek out and process information, polluting the conclusions we reach in the process. It's critical to be conscious of those tendencies and to accumulate the sort of fact-based and unbiased inputs that will result in the highest likelihood that a decision actually leads to the desired outcome.

In this video, Michael Mauboussin, Credit Suisse's Head of Financial Strategies, lays out three steps that can help focus a decision-maker's thinking.

How do we take new information that comes in and integrate it with our point of view?

Typically we don't really take into consideration new information. The first major barrier to that is something called the confirmation bias. Once you've decided on something and you think this is the right way to think about it, you either blow off new information or if it's ambiguous you interpret it in a way that's favorable to you. Now the next problem, and we all have this, is called pseudo and subtly-diagnostic information. Pseudodiagnostic means information that isn't very relevant but you think it is. Subtly-diagnostic is information that is relevant and you don't pay attention to it.

So the key in all of this, is we have this torrent of information coming in, how do I sort that in a way that should lead me to increase or decrease my probabilities of a particular event happening.
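
Bayes’ rule is one standard way to formalize that sorting. The sketch below is not from Mauboussin’s video; the numbers and the bayes_update helper are invented for illustration, but they show why genuinely diagnostic information should shift a probability while pseudodiagnostic information should barely move it.

    # A toy illustration of updating a probability with new evidence via
    # Bayes' rule. All numbers are made up purely for illustration.

    def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
        # Return P(hypothesis | evidence) from a prior and two likelihoods.
        numerator = p_evidence_if_true * prior
        return numerator / (numerator + p_evidence_if_false * (1 - prior))

    prior = 0.30  # starting belief that the event will happen

    # Diagnostic evidence: four times as likely if the hypothesis is true.
    print(round(bayes_update(prior, 0.80, 0.20), 2))  # ~0.63: belief should rise

    # Pseudodiagnostic evidence: feels relevant but is about equally likely
    # either way, so it should barely move the probability at all.
    print(round(bayes_update(prior, 0.50, 0.48), 2))  # ~0.31: belief barely moves

Confirmation bias, in these terms, means treating the second kind of evidence as if it were the first, or refusing to run the update at all when the evidence points the wrong way.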

The Four Villains of Decision Making

You're probably not as effective at making decisions as you could be.

This article explores Chip and Dan Heath's new book, Decisive. It's going to help us make better decisions both as individuals and in groups.

But before we get to that, you should think about a tough decision you're grappling with right now. Having a decision working in your mind as you're reading this post will help make the advice in here tangible.

Ok, let's dig in.

“A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”

— Daniel Kahneman

We're quick to jump to conclusions because we give too much weight to the information in front of us and we fail to search for new information, which might disprove our thoughts.

Nobel Prize winning Psychologist Daniel Kahneman called this tendency “what you see is all there is.” But that's not the only reason we don't make good decisions — there are many others.

We're overconfident. We look for information that fits our thoughts and ignore information that doesn't. We are overly influenced by authority. We choose the short-term over the long-term. Once we've made a decision we find it hard to change our mind. In short, our brains are flawed. I could go on.

Knowing about these and other biases isn't enough; it doesn't help us fix the problem. We need a framework for making decisions.

In Decisive, the Heaths introduce a four-step process designed to counteract many biases.

In keeping with Kahneman's visual metaphor, the Heaths refer to the tendency to see only what's in front of us as a “spotlight” effect.

And that, in essence, is the core difficulty of decision making.

What's in the spotlight will rarely be everything we need to make good decisions, but we won't always remember to shift the light.

Most of us rarely use a process for thinking about things. If we do use one it's likely to be the pros-and-cons list. While better than nothing, this approach is still deeply flawed because it doesn't really account for biases.

The Four Villains of Decision Making

  1. Narrow Framing: “… the tendency to define our choices too narrowly, to see them in binary terms. We ask, ‘Should I break up with my partner or not?’ instead of ‘What are the ways I could make this relationship better?’”
  2. Confirmation Bias: “When people have the opportunity to collect information from the world, they are more likely to select information that supports their preexisting attitudes, beliefs, and actions.” We pretend we want the truth, yet all we really want is reassurance.
  3. Short-term Emotion: “When we've got a difficult decision to make, our feelings churn. We replay the same arguments in our head. We agonize about our circumstances. We change our minds from day to day. If our decision was represented on a spreadsheet, none of the numbers would be changing—there's no new information being added—but it doesn't feel that way in our heads.”
  4. Overconfidence: “People think they know more than they do about how the future will unfold.”

The Heaths came up with a process to help us overcome these villains and make better choices. “We can't deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you'll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

They call this WRAP. “At its core, the WRAP model urges you to switch from ‘auto spotlight’ to manual spotlight.”


All in all, this was a great book. We tend to focus our efforts on analysis: if a decision turns out badly, we assume the analysis must have been the problem. Not only does this ignore the fact that good decisions can have bad outcomes, it also keeps the spotlight on the analysis at the cost of the process by which the decision was made.

Read this next: What Matters More in Decisions: Analysis or Process?