Tag: Uncertainty Avoidance

Brené Brown: The Power of Vulnerability

In this TED talk, Brené Brown, who studies vulnerability, explores how embracing it can help us live a more meaningful life.

Brown went back to the research and spent years trying to understand what choices whole-hearted people, who live from a deep sense of worthiness, were making. “What are we doing with vulnerability? Why do we struggle with it so much? Am I alone in struggling with vulnerability?” Here is what she learned:

We numb vulnerability — when we're waiting for the call. It was funny, I sent something out on Twitter and on Facebook that says, “How would you define vulnerability? What makes you feel vulnerable?” And within an hour and a half, I had 150 responses. Because I wanted to know what's out there. Having to ask my husband for help because I'm sick, and we're newly married; initiating sex with my husband; initiating sex with my wife; being turned down; asking someone out; waiting for the doctor to call back; getting laid off; laying off people — this is the world we live in. We live in a vulnerable world. And one of the ways we deal with it is we numb vulnerability.

[…]

One of the things that I think we need to think about is why and how we numb. And it doesn't just have to be addiction. The other thing we do is we make everything that's uncertain certain. Religion has gone from a belief in faith and mystery to certainty. I'm right, you're wrong. Shut up. That's it. Just certain. The more afraid we are, the more vulnerable we are, the more afraid we are. This is what politics looks like today. There's no discourse anymore. There's no conversation. There's just blame. You know how blame is described in the research? A way to discharge pain and discomfort. We perfect. If there's anyone who wants their life to look like this, it would be me, but it doesn't work. Because what we do is we take fat from our butts and put it in our cheeks. (Laughter) Which just, I hope in 100 years, people will look back and go, “Wow.”

And we perfect, most dangerously, our children. Let me tell you what we think about children. They're hardwired for struggle when they get here. And when you hold those perfect little babies in your hand, our job is not to say, “Look at her, she's perfect. My job is just to keep her perfect — make sure she makes the tennis team by fifth grade and Yale by seventh grade.” That's not our job. Our job is to look and say, “You know what? You're imperfect, and you're wired for struggle, but you are worthy of love and belonging.” That's our job. Show me a generation of kids raised like that, and we'll end the problems I think that we see today. We pretend that what we do doesn't have an effect on people. We do that in our personal lives. We do that corporate — whether it's a bailout, an oil spill, a recall — we pretend like what we're doing doesn't have a huge impact on other people. I would say to companies, this is not our first rodeo, people. We just need you to be authentic and real and say, “We're sorry. We'll fix it.”

But there's another way, and I'll leave you with this. This is what I have found: to let ourselves be seen, deeply seen, vulnerably seen; to love with our whole hearts, even though there's no guarantee — and that's really hard, and I can tell you as a parent, that's excruciatingly difficult — to practice gratitude and joy in those moments of terror, when we're wondering, “Can I love you this much? Can I believe in this this passionately? Can I be this fierce about this?” just to be able to stop and, instead of catastrophizing what might happen, to say, “I'm just so grateful, because to feel this vulnerable means I'm alive.”

Still curious? Brown is the author of Daring Greatly: How the Courage to Be Vulnerable Transforms the Way We Live, Love, Parent, and Lead. I think this talk also ties in nicely to True Refuge: Finding Peace and Freedom in Your Own Awakened Heart.

Blindness to the Benefits of Ambiguity

“Decision makers,” write Stefan Trautmann and Richard Zeckhauser in their paper Blindness to the Benefits of Ambiguity, “often prove to be blind to the learning opportunities offered by ambiguous probabilities. Such decision makers violate rational decision making and forgo significant expected payoffs.”

Trautmann and Zeckhauser argue that we often don't recognize the benefits in commonly occurring ambiguous situations. In part this is because we often treat repeated decisions involving ambiguity as one-shot decisions. In doing so, we ignore the opportunity for learning when we encounter ambiguity in decisions that offer repeat choices.

To put this in context, the authors offer the following example:

A patient is prescribed a drug for high cholesterol. It is successful, lowering her total cholesterol from 230 to 190, and her only side effect is a mild case of sweaty palms. The physician is likely to keep the patient on this drug as long as her cholesterol stays low. Yet, there are many medications for treating cholesterol. Another might lower her cholesterol even more effectively or impose no side effects. Trying an alternative would seem to make sense, since the patient is likely to be on a cholesterol medication for the rest of her life.

In situations of ambiguity with repeated choices we often gravitate towards the first decision that offers a positive payoff. Once we've found a positive payoff we're likely to stick with that decision when given the opportunity to make the same choice again rather than experiment in an attempt to optimize payoffs. We ignore the opportunity for learning and favor the status quo. Another way to think of this is uncertainty avoidance (or ambiguity aversion).

Few individuals recognize that ambiguity offers the opportunity for learning. If a choice situation is to be repeated, ambiguity brings benefits, since one can change one’s choice if one learns the ambiguous choice is superior.

“We observe,” they offer, “that people's lack of a clear understanding of learning under ambiguity leads them to adopt non-Bayesian rules.”
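
To make the learning intuition concrete, here is a minimal simulation (my own sketch, not from Trautmann and Zeckhauser's paper; all payoff numbers are invented). It compares a status-quo chooser, who commits to a known 50/50 option and never experiments, with a learner who samples the ambiguous option a few times and switches only if the evidence favors it:

```python
# A minimal sketch (illustrative, not from the paper) of learning under
# ambiguity in a repeated-choice setting. Option A pays off with a known
# 50% probability; option B's payoff rate is unknown (ambiguous).
import random

TRIALS = 100   # repeated choices per run
RUNS = 10_000  # independent runs to average over

def run_once():
    p_known = 0.5                  # unambiguous option: known payoff rate
    p_ambiguous = random.random()  # ambiguous option: unknown payoff rate

    # Status-quo chooser: sticks with the known option for every trial.
    status_quo = sum(random.random() < p_known for _ in range(TRIALS))

    # Learner: spends 10 trials sampling the ambiguous option, then commits
    # to whichever option the evidence favors for the remaining trials.
    sample = 10
    wins = sum(random.random() < p_ambiguous for _ in range(sample))
    estimate = (wins + 1) / (sample + 2)  # Laplace-smoothed estimate
    p_chosen = p_ambiguous if estimate > p_known else p_known
    learner = wins + sum(random.random() < p_chosen for _ in range(TRIALS - sample))

    return status_quo, learner

results = [run_once() for _ in range(RUNS)]
print("status-quo payoff:", sum(r[0] for r in results) / RUNS)  # ~50
print("learner payoff:   ", sum(r[1] for r in results) / RUNS)  # ~59
```

The learner gives up a little during the sampling phase but gains far more over the remaining trials whenever the ambiguous option turns out to be superior, which is exactly the forgone expected payoff the authors describe.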

Another example of how this manifests itself in the real world:

In the summer of 2010, the consensus estimate is that there are five applicants for every job opening, yet major employers who expect to hire significant numbers of workers once the economy turns up are sitting by the sidelines and having current workers do overtime. The favorability of the hiring situation is unprecedented in recent years. Thus, it would seem to make sense to hire a few workers, see how they perform relative to the norm. If the finding is much better, suggesting that the ability to select in a very tough labor market and among five applicants is a big advantage, then hire many more. This situation, where the payoff to the first-round decision is highly ambiguous, but perhaps well worthwhile once learning is taken into account, is a real world exemplar of the laboratory situations investigated in this paper.

According to Tolstoi, happy families are all alike, while every unhappy family is unhappy in its own way. A similar observation seems to hold true for situations involving ambiguity: There is only one way to capitalize correctly on learning opportunities under ambiguity, but there are many ways to violate reasonable learning strategies.

From an evolutionary perspective, why would learning avoidance persist if the benefits from learning are large?

Psychological findings suggest that negative experiences are crucial to learning, while good experiences have virtually no pedagogic power. In the current setting, ambiguous options would need to be sampled repeatedly in order to obtain sufficient information on whether to switch from the status quo. Both bad and good outcomes would be experienced along the way, but only good ones could trigger switching. Bad outcomes would also weigh much more heavily, leading people to require too much positive evidence before shifting to ambiguous options. In individual decision situations, losses often weigh 2 to 3 times as much as gains.

In addition, if one does not know what returns would have come from an ambiguous alternative, one cannot feel remorse for not having chosen it. Blame from others also plays an important role. In principal-agent relationships, bad outcomes often lead to criticism, and possibly legal consequences because of responsibility and accountability. Therefore, agents, such as financial advisors or medical practitioners, may experience an even higher asymmetry between bad and good payoffs. Most people, for that reason, have had many fewer positive learning experiences with ambiguity than rational sampling would provide.
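
A back-of-the-envelope version of that asymmetry (my own numbers, using the 2-to-3x loss weighting just mentioned): even an ambiguous option that is objectively a fair bet can feel like a losing one, so it never gets sampled and the learning never happens.

```python
# Illustrative arithmetic: how loss aversion suppresses exploration.
# An ambiguous option pays +1 or -1 with (unknown) roughly equal odds, so
# its objective expected value is 0 and a few exploratory tries are cheap.
LOSS_WEIGHT = 2.5  # losses felt 2-3x as strongly as gains (typical estimate)
p_gain = 0.5

objective_ev = p_gain * 1 - (1 - p_gain) * 1              # 0.0: a fair bet
felt_value = p_gain * 1 - (1 - p_gain) * LOSS_WEIGHT * 1  # -0.75: feels bad

print(objective_ev, felt_value)
# The felt value is negative, so the ambiguous option is never tried,
# and the information that might justify switching is never collected.
```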

It might be a good idea to try a new brand the next time you're at the store rather than just making the same choice over and over. Who knows, you might discover you like it better.

Economists Are Overconfident. So Are You.

The point of regression analysis is to make predictions based on past relationships. “My concern,” one of the authors of the paper said, “is that when reading economics journal articles you get the impression that the world is much more predictable than it is.”

What the paper's authors, Emre Soyer and Robin Hogarth, did was get 257 economists to read about a regression analysis that related independent variable X to dependent variable Y, then answer questions about the probabilities of various outcomes (example: if X is 1, what's the probability of Y being greater than 0.936?).

When the results were presented in the way empirical results usually are presented in economics journals — as the average outcomes of the regression followed by a few error terms — the economists did a really bad job of answering the questions. They paid too much attention to the averages, and too little to the uncertainties inherent in them, thereby displaying too much confidence.
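
To see what answering such a question actually requires, here is a rough sketch with invented regression output (the intercept, slope, and residual spread below are hypothetical, not the values from Soyer and Hogarth's materials). The point is that the answer depends on the scatter of the residuals around the fitted line, not just on the average relationship:

```python
# Sketch: turning regression output into an outcome probability.
# All numbers are made up for illustration.
from statistics import NormalDist

intercept, slope = 0.32, 1.00  # hypothetical fitted coefficients
residual_sd = 1.0              # hypothetical standard error of the estimate

x = 1.0
y_hat = intercept + slope * x  # point prediction: 1.32

# P(Y > 0.936 | X = 1), treating residuals as roughly normal and ignoring
# parameter uncertainty (reasonable for a large sample). With a residual
# SD this large, the "confident" point prediction says much less than
# it appears to.
prob = 1 - NormalDist(mu=y_hat, sigma=residual_sd).cdf(0.936)
print(f"point prediction: {y_hat:.2f}")
print(f"P(Y > 0.936 | X = 1) ≈ {prob:.2f}")  # ≈ 0.65, far from certainty
```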

When the economists were shown the numerical results plus scatter graphs of the same data, they did slightly better. The economists who were shown only the graphs and none of the numerical results, meanwhile, actually got most of the answers right, or close to right.

The bigger point here, which Soyer and Hogarth have elaborated in other research, is that we tend to understand probabilistic information much better when it's presented in visual form than if we're just shown the numbers. (This was also a key argument of Sam Savage's edifying and entertaining 2009 book The Flaw of Averages.) What's so interesting is to learn that statistically literate experts are just as likely to glom onto the point estimate and discount the uncertainty as, say, innumerate journalists reporting the results of political polls.

Still curious? Read the paper and The Flaw of Averages.

The Bias Against Creativity: Why People Desire But Reject Creative Ideas


Do people desire creative ideas? If so, why do we so naturally resist them?

We think of creativity as an important educational goal, so why does the research indicate that teachers dislike students who exhibit curiosity and creative thinking?

Three researchers took a stab at the answer:

We offer a new perspective to explain this puzzle. Just as people have deeply-rooted biases against people of a certain age, race or gender that are not necessarily overt (Greenwald & Banaji, 1995), so too can people hold deeply-rooted negative views of creativity that are not openly acknowledged. Revealing the existence and nature of a bias against creativity can help explain why people might reject creative ideas and stifle scientific advancement, even in the face of strong intentions to the contrary.

Creative ideas are novel and useful. Yet idea-evaluators (decision-makers) have a hard time “viewing novelty and practicality as attributes that go hand in hand,” and, in fact, often view them as inversely related.

When endorsing a novel idea, people can experience failure, perceptions of risk, social rejection when expressing the idea to others, and uncertainty about when their idea will reach completion.

And we generally like to avoid uncertainty:

Although the positive associations with creativity are typically the focus of attention both among scholars and practitioners, the negative associations may also be activated when people evaluate a creative idea. For example, research on associative thinking suggests that strong uncertainty feelings may make the negative attributes of creativity, particularly those related to uncertainty, more salient.

The authors conclude:

Our results show that regardless of how open minded people are, when they feel motivated to reduce uncertainty either because they have an immediate goal of reducing uncertainty, or feel uncertain generally, this may bring negative associations with creativity to mind which result in lower evaluations of a creative idea.

Source: The Bias Against Creativity: Why People Desire But Reject Creative Ideas

Thinking about Thinking


While Daniel Kahneman's book, Thinking, Fast and Slow, gets all the attention, he's also written a few articles that might catch your interest.

Optimistic Bias

In terms of its consequences for decisions, the optimistic bias may well be the most significant cognitive bias. Because optimistic bias is both a blessing and a risk, you should be both happy and wary if you are temperamentally optimistic.

Competition Neglect

Colin Camerer, who coined the concept of competition neglect, illustrated it with a quote from a chairman of Disney Studios. Asked why so many big-budget movies are released on the same holidays, he said, “Hubris. Hubris. If you only think about your own business, you think, ‘I’ve got a good story department, I’ve got a good marketing department’ … and you don’t think that everybody else is thinking the same way.” The competition isn’t part of the decision. In other words, a difficult question has been replaced by an easier one.

This is a kind of dodge we all make, without even noticing. We use fast, intuitive thinking — System 1 thinking — whenever possible, and switch over to more deliberate and effortful System 2 thinking only when we truly recognize that the problem at hand isn’t an easy one.

The question that studio executives needed to answer is this: Considering what others will do, how many people will see our film? The question they did consider is simpler and refers to knowledge that is most easily available to them: Do we have a good film and a good organization to market it?

Appreciation of uncertainty

As Nassim Taleb, the author of “The Black Swan,” has argued, inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. However, optimism is highly valued; people and companies reward the providers of misleading information more than they reward truth tellers. An unbiased appreciation of uncertainty is a cornerstone of rationality — but it isn’t what organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred approach.

Theory-induced blindness

Bernoulli invented psychophysics to explain this aversion to risk. His idea was straightforward: People’s choices are based not on dollar values but on the psychological values of outcomes, their utilities. The psychological value of a gamble is therefore not the weighted average of its possible dollar outcomes; it is the average of the utilities of these outcomes, each weighted by its probability.
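
In symbols (my own gloss, not Kahneman's text): for a gamble paying x_i with probability p_i, Bernoulli replaces expected value with expected utility, typically computed with a concave utility function:

```latex
% Expected value vs. Bernoulli's expected utility (illustrative gloss).
\[
  \mathrm{EV} = \sum_i p_i \, x_i
  \qquad\text{vs.}\qquad
  \mathrm{EU} = \sum_i p_i \, u(x_i)
\]
% Example with u(x) = sqrt(x): a 50/50 gamble between $0 and $100 has
% EV = 50, but EU = 0.5*sqrt(0) + 0.5*sqrt(100) = 5 = u(25), so the
% gamble is worth only a sure $25; this is risk aversion.
```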

…That Bernoulli’s theory prevailed for so long is even more remarkable when you see that, in fact, it is seriously flawed. The errors are found not in what it asserts explicitly, but in what it tacitly assumes.

The mystery is how a conception that is vulnerable to such obvious counterexamples survived for so long. I can explain it only by a weakness of the scholarly mind that I have often observed in myself. I call it theory-induced blindness: Once you have accepted a theory, it is extraordinarily difficult to notice its flaws. As the psychologist Daniel Gilbert has observed, disbelieving is hard work.

Reference points and loss aversion

The economists Devin Pope and Maurice Schweitzer, at the University of Pennsylvania, suggest that golf provides the perfect example of a reference point: par. For a professional golfer, a birdie (one stroke under par) is a gain, and a bogey (one stroke over par) is a loss. Failing to make par is a loss, but missing a birdie putt is a forgone gain, not a loss. Pope and Schweitzer analyzed more than 2.5 million putts to test their prediction that players would try harder when putting for par than when putting for a birdie.

They were right. Whether the putt was easy or hard, at every distance from the hole, players were more successful when putting for par than for a birdie.

…If you are set to look for it, the asymmetric intensity of the motives to avoid losses and to achieve gains shows up almost everywhere. It is an ever-present feature of negotiations, especially of renegotiations of an existing contract, the typical situation in labor negotiations, and in international discussions of trade or arms limitations. Loss aversion creates an asymmetry that makes agreements difficult to reach.

Negotiations over a shrinking pie are especially difficult because they require an allocation of losses. People tend to be much more easygoing when they bargain over an expanding pie.

In the world of territorial animals, the principle of loss aversion explains the success of defenders. A biologist observed that “when a territory holder is challenged by a rival, the owner almost always wins the contest — usually within a matter of seconds.”

Future Babble: Why expert predictions fail and why we believe them anyway

Future Babble has come out to mixed reviews. I think the book would interest anyone seeking wisdom.

Here are some of my notes:

First a little background: Predictions fail because the world is too complicated to be predicted with accuracy and we're wired to avoid uncertainty. However, we shouldn't blindly believe experts. The world is divided into two: foxes and hedgehogs. The fox knows many things whereas the hedgehog knows one big thing. Foxes beat hedgehogs when it comes to making predictions.

  • What we should ask is, in a non-linear world, why would we think oil prices can be predicted? Practically since the dawn of the oil industry in the nineteenth century, experts have been forecasting the price of oil. They've been wrong ever since. And yet this dismal record hasn't caused us to give up on the enterprise of forecasting oil prices.
  • One of psychology's fundamental insights, wrote psychologist Daniel Gilbert, is that judgements are generally the products of non-conscious systems that operate quickly, on the basis of scant evidence, and in a routine manner, and then pass their hurried approximations to consciousness, which slowly and deliberately adjusts them. … (one consequence of this is that) Appearance equals reality. In the ancient environment in which our brains evolved, that was a good rule, which is why it became hard-wired into the brain and remains there to this day. (An example of this:) as psychologists have shown, people often stereotype “baby-faced” adults as innocent, helpless, and needy.
  • We have a hard time with randomness. If we try, we can understand it intellectually, but as countless experiments have shown, we don't get it intuitively. This is why someone who plunks one coin after another into a slot machine without winning will have a strong and growing sense—the gambler's fallacy—that a jackpot is “due,” even though every result is random and therefore unconnected to what came before. … and people believe that a sequence of random coin tosses that goes “THTHHT” is far more likely than the sequence “THTHTH” even though they are equally likely. (A quick simulation after these notes makes the point.)
  • People are particularly disinclined to see randomness as the explanation for an outcome when their own actions are involved. Gamblers rolling dice tend to concentrate and throw harder for higher numbers, softer for lower. Psychologists call this the “illusion of control.” … they also found the illusion is stronger when it involves prediction. In a sense, the “illusion of control” should be renamed the “illusion of prediction.”
  • Overconfidence is a universal human trait closely related to an equally widespread phenomenon known as “optimism bias.” Ask smokers about the risk of getting lung cancer from smoking and they'll say it's high. But their risk? Not so high. … The evolutionary advantage of this bias is obvious: It encourages people to take action and makes them more resilient in the face of setbacks.
  • … How could so many experts have been so wrong? … A crucial component of the answer lies in psychology. For all the statistics and reasoning involved, the experts derived their judgements, to one degree or another, from what they felt to be true. And in doing so they were fooled by a common bias. … This tendency to take current trends and project them into the future is the starting point of most attempts to predict. Very often, it's also the end point. That's not necessarily a bad thing. After all, tomorrow typically is like today. Current trends do tend to continue. But not always. Change happens. And the further we look into the future, the more opportunity there is for current trends to be modified, bent, or reversed. Predicting the future by projecting the present is like driving with no hands. It works while you are on a long stretch of straight road but even a gentle curve is trouble, and a sharp turn always ends in a flaming wreck.
  • When people attempt to judge how common something is—or how likely it is to happen in the future—they attempt to think of an example of that thing. If an example is recalled easily, it must be common. If it's harder to recall, it must be less common. … Again, this is not a conscious calculation. The “availability heuristic” is a tool of the unconscious mind.
  • “deviating too far from consensus leaves one feeling potentially ostracized from the group, with the risk that one may be terminated.” (Robert Shiller) … It's tempting to think that only ordinary people are vulnerable to conformity, that esteemed experts could not be so easily swayed. Tempting, but wrong. As Shiller demonstrated, “groupthink” is very much a disease that can strike experts. In fact, psychologist Irving Janis coined the term “groupthink” to describe expert behavior. In his 1972 classic, Victims of Groupthink, Janis investigated four high-level disasters: the defence of Pearl Harbour, the Bay of Pigs invasion, and the escalation of the wars in Korea and Vietnam, and demonstrated that conformity among highly educated, skilled, and successful people working in their fields of expertise was a root cause in each case.
  • (On corporate use of scenario planning) … Scenarios are not predictions, emphasizes Peter Schwartz, the guru of scenario planning. “They are tools for testing choices.” The idea is to have a clever person dream up a number of very different futures, usually three to four. … Managers then consider the implications of each, forcing them out of the rut of the status quo, and thinking about what they would do if confronted with real change. The ultimate goal is to make decisions that would stand up well in a wide variety of contexts. No one denies there may be some value in such exercises. But how much value? The consultants who offer scenario planning services are understandably bullish, but ask them for evidence and they typically point to examples of scenarios that accurately foreshadowed the future. That is silly, frankly. For one thing, it contradicts their claim that scenarios are not predictions; for another, all the misses would have to be considered, and the misses vastly outnumber the hits. … Consultants also cite the enormous popularity of scenario planning as proof of its enormous value… Lack of evidence aside, there are more disturbing reasons to be wary of scenarios. Remember that what drives the availability heuristic is not how many examples the mind can recall but how easily they are recalled. … And what are scenarios? Vivid, colourful, dramatic stories. Nothing could be easier to remember or recall. And so being exposed to a dramatic scenario about (whatever)… will make the depicted events feel much more likely to happen.
  • (on not having control) At its core, torture is a process of psychological destruction, and that process almost always begins with the torturer explicitly telling the victim he is powerless. “I decide when you can eat and sleep. I decide when you suffer, how you suffer, if it will end. I decide if you live or die.” … Knowing what will happen in the future is a form of control, even if we cannot change what will happen. … Uncertainty is potent… people who experienced the mild-but-unpredictable shocks experienced much more fear than those who got the strong-but-predictable shocks.
  • Our profound aversion to uncertainty helps explain what would otherwise be a riddle: Why do people pay so much attention to dark and scary predictions? Why do gloomy forecasts so often outnumber optimistic predictions, take up more media space, and sell more books? Part of this predilection for gloom is simply an outgrowth of what is sometimes called negativity bias: our attention is drawn more swiftly by bad news or images, and we are more likely to remember them than cheery information. … People whose brains gave priority to bad news were much less likely to be eaten by lions or die some other unpleasant death. … (negative) predictions are supported by our intuitive pessimism, so they feel right to us. And that conclusion is bolstered by our attraction to certainty. As strange as it sounds, we want to believe the expert predicting a dark future because being certain of it is less tormenting than merely suspecting it. Certainty is always preferable to uncertainty, even when what's certain is disaster.
  • Researchers have also shown that financial advisors who express considerable confidence in their stock forecasts are more trusted than those who are less confident, even when their objective records are the same. … This “confidence heuristic,” like the availability heuristic, isn't necessarily a conscious decision path. We may not actually say to ourselves “she's so sure of herself she must be right”…
  • (on our love for stories) Confirmation bias also plays a critical role for the very simple reason that none of us is a blank slate. Every human brain is a vast warehouse of beliefs and assumptions about the world and how it works. Psychologists call these “schemas.” We love stories that fit our schemas; they're the cognitive equivalent of beautiful music. But a story that doesn't fit – a story that contradicts basic beliefs – is dissonant.
  • … What makes this mass delusion possible is the different emphasis we put on predictions that hit and those that miss. We ignore misses, even when they lie scattered by the dozen at our feet; we celebrate hits, even when we have to hunt for them and pretend there was more to them than luck.
  • By giving us the sense that we should have predicted what is now the present, or even that we actually did predict it when we did not, it strongly suggests that we can predict the future. This is an illusion, and yet it seems only logical – which makes it a particularly persuasive illusion.
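
On the coin-toss point above, a quick simulation (my own, not from the book) confirms that the “irregular” and “regular” sequences are exactly equally likely: every specific six-flip sequence has probability (1/2)^6 = 1/64 ≈ 0.016.

```python
# Quick check (illustrative): "THTHHT" and "THTHTH" occur equally often,
# despite the intuition that the irregular-looking sequence is more likely.
import random

TRIALS = 200_000

def frequency(target):
    hits = sum(
        "".join(random.choice("HT") for _ in range(6)) == target
        for _ in range(TRIALS)
    )
    return hits / TRIALS

for target in ("THTHHT", "THTHTH"):
    print(target, frequency(target))  # both print ≈ 0.0156
```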

If you like the notes you should buy Future Babble. Like the book summaries? Check out my notes from Adapt: Why Success Always Starts With Failure, The Ambiguities of Experience, On Leadership.

Subscribe to Farnam Street via Twitter, email, or RSS.
