Author: Farnam Street

Poker, Speeding Tickets, and Expected Value: Making Decisions in an Uncertain World

“Take the probability of loss times the amount of possible loss from the probability of gain times the amount of possible gain. That is what we're trying to do. It's imperfect but that's what it's all about.”

— Warren Buffett

You can train your brain to think like CEOs, professional poker players, investors, and others who make tricky decisions in an uncertain world by weighing probabilities.

All decisions involve potential tradeoffs and opportunity costs. The question is, how can we make the best possible choices when the factors involved are often so complicated and confusing? How can we determine which statistics and metrics are worth paying attention to? How do we think about averages?

Expected value is one of the simplest tools you can use to think better. While not a natural way of thinking for most people, it instantly turns the world into shades of grey by forcing us to weigh probabilities and outcomes. Once we've mastered it, our decisions become supercharged. We know which risks to take, when to quit projects, when to go all in, and more.

Expected value refers to the long-run average of a random variable.

If you flip a fair coin ten times, the heads-to-tails ratio will probably not be exactly equal. If you flip it one hundred times, the ratio will be closer to 50:50, though again not exactly. But over a very large number of iterations, you can expect heads to come up half the time and tails the other half. The law of large numbers dictates that the results will, in the long run, converge on the expected value, even if the first few flips seem lopsided.

The more coin flips, the closer you get to the 50:50 ratio. If you bet a sum of money on a coin flip, the potential winnings on a fair coin have to be bigger than your potential loss to make the expected value positive.
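
If you want to see this convergence for yourself, a few lines of Python will do. This is a minimal sketch; the exact ratios differ from run to run, but they drift toward 0.5 as the number of flips grows.

```python
import random

# Flip a fair coin n times and return the fraction of heads.
def heads_ratio(n):
    heads = sum(random.random() < 0.5 for _ in range(n))
    return heads / n

for n in [10, 100, 10_000, 1_000_000]:
    print(f"{n:>9} flips: {heads_ratio(n):.4f} heads")
```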

We make many expected-value calculations without even realizing it. If we decide to stay up late and have a few drinks on a Tuesday, we regard the expected value of an enjoyable evening as higher than the expected costs the following day. If we decide to always leave early for appointments, we weigh the expected value of being on time against the frequent instances when we arrive early. When we take on work, we view the expected value in terms of income and other career benefits as higher than the cost in terms of time and/or sanity.

Likewise, anyone who reads a lot knows that most books they choose will have minimal impact on them, while a few books will change their lives and be of tremendous value. Looking at the required time and money as an investment, books have a positive expected value (provided we choose them with care and make use of the lessons they teach).

These decisions might seem obvious. But the math behind them would be somewhat complicated if we tried to sit down and calculate it. Who pulls out a calculator before deciding whether to open a bottle of wine (certainly not me) or walk into a bookstore?

The factors involved are impossible to quantify in a non-subjective manner – like trying to explain how to catch a baseball. We just have a feel for them. This expected-value analysis is unconscious – something to consider if you have ever labeled yourself as “bad at math.”

Parking Tickets

Another example of expected value is parking tickets. Let's say that a parking spot costs $5 and the fine for not paying is $10. If you can expect to be caught one-third of the time, why pay for parking? The expected cost of skipping payment is lower, so the fine is a weak disincentive. You can park without paying three times and expect only $10 in fines, instead of paying $15 for three parking spots. But if the fine is $100, skipping payment makes sense only if the probability of getting caught is lower than one in twenty. This is why fines tend to seem excessive: they cover the people who are not caught while giving everyone an incentive to pay.
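
The arithmetic is easy to check. Here is a minimal sketch in Python, using the numbers from the example above:

```python
# Expected cost of skipping the $5 fee when the fine is $10
# and the chance of being caught is one in three.
p_caught = 1 / 3
fee = 5

print(p_caught * 10 < fee)    # True:  $3.33 expected fine, cheaper than paying
print(p_caught * 100 < fee)   # False: $33.33 expected fine, now pay the $5
```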

Consider speeding tickets. Here, the expected value can be more abstract, encompassing different factors. If speeding on the way to work saves 15 minutes, then a monthly $100 fine might seem worthwhile to some people. For most of us, though, a weekly fine would mean that speeding has a negative expected value. Add in other disincentives (such as the loss of your driver's license), and speeding is not worth it. So the calculation is not just financial; it takes into account other tradeoffs as well.

The same goes for free samples and trial periods on subscription services. Many companies (such as Graze, Blue Apron, and Amazon Prime) offer generous free trials. How can they afford to do this? Again, it comes down to expected value. The companies know how much the free trials cost them. They also know the probability of someone's paying afterwards and the lifetime value of a customer. Basic math reveals why free trials are profitable. Say that a free trial costs the company $10 per person, and one in ten people then sign up for the paid service, going on to generate $150 in profits. The expected value is positive. If only one in twenty people sign up, the company needs to find a cheaper free trial or scrap it.
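
The same check works for the free-trial math, using the hypothetical figures above:

```python
# Expected value of one free trial, per person.
def trial_ev(trial_cost, conversion_rate, lifetime_profit):
    return conversion_rate * lifetime_profit - trial_cost

print(trial_ev(10, 1 / 10, 150))   #  5.0  -> the trial pays for itself
print(trial_ev(10, 1 / 20, 150))   # -2.5  -> make it cheaper or scrap it
```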

Similarly, expected value applies to services that offer a free “lite” version (such as Buffer and Spotify). Doing so costs them a small amount or even nothing. Yet it increases the chance of someone's deciding to pay for the premium version. For the expected value to be positive, the combined cost of the people who never upgrade needs to be lower than the profit from the people who do pay.

Lottery tickets prove useless when viewed through the lens of expected value. If a ticket costs $1 and there is a possibility of winning $500,000, it might seem as if the expected value of the ticket is positive. But it is almost always negative. If one million people purchase a ticket, the expected value is $0.50. That difference is the profit that lottery companies make. Only on sporadic occasions is the expected value positive, even though the probability of winning remains minuscule.
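
And the lottery arithmetic, spelled out:

```python
# A $1 ticket, one winner among a million tickets, a $500,000 jackpot.
gross_ev = 500_000 / 1_000_000   # $0.50 of expected winnings per ticket
net_ev = gross_ev - 1            # -$0.50 once the ticket price is counted
print(gross_ev, net_ev)
```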

Failing to understand expected value is a common thinking error. Getting a grasp of it can help us overcome many limitations and cognitive biases.

“Constantly thinking in expected value terms requires discipline and is somewhat unnatural. But the leading thinkers and practitioners from somewhat varied fields have converged on the same formula: focus not on the frequency of correctness, but on the magnitude of correctness.”

— Michael Mauboussin

Expected Value and Poker

Let's look at poker. How do professional poker players manage to win large sums of money and hold impressive track records? Well, we can be certain that the answer isn't all luck, although there is some of that involved.

Professional players rely on mathematical mental models that create order among random variables. Although these models are basic, it takes extensive experience to create the fingerspitzengefühl (“fingertips feeling,” or instinct) necessary to use them.

A player needs to make correct calculations every minute of a game with an automaton-like mindset. Emotions and distractions can corrupt the accuracy of the raw math.

In a game of poker, the expected value is the average return on each dollar invested in the pot. Each time a player makes a bet or call, they are weighing the probability of winning against the amount they must invest. If a player is risking $100 with a 1 in 5 probability of success, the pot must contain at least $500 for the call to break even: the expected return has to at least match what the player stands to lose. If the pot contains only $300 at that same 1 in 5 probability, the expected value is negative. The idea is that even if this tactic is unsuccessful at times, in the long run, the player will profit.
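
Here is a sketch of that calculation, using the simplified accounting above (the pot figure includes the call and is returned whole on a win; real pot-odds calculations usually separate the existing pot from the amount of the call):

```python
# EV of a call: win the whole pot with probability p_win,
# having paid the call either way.
def call_ev(pot, call, p_win):
    return p_win * pot - call

print(call_ev(500, 100, 1 / 5))   #   0.0 -> break-even, the floor for a "safe" call
print(call_ev(300, 100, 1 / 5))   # -40.0 -> negative expected value, fold
```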

Expected-value analysis gives players a clear idea of probabilistic payoffs. Successful poker players can win millions one week, then make nothing or lose money the next, depending on the probability of winning. Even the best possible hands can lose due to simple probability. With each move, players also need to use Bayesian updating to adapt their calculations, because sticking with a prior figure could prove disastrous. Casinos make their fortunes from people who bet on situations with a negative expected value.

Expected Value and the Ludic Fallacy

In The Black Swan, Nassim Taleb explains the difference between everyday randomness and randomness in the context of a game or casino. Taleb coined the term “ludic fallacy” to refer to “the misuse of games to model real-life situations.” (Or, as the website logicallyfallacious.com puts it: the assumption that flawless statistical models apply to situations where they don’t actually apply.)

In Taleb’s words, gambling is “sterilized and domesticated uncertainty. In the casino, you know the rules, you can calculate the odds… ‘The casino is the only human venture I know where the probabilities are known, Gaussian (i.e., bell-curve), and almost computable.’ You cannot expect the casino to pay out a million times your bet, or to change the rules abruptly during the game….”

Games like poker have a defined, calculable expected value. That’s because we know the outcomes, the cards, and the math. Most decisions are more complicated. If you decide to bet $100 that it will rain tomorrow, the expected value of the wager is incalculable. The factors involved are too numerous and complex to compute. Relevant factors do exist; you are more likely to win the bet if you live in England than if you live in the Sahara, for example. But that doesn't rule out Black Swan events, nor does it give you the neat probabilities that exist in games.

In short, there is a key distinction between Knightian risks, which are computable because we have enough information to calculate the odds, and Knightian uncertainty, which is non-computable because we don’t have enough information to calculate the odds accurately. (This distinction between risk and uncertainty is based on the writings of economist Frank Knight.) Poker falls into the former category; real life falls into the latter. If we take the concept of expected value literally and plan only for the expected, we will run into some serious problems.

As Taleb writes in Fooled By Randomness:

Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance. Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem or a brain teaser. Mother nature does not tell you how many holes there are on the roulette table, nor does she deliver problems in a textbook way (in the real world one has to guess the problem more than the solution).

The Monte Carlo Fallacy

Even in the domesticated environment of a casino, probabilistic thinking can go awry if the principle of expected value is forgotten. This famously occurred at the Monte Carlo Casino in 1913. A group of gamblers lost millions when the roulette wheel landed on black 26 times in a row. That particular sequence is no more or less likely than any of the other 67,108,863 possible black-and-red permutations of 26 spins, but the people present kept thinking, “It has to be red next time.” They saw the likelihood of the wheel landing on red as higher each time it landed on black. What sense does that make? A roulette wheel does not remember the color it landed on last time. The likelihood of either color is the same on every spin, regardless of previous spins: just under 50%, because of the green zero. And since the chance of winning an even-money bet is just under half, a payout of twice the stake leaves every spin with a slightly negative expected value.

“A lot of people start out with a 400-horsepower motor but only get 100 horsepower of output. It's way better to have a 200-horsepower motor and get it all into output.”

— Warren Buffett

Given all the casinos and roulette tables in the world, the Monte Carlo incident had to happen at some point. Perhaps some day a roulette wheel will land on red 26 times in a row and the incident will repeat. The gamblers involved did not consider the negative expected value of each bet they made. We know this mistake as the Monte Carlo fallacy (or the “gambler's fallacy” or “the fallacy of the maturity of chances”): the assumption that the outcomes of prior independent events influence future outcomes, even though those outcomes are just as independent. In other words, people assume that “a random process becomes less random and more predictable as it is repeated”1.
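
A quick simulation shows what the gamblers missed. This sketch treats the wheel as a fair 50/50 to keep the independence argument clean; a real wheel is slightly worse for the bettor because of the green zero.

```python
import random

# After a streak of k blacks, is red any more likely on the next spin?
def red_after_black_streak(k, n_spins=1_000_000):
    spins = [random.random() < 0.5 for _ in range(n_spins)]  # True = black
    followers = [spins[i + k] for i in range(n_spins - k)
                 if all(spins[i:i + k])]                     # spins right after k blacks
    return sum(not s for s in followers) / len(followers)

print(red_after_black_streak(5))   # ~0.5 -- the streak tells you nothing
```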

It's a common error. People who play the lottery for years without success think that their chance of winning rises with each ticket, but the expected value is unchanged between iterations. Amos Tversky and Daniel Kahneman consider this kind of thinking a component of the representativeness heuristic, stating that the more we believe we control random events, the more likely we are to succumb to the Monte Carlo fallacy.

Magnitude over Frequency

Steven Crist, in his chapter in Bet with the Best, offers an example of how an expected-value mindset can be applied. Consider a hypothetical race with four horses. If you’re trying to maximize return on investment, you might want to avoid the horse with a high likelihood of winning. Crist writes,

The point of this exercise is to illustrate that even a horse with a very high likelihood of winning can be either a very good or a very bad bet, and that the difference between the two is determined by only one thing: the odds.2

Everything comes down to payoffs. A horse with a 50% chance of winning might be a good bet, but it depends on the payoff. The same holds for a 100-to-1 longshot. It's not the frequency of winning but the magnitude of the win that matters.
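
To make that concrete, here is a sketch with hypothetical odds; payout_per_dollar is the total returned per dollar staked on a win:

```python
# Magnitude over frequency: the favorite can be the worse bet.
def bet_ev(p_win, payout_per_dollar):
    return p_win * payout_per_dollar - 1   # per dollar staked

print(bet_ev(0.50, 1.80))    # -0.10: wins half the time, still a bad price
print(bet_ev(0.02, 101.0))   #  1.02: wins rarely, great price
```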

Error Rates, Averages, and Variability

When Bill Gates walks into a room with 20 people, the average wealth per person in the room quickly goes beyond a billion dollars. It doesn't matter if the 20 people are wealthy or not; Gates's wealth is off the charts and distorts the results.

An old joke tells of the man who drowned in a river that is, on average, three feet deep. If you're deciding whether to cross a river and can't swim, the range of depths matters a heck of a lot more than the average depth.

The Use of Expected Value: How to Make Decisions in an Uncertain World

Thinking in terms of expected value requires discipline and practice. And yet, the top performers in almost any field think in terms of probabilities. While this isn't natural for most of us, once you implement the discipline of the process, you'll see the quality of your thinking and decisions improve.

In poker, players can predict the likelihood of a particular outcome. In the vast majority of cases, we cannot predict the future with anything approaching accuracy. So what use is expected value outside gambling? It turns out, quite a lot. Recognizing how expected value works puts any of us at an advantage. We can mentally leap through various scenarios and understand how they affect outcomes.

Expected value takes into account wild deviations. Averages are useful, but they have limits, as the man who tried to cross the river discovered. When making predictions about the future, we need to consider the range of outcomes: the greater the possible variance from the average, the wider the range of outcomes our decisions should account for.

There's a saying in the design world: when you design for the average, you design for no one. Large deviations can mean more risk, which is not always a bad thing. So expected-value calculations take deviations into account. If we can make decisions with a positive expected value and the lowest possible risk, we open ourselves to large benefits.

Investors use expected value to make decisions. Choices with a positive expected value and minimal risk of losing money are wise. Even if some losses occur, the net gain should be positive over time. In investing, unlike in poker, the potential losses and gains cannot be calculated in exact terms. Expected-value analysis reveals opportunities that people who just use probabilistic thinking often miss. A trade with a low probability of success can still carry a high expected value. That's why it is crucial to have a large number of robust mental models. As useful as probabilistic thinking can be, it has far more utility when combined with expected value.

Understanding expected value is also an effective way to overcome the sunk costs fallacy. Many of our decisions are based on non-recoverable past investments of time, money, or resources. These investments are irrelevant; we can't recover them, so we shouldn't factor them into new decisions. Sunk costs push us toward situations with a negative expected value. For example, consider a company that has invested considerable time and money in the development of a new product. As the launch date nears, they receive irrefutable evidence that the product will be a failure. Perhaps research shows that customers are uninterested, or a competitor launches a similar, better product. The sunk costs fallacy would lead them to release their product anyway. Even if they take a loss. Even if it damages their reputation. After all, why waste the money they spent developing the product? Here's why: because the product has a negative expected value, which will only worsen their losses. An escalation of commitment will only increase sunk costs.
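
A sketch with hypothetical numbers shows why the sunk cost drops out of the decision:

```python
# $2M already spent on development: unrecoverable under either choice,
# so it appears nowhere in the comparison.
launch_revenue = 300_000   # expected lifetime revenue of the doomed product
launch_costs = 500_000     # remaining manufacturing, marketing, support

ev_launch = launch_revenue - launch_costs   # -200,000
ev_scrap = 0

print("launch" if ev_launch > ev_scrap else "scrap")   # scrap
# Adding the sunk $2M to both branches would not change the ranking.
```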

When we try to justify a prior expense, calculating the expected value can prevent us from worsening the situation. The sunk costs fallacy robs us of our most precious resource: time. Each day we are faced with the choice between continuing and quitting numerous endeavors. Expected-value analysis reveals where we should continue, and where we should cut our losses and move on to a better use of time and resources. It's an efficient way to work smarter, and not engage in unnecessary projects.

Thinking in terms of expected value will make you feel awkward when you first try it. That's the hardest thing about it; you need to practice it a while before it becomes second nature. Once you get the hang of it, you'll see that it's valuable in almost every decision. That's why the most rational people in the world constantly think about expected value. They've uncovered the key insight that the magnitude of correctness matters more than its frequency. And yet, human nature is such that we're happier when we're frequently right.

Footnotes
  • 1

    From https://rationalwiki.org/wiki/Gambler’s_fallacy, accessed on 11 January 2018.

  • 2

    Steven Crist, “Crist on Value,” in Andrew Beyer et al., Bet with the Best: All New Strategies From America’s Leading Handicappers (New York: Daily Racing Form Press, 2001), 63-64.

Complexity Bias: Why We Prefer Complicated to Simple

Complexity bias is a cognitive bias that leads us to give undue credence to complex concepts.

Faced with two competing hypotheses, we are likely to choose the most complex one. That’s usually the option with the most assumptions and regressions. As a result, when we need to solve a problem, we may ignore simple solutions — thinking “that will never work” — and instead favor complex ones.

To understand complexity bias, we need first to establish the meaning of three key terms associated with it: complexity, simplicity, and chaos.

Complexity, like pornography, is hard to define when we’re put on the spot, although most of us recognize it when we see it. The Cambridge Dictionary defines complexity as “the state of having many parts and being difficult to understand or find an answer to.” The definition of simplicity is the inverse: “something [that] is easy to understand or do.” Chaos is defined as “a state of total confusion with no order.”

“Life is really simple, but we insist on making it complicated.”

— Confucius

Complex systems contain individual parts that combine to form a collective that often can’t be predicted from its components. Consider humans. We are complex systems. We’re made of tens of trillions of cells, and yet we are so much more than the aggregation of our cells. You’d never predict what we’re like or who we are from looking at our cells.

Complexity bias is our tendency to take something that is easy to understand, or something we face in a state of confusion, and view it as having many parts that are difficult to understand.

We often find it easier to face a complex problem than a simple one.

A person who feels tired all the time might insist that their doctor check their iron levels while ignoring the fact that they are unambiguously sleep deprived. Someone experiencing financial difficulties may stress over the technicalities of their telephone bill while ignoring the large sums of money they spend on cocktails.

Marketers make frequent use of complexity bias.

They do this by incorporating confusing language or insignificant details into product packaging or sales copy. Most people who buy “ammonia-free” hair dye, or a face cream which “contains peptides,” don’t fully understand the claims. Terms like these often mean very little, but we see them and imagine that they signify a product that’s superior to alternatives.

How many of you know what probiotics really are and how they interact with gut flora?

Meanwhile, we may also see complexity where only chaos exists. This tendency manifests in many forms, such as conspiracy theories, superstition, folklore, and logical fallacies. The distinction between complexity and chaos is not a semantic one. When we imagine that something chaotic is in fact complex, we are seeing it as having an order and more predictability than is warranted. In fact, there is no real order, and prediction is incredibly difficult at best.

Complexity bias is interesting because the majority of cognitive biases occur in order to save mental energy. For example, confirmation bias enables us to avoid the effort associated with updating our beliefs. We stick to our existing opinions and ignore information that contradicts them. Availability bias is a means of avoiding the effort of considering everything we know about a topic. It may seem like the opposite is true, but complexity bias is, in fact, another cognitive shortcut. By opting for impenetrable solutions, we sidestep the need to understand. Of the fight-or-flight responses, complexity bias is the flight response. It is a means of turning away from a problem or concept and labeling it as too confusing. If you think something is harder than it is, you surrender your responsibility to understand it.

“Most geniuses—especially those who lead others—prosper not by deconstructing intricate complexities but by exploiting unrecognized simplicities.”

— Andy Benoit

Faced with too much information on a particular topic or task, we see it as more complex than it is. Often, understanding the fundamentals will get us most of the way there. Software developers often find that 90% of the code for a project takes about half the allocated time. The remaining 10% takes the other half. Writing — and any other sort of creative work — is much the same. When we succumb to complexity bias, we are focusing too hard on the tricky 10% and ignoring the easy 90%.

Research has revealed our inherent bias towards complexity.

In a 1989 paper entitled “Sensible reasoning in two tasks: Rule discovery and hypothesis evaluation,” Hilary F. Farris and Russell Revlin evaluated the topic. In one study, participants were asked to establish an arithmetic rule. They received a set of three numbers (such as 2, 4, 6) and tried to generate a hypothesis by asking the experimenter if other number sequences conformed to the rule. Farris and Revlin wrote, “This task is analogous to one faced by scientists, with the seed triple functioning as an initiating observation, and the act of generating the triple is equivalent to performing an experiment.”

The actual rule was simple: list any three ascending numbers.

The participants could have said anything from “1, 2, 3” to “3, 7, 99” and been correct. It should have been easy for the participants to guess this, but most of them didn’t. Instead, they came up with complex rules for the sequences. (Also see Falsification of Your Best Loved Ideas.)
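
A small sketch shows why the task fools people: many rules, simple and complex alike, fit the seed triple, and only further tests separate them. The candidate rules below are illustrative, not the ones participants actually proposed.

```python
# Several hypotheses are consistent with the seed triple (2, 4, 6).
rules = {
    "any ascending": lambda a, b, c: a < b < c,   # the actual rule
    "even, step 2": lambda a, b, c: a % 2 == 0 and b == a + 2 and c == b + 2,
    "arithmetic": lambda a, b, c: b - a == c - b,
}

for name, rule in rules.items():
    print(f"{name:>14}: seed={rule(2, 4, 6)}, probe={rule(3, 7, 99)}")
# All three pass the seed; only "any ascending" also passes (3, 7, 99).
```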

A paper by Helena Matute looked at how intermittent reinforcement leads people to see complexity in chaos. Three groups of participants were placed in rooms and told that a loud noise would play from time to time. The volume, length, and pattern of the sound were identical for each group. Group 1 (Control) was told to sit and listen to the noises. Group 2 (Escape) was told that there was a specific action they could take to stop the noises. Group 3 (Yoked) was told the same as Group 2, but in their case, there was actually nothing they could do.

Matute wrote:

Yoked participants received the same pattern and duration of tones that had been produced by their counterparts in the Escape group. The amount of noise received by Yoked and Control subjects depends only on the ability of the Escape subjects to terminate the tones. The critical factor is that Yoked subjects do not have control over reinforcement (noise termination) whereas Escape subjects do, and Control subjects are presumably not affected by this variable.

The result? Not one member of the Yoked group realized that they had no control over the sounds. Many members came to repeat particular patterns of “superstitious” behavior. Indeed, the Yoked and Escape groups had very similar perceptions of task controllability. Faced with randomness, the participants saw complexity.

Does that mean the participants were stupid? Not at all. We all exhibit the same superstitious behavior when we believe we can influence chaotic or simple systems.

Funnily enough, animal studies have revealed much the same. In particular, consider B.F. Skinner’s well-known research on the effects of random rewards on pigeons. Skinner placed hungry pigeons in cages equipped with a random-food-delivery mechanism. Over time, the pigeons came to believe that their behavior affected the food delivery. Skinner described this as a form of superstition. One bird spun in counterclockwise circles. Another butted its head against a corner of the cage. Other birds swung or bobbed their heads in specific ways. Although there is some debate as to whether “superstition” is an appropriate term to apply to birds, Skinner’s research shed light on the human tendency to see things as being more complex than they actually are.

Skinner wrote (in “‘Superstition’ in the Pigeon,” Journal of Experimental Psychology, 38):

The bird behaves as if there were a causal relation between its behavior and the presentation of food, although such a relation is lacking. There are many analogies in human behavior. Rituals for changing one's fortune at cards are good examples. A few accidental connections between a ritual and favorable consequences suffice to set up and maintain the behavior in spite of many unreinforced instances. The bowler who has released a ball down the alley but continues to behave as if he were controlling it by twisting and turning his arm and shoulder is another case in point. These behaviors have, of course, no real effect upon one's luck or upon a ball half way down an alley, just as in the present case the food would appear as often if the pigeon did nothing—or, more strictly speaking, did something else.

The world around us is a chaotic, entropic place. But it is rare for us to see it that way.

In Living with Complexity, Donald A. Norman offers a perspective on why we need complexity:

We seek rich, satisfying lives, and richness goes along with complexity. Our favorite songs, stories, games, and books are rich, satisfying, and complex. We need complexity even while we crave simplicity… Some complexity is desirable. When things are too simple, they are also viewed as dull and uneventful. Psychologists have demonstrated that people prefer a middle level of complexity: too simple and we are bored, too complex and we are confused. Moreover, the ideal level of complexity is a moving target, because the more expert we become at any subject, the more complexity we prefer. This holds true whether the subject is music or art, detective stories or historical novels, hobbies or movies.

As an example, Norman asks readers to contemplate the complexity we attach to tea and coffee. Most people in most cultures drink tea or coffee each day. Both are simple beverages, made from water and coffee beans or tea leaves. Yet we choose to attach complex rituals to them. Even those of us who would not consider ourselves to be connoisseurs have preferences. Offer to make coffee for a room full of people, and we can be sure that each person will want it made in a different way.

Coffee and tea start off as simple beans or leaves, which must be dried or roasted, ground and infused with water to produce the end result. In principle, it should be easy to make a cup of coffee or tea. Simply let the ground beans or tea leaves [steep] in hot water for a while, then separate the grounds and tea leaves from the brew and drink. But to the coffee or tea connoisseur, the quest for the perfect taste is long-standing. What beans? What tea leaves? What temperature water and for how long? And what is the proper ratio of water to leaves or coffee?

The quest for the perfect coffee or tea maker has been around as long as the drinks themselves. Tea ceremonies are particularly complex, sometimes requiring years of study to master the intricacies. For both tea and coffee, there has been a continuing battle between those who seek convenience and those who seek perfection.

Complexity, in this way, can enhance our enjoyment of a cup of tea or coffee. It’s one thing to throw some instant coffee in hot water. It’s different to select the perfect beans, grind them ourselves, calculate how much water is required, and use a fancy device. The question of whether this ritual makes the coffee taste better or not is irrelevant. The point is the elaborate surrounding ritual. Once again, we see complexity as superior.

“Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it. And to make matters worse: complexity sells better.”

— Edsger W. Dijkstra

The Problem with Complexity

Imagine a person who sits down one day and plans an elaborate morning routine. Motivated by the routines of famous writers they have read about, they lay out their ideal morning. They decide they will wake up at 5 a.m., meditate for 15 minutes, drink a liter of lemon water while writing in a journal, read 50 pages, and then prepare coffee before planning the rest of their day.

The next day, they launch into this complex routine. They try to keep at it for a while. Maybe they succeed at first, but entropy soon sets in and the routine gets derailed. Sometimes they wake up late and do not have time to read. Their perceived ideal routine has many different moving parts. Their actual behavior ends up being different each day, depending on random factors.

Now imagine that this person is actually a famous writer. A film crew asks to follow them around on a “typical day.” On the day of filming, they get up at 7 a.m., write some ideas, make coffee, cook eggs, read a few news articles, and so on. This is not really a routine; it is just a chaotic morning based on reactive behavior. When the film is posted online, people look at the morning and imagine they are seeing a well-planned routine rather than the randomness of life.

This hypothetical scenario illustrates the issue with complexity: it is unsustainable without effort.

The more individual constituent parts a system has, the greater the chance of its breaking down. Charlie Munger once said that “Where you have complexity, by nature you can have fraud and mistakes.” Any complex system — be it a morning routine, a business, or a military campaign — is difficult to manage. Addressing one of the constituent parts inevitably affects another (see the Butterfly Effect). Unintended and unexpected consequences are likely to occur.

As Daniel Kahneman and Amos Tversky wrote in 1974 (in Judgment Under Uncertainty: Heuristics and Biases): “A complex system, such as a nuclear reactor or the human body, will malfunction if any of its essential components fails. Even when the likelihood of failure in each component is slight, the probability of an overall failure can be high if many components are involved.”
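
Kahneman and Tversky's point is easy to quantify. If each of n components fails independently with probability p, the system survives only when every one of them works:

```python
# P(overall failure) = 1 - (1 - p)^n for n independent components.
p = 0.01   # each component fails 1% of the time

for n in [1, 10, 100, 500]:
    print(f"{n:>3} components: {1 - (1 - p) ** n:.1%} chance of overall failure")
# 1 -> 1.0%, 10 -> 9.6%, 100 -> 63.4%, 500 -> 99.3%
```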

This is why complexity is less common than we think. It is unsustainable without constant maintenance, self-organization, or adaptation. Chaos tends to disguise itself as complexity.

“Human beings are pattern-seeking animals. It's part of our DNA. That's why conspiracy theories and gods are so popular: we always look for the wider, bigger explanations for things.”

— Adrian McKinty, The Cold Cold Ground

Complexity Bias and Conspiracy Theories

A musician walks barefoot across a zebra-crossing on an album cover. People decide he died in a car crash and was replaced by a lookalike. A politician’s eyes look a bit odd in a blurry photograph. People conclude that he is a blood-sucking reptilian alien taking on a human form. A photograph shows an indistinct shape beneath the water of a Scottish lake. The area floods with tourists hoping to glimpse a surviving prehistoric creature. A new technology overwhelms people. So, they deduce that it is the product of a government mind-control program.

Conspiracy theories are the ultimate symptom of our desire to find complexity in the world. We don’t want to acknowledge that the world is entropic. Disasters happen and chaos is our natural state. The idea that hidden forces animate our lives is an appealing one. It seems rational. But as we know, we are all much less rational and logical than we think. Studies have shown that a high percentage of people believe in some sort of conspiracy. It’s not a fringe concept. According to research by Joseph E. Uscinski and Joseph M. Parent, about one-third of Americans believe the notion that Barack Obama’s birth certificate is fake. Similar numbers are convinced that 9/11 was an inside job orchestrated by George Bush. Beliefs such as these are present in all types of people, regardless of class, age, gender, race, socioeconomic status, occupation, or education level.

Conspiracy theories are invariably far more complex than reality. Although education does reduce the chances of someone’s believing in conspiracy theories, one in five Americans with postgraduate degrees still hold conspiratorial beliefs.

Uscinski and Parent found that, just as uncertainty led Skinner’s pigeons to see complexity where only randomness existed, a sense of losing control over the world around us increases the likelihood of our believing in conspiracy theories. Faced with natural disasters and political or economic instability, we are more likely to concoct elaborate explanations. In the face of horrific but chaotic events such as Hurricane Katrina, or the recent Grenfell Tower fire, many people decide that secret institutions are to blame.

Take the example of the “Paul McCartney is dead” conspiracy theory. Since the 1960s, a substantial number of people have believed that McCartney died in a car crash and was replaced by a lookalike, usually said to be a Scottish man named William Campbell. Of course, conspiracy theorists declare, The Beatles wanted their most loyal fans to know this, so they hid clues in songs and on album covers.

The beliefs surrounding the Abbey Road album are particularly illustrative of the desire to spot complexity in randomness and chaos. A police car is parked in the background — an homage to the officers who helped cover up the crash. A car’s license plate reads “LMW 28IF” — naturally, a reference to McCartney being 28 if he had lived (although he was 27) and to Linda McCartney (whom he had not met yet). Matters were further complicated once The Beatles heard about the theory and began to intentionally plant “clues” in their music. The song “I’m So Tired” does in fact feature backwards mumbling about McCartney’s supposed death. The 1960s were certainly a turbulent time, so is it any wonder that scores of people pored over album art or played records backwards, looking for evidence of a complex hidden conspiracy?

As Henry Louis Gates Jr. wrote, “Conspiracy theories are an irresistible labor-saving device in the face of complexity.”

Complexity Bias and Language

We have all, at some point, had a conversation with someone who speaks like philosopher Theodor Adorno wrote: using incessant jargon and technical terms even when simpler synonyms exist and would be perfectly appropriate. We have all heard people say things which we do not understand, but which we do not question for fear of sounding stupid.

Jargon is an example of how complexity bias affects our communication and language usage. When we use jargon, especially out of context, we are putting up unnecessary semantic barriers that reduce the chances of someone’s challenging or refuting us.

In an article for The Guardian, James Gingell describes his work translating scientific jargon into plain, understandable English:

It’s quite simple really. The first step is getting rid of the technical language. Whenever I start work on refining a rough-hewn chunk of raw science into something more pleasant I use David Dobbs’ (rather violent) aphorism as a guiding principle: “Hunt down jargon like a mercenary possessed, and kill it.” I eviscerate acronyms and euthanise decrepit Latin and Greek. I expunge the esoteric. I trim and clip and pare and hack and burn until only the barest, most easily understood elements remain.

[…]

Jargon…can be useful for people as a shortcut to communicating complex concepts. But it’s intrinsically limited: it only works when all parties involved know the code. That may be an obvious point but it’s worth emphasising — to communicate an idea to a broad, non-specialist audience, it doesn’t matter how good you are at embroidering your prose with evocative imagery and clever analogies, the jargon simply must go.

Gingell writes that even the most intelligent scientists struggle to differentiate between thinking (and speaking and writing) like a scientist, and thinking like a person with minimal scientific knowledge.

Unnecessarily complex language is not just annoying. It's outright harmful. The use of jargon in areas such as politics and economics does real harm. People without the requisite knowledge to understand it feel alienated and removed from important conversations. It leads people to believe that they are not intelligent enough to understand politics, or not educated enough to comprehend economics. When a politician talks of fiscal charters or rolling four-quarter growth measurements in a public statement, they are sending a crystal clear message to large numbers of people whose lives will be shaped by their decisions: this is not about you.

Complexity bias is a serious issue in politics. For those in the public eye, complex language can be a means of minimizing the criticism of their actions. After all, it is hard to dispute something you don't really understand. Gingell considers jargon to be a threat to democracy:

If we can’t fully comprehend the decisions that are made for us and about us by the government, then how can we possibly revolt or react in an effective way? Yes, we have a responsibility to educate ourselves more on the big issues, but I also think it’s important that politicians and journalists meet us halfway.

[…]

Economics and economic decisions are more important than ever now, too. So we should implore our journalists and politicians to write and speak to us plainly. Our democracy depends on it.

In his essay “Politics and the English Language,” George Orwell wrote:

In our time, political speech and writing are largely the defence of the indefensible. … Thus, political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness. Defenceless villages are bombarded from the air, the inhabitants driven out into the countryside, the cattle machine-gunned, the huts set on fire with incendiary bullets: this is called pacification. Millions of peasants are robbed of their farms and sent trudging along the roads with no more than they can carry: this is called transfer of population or rectification of frontiers. People are imprisoned for years without trial, or shot in the back of the neck or sent to die of scurvy in Arctic lumber camps: this is called elimination of unreliable elements.

An example of the problems with jargon is the Sokal affair. In 1996, Alan Sokal (a physics professor) submitted a fabricated scientific paper entitled “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity.” The paper had absolutely no relation to reality and argued that quantum gravity is a social and linguistic construct. Even so, the paper was published in a respected journal. Sokal’s paper consisted of convoluted, essentially meaningless claims, such as this paragraph:

Secondly, the postmodern sciences deconstruct and transcend the Cartesian metaphysical distinctions between humankind and Nature, observer and observed, Subject and Object. Already quantum mechanics, earlier in this century, shattered the ingenious Newtonian faith in an objective, pre-linguistic world of material objects “out there”; no longer could we ask, as Heisenberg put it, whether “particles exist in space and time objectively.”

(If you're wondering why no one called him out, or more specifically why we have a bias against calling out BS, check out pluralistic ignorance.)

Jargon does have its place. In specific contexts, it is absolutely vital. But in everyday communication, its use is a sign that we wish to appear complex and therefore more intelligent. Great thinkers throughout the ages have stressed the crucial importance of using simple language to convey complex ideas. Many of the ancient thinkers whose work we still reference today — people like Plato, Marcus Aurelius, Seneca, and Buddha — were known for their straightforward communication and their ability to convey great wisdom in a few words.

“Any intelligent fool can make things bigger, more complex, and more violent. It takes a touch of genius — and a lot of courage — to move in the opposite direction.”

— Ernst F. Schumacher

How Can We Overcome Complexity Bias?

The most effective tool we have for overcoming complexity bias is Occam’s razor. Also known as the principle of parsimony, this is a problem-solving principle used to eliminate improbable options in a given situation. Occam’s razor suggests that the simplest solution or explanation is usually the correct one. When we don’t have enough empirical evidence to disprove a hypothesis, we should avoid making unfounded assumptions or adding unnecessary complexity so we can make quick decisions or establish truths.

An important point to note is that Occam’s razor does not state that the simplest hypothesis is the correct one, but states rather that it is the best option before the establishment of empirical evidence. It is also useful in situations where empirical data is difficult or impossible to collect. While complexity bias leads us towards intricate explanations and concepts, Occam’s razor can help us to trim away assumptions and look for foundational concepts.

Returning to Skinner’s pigeons, had they known of Occam’s razor, they would have realized that there were two main possibilities:

  • Their behavior affects the food delivery.

Or:

  • Their behavior is irrelevant because the food delivery is random or on a timed schedule.

Using Occam’s razor, the head-bobbing, circles-turning pigeons would have realized that the first hypothesis involves numerous assumptions, including:

  • There is a particular behavior they must enact to receive food.
  • The delivery mechanism can somehow sense when they enact this behavior.
  • The required behavior is different from behaviors that would normally give them access to food.
  • The delivery mechanism is consistent.

And so on. Occam’s razor would dictate that because the second hypothesis is the simplest, involving the fewest assumptions, it is most likely the correct one.

So many geniuses are really good at eliminating unnecessary complexity. Einstein, for instance, was a master at sifting the essential from the non-essential. Steve Jobs was the same.

[Episode 27] The Art of Letting Other People Have Your Way: Negotiating Secrets from Chris Voss

Whether you’re buying a car, requesting a raise at work, or just deciding where to eat out with your spouse or partner, your negotiating skills will determine how pleased you are with the outcome.

Today, we have the special opportunity to learn some of the most effective tactics and strategies from a true master, Chris Voss.

Chris is the former lead international kidnapping negotiator for the FBI and author of the excellent book, Never Split the Difference: Negotiating As Though Your Life Depended On It.

In this fascinating conversation, Chris shares how you can use the same techniques that have been field tested in some of the most high-stakes, pressure cooker situations, in your daily life.

If you want to become a better haggler, a better communicator, or a better listener, don’t miss this episode. It’s packed with actionable insights you can start using today to be more persuasive and grab hold of more of what you want in life.

Here are just a few things we cover:

  • What it really takes to be great at negotiating (most people approach it all wrong)
  • How to keep your emotions in check in a negotiation
  • The three different voices you use to connect with your counterpart and put them at ease
  • How many of us “take ourselves hostage” in a negotiation and ruin it before it starts
  • The biggest time-waster (and profit-killer) that plagues so many negotiations
  • The main problems with traditional negotiation techniques (BATNA etc) and how they’re leaving lots on the table
  • The “negotiation one-sheet” Chris uses before entering into any negotiation (and how you can use it too)
  • How to use an “accusations audit” when you’re structuring winning deals (this is brilliant)
  • One technique to get your counterpart to spill their guts when they’re trying to be tight-lipped
  • “Prospect theory” and how to use it to your advantage
  • Maximizing employee satisfaction in the hiring process so you get the best talent…and keep them!
  • How empathy saves time and makes you more likely to get what you want in a negotiation
  • The power of deference (and when to use it)
  • Chris’s go-to tools that work best on all personality types, in nearly any situation
  • How intentionally getting the other party to say “no” substantially increases the success rate of a negotiation

And much more.

Transcript

A transcript is available to members of our learning community or for purchase separately ($9).

More Episodes

A complete list of all of our podcast episodes.

***


Intuition vs. Rationality: Where One Stops the Other Starts

Here's an interesting passage from Anne Lamott, found in Bird by Bird: Some Instructions on Writing and Life, that requires some consideration.

You get your intuition back when you make space for it, when you stop the chattering of the rational mind. The rational mind doesn't nourish you. You assume that it gives you the truth, because the rational mind is the golden calf that this culture worships, but this is not true. Rationality squeezes out much that is rich and juicy and fascinating.

The great French mathematician Henri Poincaré added to our understanding of the roles that rationality and intuition play in discovery: “It is through science that we prove, but through intuition that we discover.”

Furthering our understanding, I ran across a quote by Steve Jobs on the same topic: “Intuition is a very powerful thing, more powerful than intellect, in my opinion.” The source of that quote is Walter Isaacson's biography of Jobs:

The people in the Indian countryside don’t use their intellect like we do, they use their intuition instead, and the intuition is far more developed than in the rest of the world… Intuition is a very powerful thing, more powerful than intellect, in my opinion. That’s had a big impact on my work.

Western rational thought is not an innate human characteristic, it is learned and it is the great achievement of Western civilization. In the villages of India, they never learned it. They learned something else, which is in some ways just as valuable but in other ways is not. That’s the power of intuition and experiential wisdom.

It's not really acceptable to admit, but most of the time we make our decisions on intuition and rationalize them after the fact by cherry-picking. (If you want to see what it looks like to make rational decisions and catalogue your data, try using our decision journal for a month.) Intuition can be thought of as subconscious pattern matching, honed over weeks, years, and decades. The more we operate within our circle of competence, the more likely our intuition is to prove correct.

The point isn't choosing between cold rationality and intuition but rather understanding that each serves a purpose. If we let it, intuition can be an able guide, but we must check it when the consequences of being wrong are high.

The Best of Farnam Street 2017

Here's a look at the most popular articles we wrote this year, including what really separates amateurs and professionals, a system for remembering what you read, the powerful interviews of Naval Ravikant, Ray Dalio, and Rory Sutherland, how we abuse time, and so much more.

***

1. The Difference Between Amateurs and Professionals — There are a host of other differences, but they can effectively be boiled down to two things: fear and reality. Amateurs believe that the world should work the way they want it to. Professionals realize that they have to work with the world as they find it. Amateurs are scared — scared to be vulnerable and honest with themselves. Professionals feel like they are capable of handling almost anything.

2. How to Remember What You Read — Why is it that some people seem to be able to read a book once and remember every detail of it for life, while others struggle to recall even the title a few days after putting down a book? The answer is simple but not easy. It's not what they read. It's how they read.

3. Naval Ravikant on Reading, Happiness, Systems for Decision Making, Habits, Honesty and More — In this wide-ranging interview, we talk about reading, habits, decision-making, mental models, and life. Just a heads up, this is the longest podcast I’ve ever done. While it felt like only thirty minutes, our conversation lasted over two hours!

4. The Difference Between Open-Minded and Closed-Minded People — The rate at which you learn and progress in the world depends on how willing you are to weigh the merit of new ideas, even if you don’t instinctively like them. Perhaps especially if you don’t like them. What’s more, placing your trust and effort in the right mentor can propel you forward, just as placing it in the wrong person can send you back to the starting point.

5. Maker vs. Manager: How Your Schedule Can Make or Break You – If you’re a maker on a manager’s schedule or a manager on a maker’s schedule, you could be spinning your wheels. Find out the ideal way to schedule your day for maximum results.

6. Charlie Munger on Getting Rich, Wisdom, Focus, Fake Knowledge and More — While we can't have his genetics, we can try to steal his approach to rationality. There's almost no limit to the amount one could learn from studying the Munger mind, so let's at least start with a rundown of some of his best ideas.

7. Habits vs. Goals: A Look at the Benefits of a Systematic Approach to Life — The power of habits comes from their automaticity. This is why they are more powerful than goals. Read this article to harness the power of habits.

8. The Code of Hammurabi: The Best Rule To Manage Risk — King Hammurabi of Babylon created Hammurabi’s Code. The laws were more effective at containing risk than today's laws. Here's why they were effective.

9. Life Lessons from a Self-Made Billionaire: My Conversation with Ray Dalio — In this interview with billionaire investor and entrepreneur Ray Dalio, you’ll learn the principles Ray prescribes for making better decisions, fewer mistakes, and creating meaningful relationships with the people in your life.

10. 29 of the Most Gifted and Highly Recommended Books — It started with a simple question: What book (or books) have you given away to people the most and why? The email was sent to an interesting subset of people I’ve interacted with over the past year — CEOs, entrepreneurs, best-selling authors, hedge fund managers, and more.

11. The Butterfly Effect: Everything You Need to Know About This Powerful Mental Model — The Butterfly Effect shows that we cannot predict the future or control powerful complex systems. Read to learn more about this mental model.

12. The Wrong Side of Right — One big mistake I see people make over and over is focusing on proving themselves right, instead of focusing on achieving the best outcome. People who are working to prove themselves right will work hard finding evidence for why they’re right. They’ll go to the ends of the earth to disagree with someone who has another idea. Everything becomes about their being right. These otherwise well-intentioned people are making the same costly mistake that I did.

13. The Generalized Specialist: How Shakespeare, Da Vinci, and Kepler Excelled – Should we generalize or specialize? This article explores how Shakespeare and Da Vinci excelled by branching out from their core competencies.

14. Seneca on The Shortness of Time — If we see someone throwing money away, we call that person crazy. This bothers us, in part, because money has value. Wasting it seems nuts. And yet we see others—and ourselves—throw away something far more valuable every day: Time.

15. Rory Sutherland on The Psychology of Advertising, Complex Evolved Systems, Reading, Decision Making — In this wide-ranging interview with Rory Sutherland (the Vice Chairman of Ogilvy & Mather Group, which is one of the largest advertising companies in the world), we talk about: how advertising agencies are solving airport security problems, what Silicon Valley misses, how to mess with self-driving cars, reading habits, decision making, the intersection of advertising and psychology, and so much more.

16. How to Live on 24 Hours a Day: Arnold Bennett on Living a Meaningful Life Within the Constraints of Time — Despite having been published in 1910, Arnold Bennett’s book How to Live on 24 Hours a Day remains a valuable resource on living a meaningful life within the constraints of time. In the book, Bennett addresses one of our oldest questions: how can we make the best use of our lives? How can we make the best use of our time?

17. Thought Experiment: How Einstein Solved Difficult Problems — Read this and learn how the mental model of the thought experiment helped people like Albert Einstein, Zeno, and Galileo solve difficult problems.

Go back in time and see the best of 2016.

Farnam Street’s 2017 Annual Letter to Readers

Most public companies issue an annual letter to shareholders. These letters present an opportunity for the people entrusted to run the company on behalf of the shareholders to communicate with the people who own the company. In 2015, I started a similar tradition at Farnam Street.

To a large extent, I consider you the owners of Farnam Street, but you trust me with something far more valuable than money: time. For all of us, time is finite. Reading Farnam Street means you’re not doing something else. My job is to make sure your investment is getting an above-average return.

***

In almost every reader-related metric, 2017 was a record year.

Readership increased over 40%, which was decent. We surpassed 155,000 subscribers to our weekly newsletter, Brain Food. Tempering this excitement is the fact that email open rates dipped slightly. (We’ve reached the limits of our current mail provider and will be transitioning at some point, likely in Q1 or early Q2, to another provider. For readers, the transition should be seamless.)

Last year I wrote about how most sites try to hijack your animal responses by using outrage to acquire traffic. We’ve never set out to consciously “acquire” an audience. Our readers tell their friends, family members, and co-workers and we grow slowly. I’m okay with that. I write for the million “me’s” out there.

Visitors continued to spend more time on the site (a good proxy for how interested people are in the content). The bounce rate (a fancy phrase for the percentage of people who look at one page and then leave the site) continued to move in the right direction. In short, we had more people who read longer and looked at more pages. Next year we'll do a better job of mixing in some shorter articles.

***

Thanks to Grain and Mortar, we completed a significant revamp of the website that has made it cleaner and easier to navigate. Our looks are finally starting to catch up to our content, a happy difference from what happens with people over time.

***

Events

In 2017, we offered two public Re:Think Workshops (Innovation and Decision-Making). We continue to limit attendance at these events to ensure a good experience for everyone. We sold out the 2018 version of Re:Think Decision-Making in only two weeks. Join the waiting list to hear about events first.

One of the most surprising things about the events for attendees is that they get to meet people who are curious, kind, and intelligent. In short, people just like them. The quality of the individuals continues to impress me. In 2017, we had attendees from WordPress, Adobe, Amazon, Shopify, Risebar, Red Bull, Satori Capital, United Way, Convexity Capital, and more.

This summer, we also held our first Think Week event in Europe (Paris, to be specific), and it was a hit. We’ll be doing at least two of these events next year: one in the Bahamas (sorry, already sold out) in February and one in Europe in the summer. (These events vary in intensity, so read carefully before signing up. The event in the Bahamas will be low key, more of a “read all day and let’s meet at dinner to discuss interesting things” gathering, whereas the Europe one will be way more intimate and intense.) Because these events are so small, I’ve decided that people will have to apply to come. If you’re going to spend 30 hours with someone over three days, crammed into my apartment, you want to know that I’ve curated the audience.

The Knowledge Project

What a difference a year makes. Despite having the most irregular podcast in the world, we had over 1.5 million downloads this year. To put that in perspective, we released only 10 episodes. The quality of our guests is amazing, as you can tell from the interviews with Naval Ravikant, Rory Sutherland, Adam Grant, Ray Dalio, Susan Cain, Gary Taubes, and more.

We already have a terrific roster lined up for next year. I’m aiming to release a new episode once every three to four weeks instead of once every six to eight weeks. Given the intensity of the research that goes into The Knowledge Project guests, I have no idea how other shows release episodes with the frequency they do.

Tools and More

About four years ago, we started purchasing online courses to see what was going on. As students, we didn’t like what we saw: a lot of entertainment and not much change in outcomes. So we set to work and created three courses over the last three years, testing various ways to improve outcomes. Through those iterations, we’ve landed on a formula that consistently delivers results for people.

Productivity That Gets Results, our popular productivity seminar, has been closed (we’ve stopped selling it). However, members of our Learning Community will have access to it starting in January.

The Art of Reading — I dropped the ball on this one. My team and I worked really hard updating the content and adding new resources for our students…and then I didn’t tell anyone. I am a terrible marketer. This course will show you how to sift through information more quickly, squeeze the best ideas out of any book or article, and absorb what you read so you can access that knowledge for years to come.

The Art of Focus is being offered for sale until early January, and then I’m closing it for good. The course was a smashing success, but the accountability element takes considerable time. (Note: If you join the program before it closes, you'll still get all the feedback, support, and open access to the course that previous students enjoy. You're grandfathered in forever.)

I’ve started working on a new course. You’re going to love it.

The Learning Community

The quality of our members—from entrepreneurs and Fortune 500 CEOs to professional coaches, athletes, and bestselling authors—remains remarkable. We’ve really hit on something that delivers.

Last year I said that we would use some of the proceeds from the Learning Community to improve the quality of the regular, free content. To that end, we hired our first professional editor in July and the overall quality of our content has gone way up.

In 2017, I started a weekly email to members of the Learning Community. It’s a short bit of content that is more practical than that in our blog posts.

Rest assured, the majority of FS content will always be free. If you find value in Farnam Street, we hope you’ll consider joining the Learning Community. Now you can give a membership to your smart friends and loved ones.

Team Farnam

Perceptive readers know there is more to Farnam Street than me. Behind the scenes is an evolving team of employees and freelancers who make things happen.

This fall Jeff told me that he wanted to pursue another opportunity. Jeff joined Farnam in November of 2015, and his contributions have been many. He has never been one to stand up and brag about what he’s done, and he’s a great friend.

Sponsorships

I want to thank our main 2017 sponsor, Royce Funds. Other supporting sponsors included Greenhaven Road, Tiny, 2CV, Elysium Health, Ray Dalio and Principles, the Heath brothers, Adventur.es, and Syrus Partners.

Royce Funds, Greenhaven Road, Tiny, Syrus and Elysium will be back in 2018. We still have a few open slots for next year, so if you’d like to inquire about sponsoring the blog, please get in touch with me.

2017 Report Card

Last year I wrote, “In 2017, we will work to better synthesize, connect, and explain timeless ideas that help you make better decisions, avoid stupidity, and kick-ass at life. I’ll try to add more personal stories and anecdotes from my journey.”

I’m pleased but not satisfied with our results.

True to our tagline, we’re focused on mastering the best of what other people have already figured out. I always have a hard time with personal stories, though, because I never view the content as being about me. In fact, any of the ideas that you come across on the site that are useful are not mine. I have started opening up about my experiences a bit in the Learning Community and that has been well received.

I also told you that we’d find great guests for our podcast, The Knowledge Project, increase the value proposition for Learning Community members, and work on some books. Yes, books.

As mentioned earlier, we’ve found amazing guests for the podcast. I’m still the same me, but it turns out that once your audience reaches a certain size, the ratio of “yes” responses to “no” responses inverts. So to all of those people who don’t like the dedicated email about the podcast, remember that it helps us get better guests (and, because it’s sponsored, pays the bills).

Creating value for members of our Learning Community is a tricky proposition, summed up in the words of one member: “I get so much value from your free content that I joined the learning community as a means to support what you’re doing. I didn’t realize there was so much more practical value in the learning community. … What I value most is that you respect my time and don’t make anything longer than it needs to be.” In 2017, we started sending weekly emails that are more practical in nature to the Learning Community. We have many more things coming up for the LC in 2018.

As many of you know, we’re huge fans of mental models. The problem is that when I set out to read about mental models years ago, there wasn’t a good source of information in one place. Where could I find timeless ideas to help me learn, think, and decide?

Farnam Street has filled that void for many, but we’ve been inundated with requests to write a book about mental models. The first volume, internally dubbed Thinking Tools, will be released soon. Rather than being a version of the website, it’s a fresh start at intelligently preparing ourselves for the world. Whether readers are high school students or newly retired seniors, this well-designed book will hopefully have a place on their shelves for generations.

***

You Are What You Consume

The people you spend time with shape who you are. As Goethe said, “Tell me with whom you consort and I will tell you who you are.” But Goethe didn’t know about the internet. It’s not just the people you spend your time with in person who shape you; the people you spend time with online shape you as well.

Tell me what you read on a regular basis and I will tell you what you likely think. Creepy? Think again. Facebook already knows more about you than your partner does. They know the words that resonate with you. They know how to frame things to get you to click. And they know the thousands of people who look at the same things online that you do.

When you’re reading something online, you’re spending time with someone. These people determine our standards, our defaults, and often our happiness.

Every year, I make a point of reflecting on how I’ve been spending my time. I ask myself who I’m spending my time with and what I’m reading, online and offline. The thread of these questions comes back to a common core: Is where I’m spending my time consistent with who I want to be?

Am I reading things that challenge me and make me want to be a better person, or am I spending too much time on topical things that are meant to entertain me? If you read indiscriminately, you’re wasting vast amounts of time.

Am I spending my time with people who are consistent with who I want to be as a person? Are they constantly learning? Are they generous and kind? Are they challenging me and calling me out on my bullshit?

These are not easy choices. However, hard decisions about whom you hang around with and what information you consume change your vector and your velocity. Hard choices make for better decisions, more free time, and a better understanding of reality.

Think about dating. We seem to understand that happy people and unhappy people don’t generally get along. If you’re a happy and ambitious person and you go on a first date with someone who hates their job, complains about past partners, and generally wants to zone out of life, you instantly feel repelled by this person. You know, subconsciously, that this attitude is highly contagious and needs to be removed from your life before it spreads. The longer you’re in contact with people like this, the more likely you are to become like them.

What we don’t understand is that this principle applies to whom, or what, we spend our time with online as well. If you consume shallow content, then before you know it, you’ll have shallow opinions. If you’re not careful, the world will become black and white rather than various shades of grey.

Most of what we spend our time with online doesn’t make us better, but rather shouts at us and distracts us. And most of it is just bullshit click-bait anyway, with no more depth than a book summary on Amazon.

Consider:

  • The article on how to network like a boss offers advice on how to get ahead by thinking of people in terms of what they can do for you. After all, if they can’t do something immediate and gratifying for you, the next person is just a swipe away. Not only does this kind of behavior make you more likely to be selfish, but it also misses the point of networking altogether, which is to spend time with people who think better than you do and connect with them in meaningful ways.
  • The article on how to get a promotion shows you how to position yourself for favorable optics. Not only does this mean that you’re going to spend more of your time demonstrating how much value you deliver and less of your time delivering value, but it’s also going to make you less likely to get along with your co-workers.
  • The article on how to become more productive was written by someone who has no idea what your life is actually like. And it focuses on how to do email faster instead of on how to do less email, so you only end up getting better at moving widgets. And here’s the thing: when you’re better at moving widgets, your reward is to move more widgets. And if you’re moving more widgets, you never have time to do something better.

Think about it. If the person writing the article churns out an article a day on 200 subjects a year, how much are you really going to learn from them? You have to consider both the content you’re getting and the sources. Are the writers fluent in their subjects? Are they well read? How credible are they?

Not to mention, a vast swath of what we consume makes us miserable. So much of what we are surrounded by is fake happiness. We want people to think we’re happy when we’re not. The louder and more frequently someone says their partner is “just the most amazing person in the whole world,” the more I suspect relationship issues. When we only see other people having happiness — real or fake — our minds trick us into thinking that we’re the only ones who are struggling. So we hide it, and by hiding it, we become more isolated and alone.

Increasingly, the feeds we follow show us an endless array of people having a good time, traveling, partying it up, and more. Individually, our friends might be able to do this once a year, but when you follow a few hundred accounts, you’re virtually assured that on any given day, one of those people is doing something marvelous. This makes us feel like crap: Why can’t I keep the house clean, pick up the kids, and not feel rushed all the time? Why do they have so much free time? Why didn’t they invite me? I want to be there. Why are those people always happy? How did they get so successful? I work just as hard as they do. And so on. We are surrounded by unrealistically positive expectations, which just remind us of what we don’t have: free time, money, an obsessively healthy lifestyle, diamonds, and a soul mate.
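That “virtually assured” claim is easy to check with a little probability arithmetic. Here is a rough sketch, where the rate of one “marvelous” day per account per year is our illustrative assumption, not a measured figure: the chance that at least one account is shining today is simply the complement of nobody shining.

```python
# Back-of-envelope model of the feed effect described above.
# Illustrative assumption (ours): each account you follow has about
# `great_days` independent "marvelous" days per year.

def p_someone_is_shining(accounts: int, great_days: float = 1.0) -> float:
    """Probability that at least one followed account is having a
    marvelous day today: the complement of nobody shining."""
    p_one_today = great_days / 365.0
    return 1.0 - (1.0 - p_one_today) ** accounts

print(f"{p_someone_is_shining(300):.0%}")     # ~56%: better than a coin flip
print(f"{p_someone_is_shining(300, 5):.0%}")  # ~98%: near certainty
```

Under those assumptions, following a few hundred accounts makes it more likely than not that someone in your feed is having a spectacular day on any given day, and near certain if each account has a handful of showcase days a year.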

Nothing looks the same again. We feel alone. It seems like other people are nothing but successful and we do nothing but struggle. As our misery increases, we hide our struggles more and just show others the good stuff. Only it’s not real; we’ve just become part of the crowd of people pretending there is no struggle.

Well, I hate to break it to you, but I’m human. I struggle. A lot. Here’s what you miss with the curated feeds: In the past year alone, I’ve been on my couch crying; I’ve been betrayed by a close friend; I’ve tried and failed to develop a relationship with my biological father; I’ve had days when I think it would be easier to win an Olympic gold medal than to get my kids to school without losing my patience; I’ve been so exhausted that I can barely keep my eyes open; I’ve looked at a sink full of dishes and said “not tonight”; and there is so much more. The point is, you might see the results, but you don’t see the struggle. And when you see only the best in others, without seeing the reality of others, you are nudged toward thinking less of yourself.

So if your Facebook feed is full of happy people doing things that make you feel bad about your life, either change the feed or be conscious of the fact that everyone struggles from time to time but not everyone lets you see it. Be aware of how what you’re seeing affects you. And remember that the people you allow into your life, both in person and online, are the people you will end up becoming.

Curate carefully. Choose people who add value to your life and meaning to your relationships. And stop giving a damn about what other people think.

***

As I wrote last year, “Velocity is a vector-dependent concept. Moving in two directions that are not 100% aligned creates drag.”

While I still said yes to too many things in 2017, I’m getting better at saying no. The best thing I did this year was to switch my default to “no” for all meetings. If I can’t say no, I schedule the meeting for the afternoon. If I have to do it in person, I make it close to my office.

What helps me say no to meetings? Some simple tests: Am I willing to have this meeting right now? Would I rearrange my calendar for this meeting? If I’m not willing to sacrifice something, even something small, for the meeting, maybe it’s not worth having.

Thank you for your continued time and trust in me and Farnam Street. I will continue to try to earn it.

—Shane

(See what a difference a year makes. You can find the 2016 letter here.)