Author: Farnam Street

[Episode #30] Company Culture, Collaboration and Competition: A Discussion With Margaret Heffernan

Today, I’m joined by speaker, international executive, and five-time author Margaret Heffernan. We discuss how to get the most out of our people, how to create a thriving culture of trust and collaboration, and how to prevent potentially devastating “willful blindness.”


As the former CEO of five successful businesses, Margaret Heffernan has been on the front lines, observing the very human tendencies (selective blindness, conflict avoidance, and self-sabotage, to name a few) that cause managers and sometimes entire organizations to go astray.

She has since written five books and spoken all over the world, warning, educating, and instructing leaders not only to be aware of these tendencies but also to weed them out of our companies, our businesses, and even our relationships.

In this conversation, we discuss many of the concepts she shares in her books, namely:

  • How to tap into the collective knowledge of your organization so problems are solved quickly, efficiently, and cooperatively
  • The strange experiment Margaret ran to build “social capital” in one of her early businesses that transformed the way her employees treated and interacted with each other
  • How to build a culture that doesn’t create in-fighting and unhealthy competition within your organization, and how many companies today are missing the mark
  • One simple thing you can do as a leader to increase the buy-in, productivity, and overall satisfaction of your team members (and it takes less than 30 seconds to do)
  • The dangers of binary thinking and how Margaret stops herself from oversimplifying a situation
  • Why arguing may be one of the purest forms of collaboration — and how to do it correctly
  • How to identify the environment and context where you do your best work and how to best replicate it
  • How “willful blindness” has caused catastrophic disasters in business, professional and personal relationships, and what we can do to avoid being another statistic
  • The wonderful advice Margaret gave to her kids when it came to choosing a career path

And much more.

If you interact with other human beings in any capacity, you need to hear what Margaret has to say.

Take a listen and let me know what you think!



An edited copy of this transcript is available to members of our learning community or for purchase separately ($7).

If you liked this, check out all the episodes of The Knowledge Project.


Members can discuss this post on the Learning Community Forum.

Understanding Speed and Velocity: Saying “NO” to the Non-Essential

It's tempting to think that in order to be a valuable team player, you should say “yes” to every request and task that comes your way. People who say yes to everything have a lot of speed. They're always doing stuff but never getting anything done. Why? Because they don't think in terms of velocity. Understanding the difference between speed and velocity will change how you work.

I once worked for someone who offered me the opportunity to work on a new project nearly every day. These projects were not the quick ones, where you spend 15 minutes and crank out a solution. They were crap work. And there were strings: my boss wanted to be informed about everything, and there was no way I’d get credit for anything.

I remember my response: “That sounds amazing, but it’s not for me. I’m busy enough.”

Saying no to your boss, especially as often as I did, was thought to be risky to your career. I was the new kid, which is why I was getting all of these shit jobs thrown at me.

The diversity of skill sets needed to accomplish them would have made me look bad (perhaps the subtle point of this initiation). Furthermore, my already heavy workload would have gotten heavier with projects that didn’t move me forward. This was my first introduction to busywork.

My well-intentioned colleagues were surprised. “You’re not going to get anywhere with that attitude,” they’d pull me aside to tell me. The problem was that I wasn’t going to get anywhere by saying yes to a lot of jobs that consumed a lot of time, were not the reason I was hired, and left me no time to develop the craft of programming computers, which is what I wanted to do.

I had turned down a job offer for three times what I was being paid at this job because I wanted to work with the best people in the world on a very particular skill — a skill I couldn’t get anywhere but at an intelligence agency. Anything that got in the way of honing that craft was the enemy.

Over my first seven years, I barely left my desk, working 12- to 16-hour days, six days a week. Working that hard with incredible people was amazing and motivating. I’ve never learned so much in such a short period of time.

“The difference between successful people and very successful people is that very successful people say ‘no’ to almost everything.”

— Warren Buffett

Certainly, offers of work are good problems to have. A lot of people struggle to find work, and here I was, a few weeks out of university, saying no to my boss. But saying yes to everything is a quick road to mediocrity. I took a two-thirds pay cut to work for the government so I could work with incredibly smart people on a very narrow skill (think cyber). I was willing to go all in. So no, I wasn’t going to say yes to things that didn’t help me hone the craft I’d given up so much to work on.

“Instead of asking how many tasks you can tackle given your working hours,” writes Morten Hansen in Great at Work, “ask how many you can ditch given what you must do to excel.” I did what I needed to do to keep my job. As John Stuart Mill said, “as few as you can, as many as you must.”

Doing more isn’t always moving you ahead. To see why, let’s go back to first-year physics.

The Difference Between Speed and Velocity

Velocity and speed are different things. Speed is the distance traveled over time. I can run around in circles with a lot of speed and cover several miles that way, but I’m not getting anywhere. Velocity is displacement over time. It’s direction-aware.

Think of it this way: I want to get from New York to L.A. Speed is flying circles around Manhattan, and velocity is hopping on a direct flight from JFK to LAX.
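
To make the distinction concrete, here is a minimal sketch in Python (the mileage figures are invented for illustration):

    def speed(distance_miles, hours):
        # Speed: total distance covered per unit time; direction is ignored.
        return distance_miles / hours

    def velocity(displacement_miles, hours):
        # Velocity: net displacement per unit time; direction-aware.
        return displacement_miles / hours

    # Five hours flying circles around Manhattan: lots of distance, no displacement.
    print(speed(2500, 5))     # 500 mph of speed...
    print(velocity(0, 5))     # ...but 0 mph of velocity

    # Five hours on a direct JFK-to-LAX flight (~2475 miles of displacement).
    print(velocity(2475, 5))  # ~495 mph of velocity: actually getting somewhere

The busywork trap is the first case: impressive speed, zero displacement.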

“People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas that there are. You have to pick carefully. I’m actually as proud of the things we haven’t done as the things I have done. Innovation is saying ‘no’ to 1,000 things.”

— Steve Jobs

When you’re at work, you need to know what you need to do to keep your job. You need to know the table stakes. Then you need to distinguish between tasks that offer a lot of speed and those that offer velocity.

Here are three ways you can increase your velocity:

  1. To the extent possible, ruthlessly shave away the unnecessary tasks, priorities, meetings, and BS. Put all your effort into the projects that really matter.
  2. Don’t rely on your willpower to say no; instead, create systems that help you fend off distractions. I have two friends who were about the same weight several years ago. Around that time, one of them was diagnosed with celiac disease. He immediately started to lose weight after changing his diet. Upon seeing this, my other friend decided that he, too, would go on a diet to lose weight. Because they both ate out a lot, they were both frequently in situations where they would have to make healthy choices. The friend with celiac disease developed “automatic behavior”; he had to avoid gluten if he wanted to stay healthy and pain-free. The other friend, however, had to keep making positive choices and ended up slipping after a few weeks, reverting to his previous eating habits. Another example: one of my management principles was “no meeting mornings.” This rule allowed the team to work, uninterrupted, on the most important things. Of course, there were exceptions to this rule, but the default was that each day you had a three-hour chunk of time, when you were at your best, to really move the needle.
  3. And finally, do as I did, and say “no” to your boss. The best way I found to frame this reply was actually the same technique that negotiation expert Chris Voss mentioned in a recent podcast episode: simply ask, “How am I supposed to do that?” given all the other stuff on your plate. Explain that saying no means that you’re going to be better at the tasks that are most important to your job, and tie those tasks to your boss’s performance.

Members can discuss this post on the Learning Community Forum.


1. The difference between speed and velocity first came to me from Peter Kaufman.

Half Life: The Decay of Knowledge and What to Do About It

Understanding the concept of a half-life will change what you read and how you invest your time. It will explain why our careers are increasingly specialized and offer a look into how we can compete more effectively in a very crowded world.

The Basics

A half-life is the time taken for something to halve in quantity. The term is most often used in the context of radioactive decay, which occurs when unstable atomic nuclei lose energy by emitting radiation. Twenty-nine elements are known to be capable of undergoing this process. Information also has a half-life, as do drugs, marketing campaigns, and all sorts of other things. We see the concept in any area where the quantity or strength of something decreases over time.

Radioactive decay is random, and measured half-lives are based on the most probable rate. We know that a given nucleus will decay at some point; we just cannot predict when. It could happen anywhere between the next instant and the total age of the universe. So although scientists have measured precise half-lives for different elements, the moment at which any individual nucleus decays is completely random.
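
A toy simulation shows how individually random decay becomes collectively predictable (a sketch, not a physics model; the seven-day half-life is hypothetical):

    import random

    def atoms_remaining(n_atoms, half_life_days, elapsed_days):
        # Each atom independently survives one half-life with probability 1/2,
        # so its survival probability over elapsed_days is 0.5 ** (elapsed / half-life).
        p_survive = 0.5 ** (elapsed_days / half_life_days)
        return sum(1 for _ in range(n_atoms) if random.random() < p_survive)

    # Any single atom's fate is unpredictable, but across a million atoms
    # roughly half remain after one half-life.
    print(atoms_remaining(1_000_000, half_life_days=7, elapsed_days=7))  # ~500,000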

Half-lives vary tremendously, from fractions of a second to billions of years, and different isotopes of the same element can have very different half-lives. Carbon-14, used in radiocarbon dating, has a half-life of about 5,730 years, while the far more common isotope carbon-12 is stable and does not decay at all; that stability is part of why carbon is a dependable building block of living organisms.

Three main types of nuclear decay have been identified: alpha, beta, and gamma. Alpha decay occurs when a nucleus splits into two parts: a helium nucleus and the remainder of the original nucleus. Beta decay occurs when a neutron in the nucleus of an element changes into a proton, emitting an electron along with an antineutrino (a particle with virtually no mass). The result is that the atom turns into a different element, such as when potassium decays into calcium. If a nucleus emits radiation without experiencing a change in its composition, it is subject to gamma decay. Gamma radiation contains an enormous amount of energy.

The Discovery of Half-Lives

The discovery of half-lives (and alpha and beta radiation) is credited to Ernest Rutherford, one of the most influential physicists of his time. Rutherford was at the forefront of this major discovery when he worked with physicist Joseph John Thomson on complementary experiments leading to the discovery of electrons. Rutherford recognized the potential of what he was observing and began researching radioactivity. Two years later, he identified the distinction between alpha and beta rays. This led to his discovery of half-lives, when he noticed that samples of radioactive materials took the same amount of time to decay by half. By 1902, Rutherford and his collaborators had a coherent theory of radioactive decay (which they called “atomic disintegration”). They demonstrated that radioactive decay enabled one element to turn into another — research which would earn Rutherford a Nobel Prize. A year later, he spotted the missing piece in the work of the chemist Paul Villard and named the third type of radiation gamma.

Half-lives are based on probabilistic thinking. If the half-life of an element is seven days, it is most probable that half of the atoms will have decayed in that time. For a large number of atoms, we can expect half-lives to be fairly consistent. It’s important to note that radioactive decay is based on the element itself, not the quantity of it. By contrast, in other situations, the half-life may vary depending on the amount of material. For example, the half-life of a chemical someone ingests might depend on the quantity.

In biology, a half-life is the time taken for a substance to lose half its effects. The most obvious instance is drugs; the half-life is the time it takes for their effect to halve, or for half of the substance to leave the body. The half-life of caffeine is around 6 hours, but (as with most biological half-lives) numerous factors can alter that number. People with compromised liver function or certain genes will take longer to metabolize caffeine. Consumption of grapefruit juice has been shown in some studies to slow caffeine metabolism. It takes around 24 hours (four half-lives, leaving only about 6 percent of the dose) for caffeine to almost entirely leave the body.
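
Plugging the six-hour figure into the standard decay formula shows what “leaving the body” looks like in practice (a sketch; the 200 mg dose is a made-up example, and individual metabolism varies widely):

    def fraction_remaining(hours_elapsed, half_life_hours=6.0):
        # Fraction of the original dose still in the body after hours_elapsed,
        # assuming simple exponential decay.
        return 0.5 ** (hours_elapsed / half_life_hours)

    dose_mg = 200  # hypothetical strong cup of coffee
    for hours in (6, 12, 24):
        print(hours, round(dose_mg * fraction_remaining(hours), 1), "mg left")
    # 6 -> 100.0 mg, 12 -> 50.0 mg, 24 -> 12.5 mg (about 6 percent remains)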

The half-lives of drugs vary from a few seconds to several weeks. To complicate matters, biological half-lives vary for different parts of the body. Lead has a half-life of around a month in the blood, but a decade in bone. Plutonium in bone has a half-life of a century — more than double the time for the liver.

Marketers refer to the half-life of a campaign — the time taken to receive half the total responses. Unsurprisingly, this time varies among media. A paper catalog may have a half-life of about three weeks, whereas a tweet might have a half-life of a few minutes. Calculating this time is important for establishing how frequently a message should be sent.

“Every day that we read the news we have the possibility of being confronted with a fact about our world that is wildly different from what we thought we knew.”

— Samuel Arbesman

The Half-Life of Facts

In The Half-Life of Facts: Why Everything We Know Has an Expiration Date, Samuel Arbesman (see our Knowledge Project interview) posits that facts decay over time until they are no longer facts or perhaps no longer complete. According to Arbesman, information has a predictable half-life: the time taken for half of it to be replaced or disproved. Over time, one group of facts replaces another. As our tools and knowledge become more advanced, we can discover more — sometimes new things that contradict what we thought we knew, sometimes nuances about old things. Sometimes we discover a whole area that we didn’t know about.

The rate of these discoveries varies. Our body of engineering knowledge changes more slowly, for example, than does our body of psychological knowledge.

Arbesman works in scientometrics: the science of measuring and analyzing science, and hence the study of how facts are made and unmade. The field was born in 1947, when mathematician Derek J. de Solla Price was arranging his complete set of the Philosophical Transactions of the Royal Society on his shelf. Price noted something surprising: the heights of the stacked volumes fit an exponential curve. His curiosity piqued, he began to see whether the same curve applied to science as a whole. Price established that the quantity of scientific data available was doubling every 15 years. This meant that some of the information had to be rendered obsolete with time.

Scientometrics shows us that facts are always changing, and much of what we know is (or soon will be) incorrect. Indeed, much of the available published research, however often it is cited, has never been reproduced and cannot simply be taken as true. In a controversial paper entitled “Why Most Published Research Findings Are False,” John Ioannidis covers the rampant nature of poor science. Many researchers are incentivized to find results that will please those giving them funding. Intense competition makes it essential to find new information, even if it is found in a dubious manner. Yet we all have a tendency to turn a blind eye when beliefs we hold dear are disproved and to pay attention only to information confirming our existing opinions.

As an example, Arbesman points to the number of chromosomes in a human cell. Up until 1965, 48 was the accepted number that medical students were taught. (In 1953, it had been declared an established fact by a leading cytologist). Yet in 1956, two researchers, Joe Hin Tjio and Albert Levan, made a bold assertion. They declared the true number to be 46. During their research, Tjio and Levan could never find the number of chromosomes they expected. Discussing the problem with their peers, they discovered they were not alone. Plenty of other researchers found themselves two chromosomes short of the expected 48. Many researchers even abandoned their work because of this perceived error. But Tjio and Levan were right (for now, anyway). Although an extra two chromosomes seems like a minor mistake, we don’t know the opportunity costs of the time researchers invested in faulty hypotheses or the value of the work that was abandoned. It was an emperor’s-new-clothes situation, and anyone counting 46 chromosomes assumed they were the ones making the error.

As Arbesman puts it, facts change incessantly. Many of us have seen the ironic (in hindsight) doctor-endorsed cigarette ads from the past. A glance at a newspaper will doubtless reveal that meat or butter or sugar has gone from deadly to saintly, or vice versa. We forget that laughable, erroneous beliefs people once held are not necessarily any different from those we now hold. The people who believed that the earth was the center of the universe, or that some animals appeared out of nowhere or that the earth was flat, were not stupid. They just believed facts that have since decayed. Arbesman gives the example of a dermatology test that had the same question two years running, with a different answer each time. This is unsurprising considering the speed at which our world is changing.

As Arbesman points out, in the last century the world’s population has swelled from 2 billion to 7 billion, we have taken on space travel, and we have altered the very definition of science.

Our world seems to be in constant flux. With our knowledge changing all the time, even the most informed people can barely keep up. All this change may seem random and overwhelming (Dinosaurs have feathers? When did that happen?), but it turns out there is actually order within the shifting noise. This order is regular and systematic and is one that can be described by science and mathematics.

The order Arbesman describes mimics the decay of radioactive elements. Whenever new information is discovered, we can be sure it will break down and be proved wrong at some point. As with a radioactive atom, we don’t know precisely when that will happen, but we know it will occur at some point.

If we zoom out and look at a particular body of knowledge, the random decay becomes orderly. Through probabilistic thinking, we can predict the half-life of a group of facts with the same certainty with which we can predict the half-life of a radioactive atom. The problem is that we rarely consider the half-life of information. Many people assume that whatever they learned in school remains true years or decades later. Medical students who learned in university that human cells have 48 chromosomes would not learn later in life that this is wrong unless they made an effort to do so.

OK, so we know that our knowledge will decay. What do we do with this information? Arbesman says,

… simply knowing that knowledge changes like this isn’t enough. We would end up going a little crazy as we frantically tried to keep up with the ever changing facts around us, forever living on some sort of informational treadmill. But it doesn’t have to be this way because there are patterns. Facts change in regular and mathematically understandable ways. And only by knowing the pattern of our knowledge evolution can we be better prepared for its change.

Recent initiatives have sought to calculate the half-life of an academic paper. Ironically, academic journals have largely neglected research into how people use them and how best to fund the efforts of researchers. Research by Philip Davis measures the time taken for a paper to receive half of its total downloads. Davis’s results are compelling. While most forms of media have a half-life measured in days or even hours, 97 percent of academic papers have a half-life longer than a year. Engineering papers are a partial exception, with double the average share (6 percent rather than 3 percent) having a half-life of under a year; this makes sense considering what we looked at earlier in this post. Health and medical publications have the shortest overall half-life: two to three years. Physics, mathematics, and humanities publications have the longest half-lives: two to four years.
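
The measure itself is straightforward to compute if you have a paper’s download history. Here is a sketch (the download curve is invented; real data would come from a journal’s logs):

    def paper_half_life_months(monthly_downloads):
        # The month (1-indexed) by which the paper has received half
        # of its total downloads -- the measure described above.
        total = sum(monthly_downloads)
        running = 0
        for month, downloads in enumerate(monthly_downloads, start=1):
            running += downloads
            if running * 2 >= total:
                return month
        raise ValueError("empty download history")

    # A hypothetical slowly decaying download curve over 30 months:
    downloads = [100 - 2 * m for m in range(30)]
    print(paper_half_life_months(downloads))  # 12, i.e., a half-life of about a year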

The Half-Life of Secrets

According to Peter Swire, writing in “The Declining Half-Life of Secrets,” the half-life of secrets (by which Swire generally means classified information) is shrinking. In the past, a government secret could be kept for over 25 years. Nowadays, hacks and leaks have shrunk that time considerably. Swire writes:

During the Cold War, the United States developed the basic classification system that exists today. Under Executive Order 13526, an executive agency must declassify its documents after 25 years unless an exception applies, with stricter rules if documents stay classified for 50 years or longer. These time frames are significant, showing a basic mind-set of keeping secrets for a time measured in decades.

Swire notes that there are three main causes: “the continuing effects of Moore’s Law — or the idea that computing power doubles every two years, the sociology of information technologists, and the different source and methods for signals intelligence today compared with the Cold War.” One factor is that spreading leaked information is easier than ever. In the past, it was often difficult to get information published. Newspapers feared legal repercussions if they shared classified information. Anyone can now release secret information, often anonymously, as with WikiLeaks. Governments cannot as easily rely on media gatekeepers to cover up leaks.

Rapid changes in technology or geopolitics often reduce the value of classified information, so the value of some, but not all, classified information also has a half-life. Sometimes it’s days or weeks, and sometimes it’s years. For some secrets, it’s not worth investing the massive amount of computer time that would be needed to break them because by the time you crack the code, the information you wanted to know might have expired.

(As an aside, if you were to invert the problem of all these credit card and SSN leaks, you might conclude that reducing the value of possessing this information would be more effective than spending money to secure it.)

“Our policy (at Facebook) is literally to hire as many talented engineers as we can find. The whole limit in the system is that there are not enough people who are trained and have these skills today.”

— Mark Zuckerberg

The Half-Lives of Careers and Business Models

The issue with information having a half-life should be obvious. Many fields depend on individuals with specialized knowledge, learned through study or experience or both. But what if those individuals are failing to keep up with changes and clinging to outdated facts? What if your doctor is offering advice that has been rendered obsolete since they finished medical school? What if your own degree or qualifications are actually useless? These are real problems, and knowing about half-lives will help you make yourself more adaptable.

While figures for the half-lives of most knowledge-based careers are hard to find, we do know the half-life of an engineering career. A century ago, it would take 35 years for half of what an engineer learned when earning their degree to be disproved or replaced. By the 1960s, that time span shrank to a mere decade. Today that figure is probably even lower.

In a 1966 paper entitled “The Dollars and Sense of Continuing Education,” Thomas Jones calculated the effort that would be required for an engineer to stay up to date, assuming a 10-year half-life. According to Jones, an engineer would need to devote at least five hours per week, 48 weeks a year, to stay up to date with new advancements. A typical degree requires about 4800 hours of work. Within 10 years, the information learned during 2400 of those hours would be obsolete. The five-hour figure does not include the time necessary to revise forgotten information that is still relevant. A 40-year career as an engineer would require 9600 hours of independent study.
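
Jones’s arithmetic is easy to verify from the figures above (a quick sketch of his stated assumptions):

    # Jones's assumptions: a 10-year half-life, a 4800-hour degree,
    # and five hours of study per week over a 48-week year.
    degree_hours = 4800
    obsolete_after_10_years = degree_hours / 2   # 2400 hours rendered obsolete
    yearly_upkeep = 5 * 48                       # 240 hours of study per year
    career_upkeep = yearly_upkeep * 40           # 9600 hours over a 40-year career
    print(obsolete_after_10_years, yearly_upkeep, career_upkeep)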

Keep in mind that Jones made his calculations in the 1960s. Modern estimates place the half-life of an engineering degree at between 2.5 and 5 years, requiring between 10 and 20 hours of study per week. Welcome to the treadmill, where you have to run faster and faster so that you don’t fall behind.

Unsurprisingly, putting in this kind of time is simply impossible for most people. The result is that the typical engineering career keeps getting shorter, and hiring skews toward recent graduates. A partial escape from this time-consuming treadmill is to recognize the continuous need for learning; once you accept that learning never stops, it becomes easier to devote time and attention to the heuristics and systems that foster it. The faster the pace of knowledge change, the more valuable the skill of learning becomes.

A study by PayScale found that the median age of workers in most successful technology companies is substantially lower than that of other industries. Of 32 companies, just six had a median worker age above 35, despite the average across all workers being just over 42. Eight of the top companies had a median worker age of 30 or below — 28 for Facebook, 29 for Google, and 26 for Epic Games. The upshot is that salaries are high for those who can stay current while gaining years of experience.

In a similar vein, business models have ever-shrinking half-lives. The nature of capitalism is that you have to be better this year than you were last year — not to gain market share but to maintain what you already have. If you want to get ahead, you need asymmetry; otherwise, you get lost in trench warfare. How long would it take for half of Uber’s or Facebook’s business model to become irrelevant? It’s hard to imagine it being more than a couple of years or even months.

In The Business Model Innovation Factory: How to Stay Relevant When the World Is Changing, Saul Kaplan highlights the changing half-lives of business models. In the past, models could last for generations. The majority of CEOs oversaw a single business for their entire careers. Business schools taught little about agility or pivoting. Kaplan writes:

During the industrial era once the basic rules for how a company creates, delivers, and captures value were established[,] they became etched in stone, fortified by functional silos, and sustained by reinforcing company cultures. All of a company’s DNA, energy, and resources were focused on scaling the business model and beating back competition attempting to do a better job executing the same business model. Companies with nearly identical business models slugged it out for market share within well-defined industry sectors.


Those days are over. The industrial era is not coming back. The half-life of a business model is declining. Business models just don’t last as long as they used to. In the twenty-first century business leaders are unlikely to manage a single business for an entire career. Business leaders are unlikely to hand down their businesses to the next generation of leaders with the same business model they inherited from the generation before.

The Burden of Knowledge

The flip side of a half-life is doubling time. A useful guideline for calculating the time it takes something to double is to divide 70 by the percentage growth rate. This formula isn’t perfect, but it gives a good indication. Known as the Rule of 70, it applies only to exponential growth in which the relative growth rate remains constant, such as with compound interest.

The higher the rate of growth, the shorter the doubling time. For example, if the population of a city is increasing by 2 percent per year, we divide 70 by 2 to get a doubling time of 35 years. The Rule of 70 is a useful heuristic; population growth of 2 percent might seem low, but your perspective might change when you consider that the city’s population could double in just 35 years. The Rule of 70 can also be used to calculate the time for an investment to double in value; for example, $100 at 7 percent compound interest will double in just a decade and quadruple in 20 years. The average newborn baby doubles its birth weight in under four months. The average doubling time for a tumor is also four months.
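
In code, the rule is a one-liner, and comparing it against the exact doubling time, ln 2 / ln(1 + r), shows how good the approximation is (a sketch):

    import math

    def doubling_time_rule_of_70(growth_rate_percent):
        # Approximate periods for a quantity to double at a constant growth rate.
        return 70 / growth_rate_percent

    def doubling_time_exact(growth_rate_percent):
        # Exact doubling time under compound growth.
        r = growth_rate_percent / 100
        return math.log(2) / math.log(1 + r)

    print(doubling_time_rule_of_70(2), doubling_time_exact(2))  # 35.0 vs ~35.0
    print(doubling_time_rule_of_70(7), doubling_time_exact(7))  # 10.0 vs ~10.2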

We can see how information changes in the figures for how long it takes for a body of knowledge to double in size. The figures quoted by Arbesman (drawn from Little Science, Big Science … and Beyond by Derek J. de Solla Price) are compelling, including:

  • Time for the number of entries in a dictionary of national biographies to double: 100 years
  • Time for the number of universities to double: 50 years
  • Time for the number of known chemical compounds to double: 15 years
  • Time for the number of known asteroids to double: 10 years

Arbesman also gives figures for the time taken for the available knowledge in a particular field to double, including:

  • Medicine: 87 years
  • Mathematics: 63 years
  • Chemistry: 35 years
  • Genetics: 32 years

The doubling of knowledge increases the learning load over time. As a body of knowledge doubles, so does the cost of wrapping your head around what we already know. This cost is the burden of knowledge. To be the best in a general field today, you have to know more than the person who was the best only 20 years ago. Not only do you have to be better to be the best, but you also have to be better just to stay in the game.

The corollary is that because there is so much to know, we specialize in very niche areas. This makes it easier to grasp the existing body of facts, keep up to date on changes, and rise to the level of expert. The problem is that specializing also makes it easier to see the world through the narrow focus of your specialty, makes it harder to work with other people (as niches are often dominated by jargon), and makes you prone to overvalue the new and novel.


As we have seen, understanding how half-lives work has numerous practical applications, from determining when radioactive materials will become safe to figuring out effective drug dosages. Half-lives also show us that if we spend time learning something that changes quickly, we might be wasting our time. Like Alice in Through the Looking-Glass — the source of the Red Queen Effect — we have to run faster and faster just to stay where we are. So if we want our knowledge to compound, we’ll need to focus on the invariant general principles.


Members can discuss this post on the Learning Community Forum.

Friction: The Hidden Reality of What Holds People Back

How is it that two people delivering the same value to organizational outcomes, in the same role at the same pay, can have a massively different value to the organization itself?


Here's a common problem that a lot of people are unaware of: John is a remarkable employee. He delivers day in and day out. Jane is equally remarkable and delivers just as well. They're identical twins except for one difference. That one difference makes Jane incredibly valuable to the organization and makes John much less valuable.

That difference is friction.

To get John to do what he's supposed to do, his boss comes in and hits him over the head every day. John can't keep track of what he's supposed to do, and he does things only when instructed.

Jane, on the other hand, shows up knowing what she's supposed to do and doing it. She delivers without any added work from her boss.

John and Jane have the same boss. The amount of effort required to get John to do something is 10 times the amount of effort required to get Jane to do something.


Let's shift our perspective here.

From John's point of view, he's competent and capable, even if he's not ambitious or highly motivated.

From Jane's point of view, she's equally competent and capable and wonders why she's treated the same as John when she does the same amount of work with way less hassle.

From the boss's point of view, they're both valuable employees, but they are not equally valuable. Jane is much more valuable than John. If one of them had to be let go, it'd be John.


Let's invert the problem a little. Instead of asking what more you can do to add value, ask what you can remove.

One of the easiest ways to increase your value to an organization is to reduce the friction required to get you to do your job. You don't need to learn any new skills for this; you just have to shift your perspective to your boss's point of view and see how hard it is for them to get you to do something. Like nature, which removes mistakes to progress, you can remove things to not only survive but thrive. (This is one of the ways we can apply via negativa, an important mental model.)

Think about it this way. Your boss, like you, has 100 units of energy a day with which to accomplish something. If the two of you spend 10 units apiece just getting you to do the thing you already know you should be doing, you're making yourself look bad, despite the amazing quality you deliver. And you're making your boss look less productive than they really are in the process.

When we think of improving our value to an organization, we often think about the skills we need to develop, the jobs we should take, or the growing responsibility we have. In so doing, we miss the most obvious method of all: reducing friction. Reducing friction means that the same 100 units of energy are going to get you further, which is going to get your boss further, which is going to get the organization further.


Members can discuss this post on the Learning Community Forum.

Survival of the Kindest: Dacher Keltner Reveals the New Rules of Power

When Pixar was dreaming up the idea for Inside Out, a film that would explore the roiling emotions inside the head of a young girl, they needed guidance from an expert. So they called Dacher Keltner.

Dacher is a psychologist at UC Berkeley who has dedicated his career to understanding how human emotion shapes the way we interact with the world, how we properly manage difficult or stressful situations, and ultimately, how we treat one another.

In fact, he refers to emotions as the “language of social living.” The more fluent we are in this language, the happier and more meaningful our lives can be.

We tackle a wide variety of topics in this conversation that I think you’ll really enjoy.

You’ll learn:

  • The three main drivers that determine your personal happiness and life satisfaction
  • Simple things you can do every day to jumpstart the “feel good” reward center of your brain
  • The principle of “jen” and how we can use “high-jen behaviors” to bootstrap our own happiness
  • How to have more positive influence in our homes, at work, and in our communities
  • How to teach your kids to be more kind and empathetic in an increasingly self-centered world
  • What you can do to stay grounded and humble if you are in a position of power or authority
  • How to catch our own biases when we’re overly critical of another’s ideas (or overconfident in our own)

And much more. We could have spent an hour discussing any one of these points alone, but there was so much I wanted to cover. I’m certain you’ll find this episode well worth your time.



An edited copy of this transcript is available to members of our learning community or for purchase separately ($7).

If you liked this, check out all the episodes of The Knowledge Project.


Members can discuss this post on the Learning Community Forum.

The Law of Unintended Consequences: Shakespeare, Cobra Breeding, and a Tower in Pisa

“When we try to pick out anything by itself, we find it hitched to everything else in the universe”

— John Muir

In 1890, a New Yorker named Eugene Schieffelin took his intense love of Shakespeare's Henry VI to the next level.

Most Shakespeare fanatics channel their interest by going to see performances of the plays, meticulously analyzing them, or reading everything they can about the playwright's life. Schieffelin wanted more; he wanted to look out his window and see the same kind of birds in the sky that Shakespeare had seen.

Inspired by a mention of starlings in Henry VI, Schieffelin released 100 of the non-native birds in Central Park over two years. (He wasn't acting alone — he had the support of scientists and the American Acclimatization Society.) We can imagine him watching the starlings flutter off into the park, hoping they would survive and perhaps breed. Which they did. In fact, the birds didn't just survive; they thrived and spread like weeds.

Unfortunately, Schieffelin's plan worked too well. Far, far too well. The starlings multiplied exponentially, spreading across America at an astonishing rate. Today, we don't even know how many of them live in the U.S.; official estimates range from 45 million to 200 million. Most, if not all, are descended from Schieffelin's initial 100 birds. The problem is that starlings are an alien species: they were introduced into an ecosystem they were not naturally part of, and the local species had (and still have) no defense against them.

If you live in an area with a starling population, you are doubtless familiar with the hardy, fearless nature of these birds. They gather in enormous flocks, destroying crops, snatching food supplies from native birds, and scavenging in cities. Starlings now consume millions of dollars' worth of crops each year and have caused fatal airplane crashes. They also spread diseases, including E. coli and salmonella infections.

Schieffelin's starlings are a prime example of unintended consequences. In Best Laid Plans: The Tyranny of Unintended Consequences and How to Avoid Them, William A. Sherden writes:

Sometimes unintended consequences are catastrophic, sometimes beneficial. Occasionally their impacts are imperceptible, at other times colossal. Large events frequently have a number of unintended consequences, but even small events can trigger them. There are numerous instances of purposeful deeds completely backfiring, causing the exact opposite of what was intended.

We all know that our actions and decisions can have surprising reverberations that have no relation to our initial intentions. This is why second-order thinking is so crucial. Sometimes we can open a Pandora’s box or kick a hornet’s nest without realizing it. In a dynamic world, you can never do merely one thing.

Unintended consequences arise because of the chaotic nature of systems. When Schieffelin released the starlings, he did not know the minutiae of the ecological and social systems they would be entering. As the world becomes more complicated and interconnected, the potential for ever more serious unintended consequences grows.

All too often when we mess with complicated systems, we have no more control over the outcomes than we would if we performed shamanistic dances. The simple fact is that we cannot predict how a system will behave through mathematical models or computer simulations or basic concepts like cause and effect or supply and demand.

In The Gene: An Intimate History, Siddhartha Mukherjee writes that unintended consequences can be the result of scientists failing to appreciate the complexity of systems:

The parables of such scientific overreach are well-known: foreign animals, introduced to control pests, become pests in their own right; the raising of smokestacks, meant to alleviate urban pollution, releases particulate effluents higher in the air and exacerbates pollution; stimulating blood formation, meant to prevent heart attacks, thickens the blood and results in an increased risk of blood clots to the heart.

Mukherjee notes that unintended consequences can also be the result of people thinking that something is more complex than it actually is:

… when nonscientists overestimate complexity — “No one can possibly crack this code” — they fall into the trap of unanticipated consequences. In the early 1950s, a common trope among some biologists was that the genetic code would be so context dependent — so utterly determined by a particular cell in a particular organism and so horribly convoluted — that deciphering it would be impossible. The truth turned out to be quite the opposite: just one molecule carries the code, and just one code pervades the biological world. If we know the code, we can intentionally alter it in organisms, and ultimately in humans.

As was mentioned in the quote from Sherden above, sometimes perverse unintended consequences occur when actions have the opposite of the desired effect. From The Nature of Change or the Law of Unintended Consequences by John Mansfield:

An example of the unexpected results of change is found in the clearing of trees to make available more agricultural land. This practice has led to rising water tables and increasing salinity that eventually reduces the amount of useable land.

Some additional examples:

  • Suspending problematic children from school worsens their behavior, as they are more likely to engage in criminal behavior when outside school.
  • Damage-control lawsuits can lead to negative media attention and cause more harm (as occurred in the notorious McLibel case).
  • Banning alcohol has, time and time again, led to higher consumption and the formation of criminal gangs, resulting in violent deaths.
  • Abstinence-based sex education is often followed by a rise in teenage pregnancies.
  • Many people who experience a rodent infestation will stop feeding their cats, assuming that this will encourage them to hunt more. The opposite occurs: well-fed cats are better hunters than hungry ones.
  • When the British government offered financial rewards for people who killed and turned in cobras in India, people, reacting to incentives, began breeding the snakes. Once the reward program was scrapped, the population of cobras in India rose as people released the ones they had raised. The same thing occurred in Vietnam with rats.

This phenomenon, of the outcome being the opposite of the intended one, is known as “blowback” or the Cobra effect, for obvious reasons. Just as with iatrogenics, interventions often lead to worse problems.

Sometimes the consequences are mixed and take a long time to appear, as with the famous Leaning Tower of Pisa. From The Nature of Change again:

When the tower was built, it was undoubtedly intended to stand vertical. It took about 200 years to complete, but by the time the third floor was added, the poor foundations and loose subsoil had allowed it to sink on one side. Subsequent builders tried to correct this lean and the foundations have been stabilised by 20th-century engineering, but at the present time, the top of the tower is still about 15 feet (4.5 meters) from the perpendicular. Along with the unexpected failure of the foundations is the unexpected consequence of the Leaning Tower of Pisa becoming a popular tourist attraction, bringing enormous revenue to the town.

It's important to note that unintended consequences can sometimes be positive. Someone might have a child because they think parenthood will be a fulfilling experience. If their child grows up and invents a drug that saves thousands of lives, that consequence is positive yet unplanned. Pokémon Go, strange as it seemed, encouraged players to get more exercise. The creation of no-man's-lands during conflicts can preserve the habitats of local wildlife, as has occurred along the former Berlin Wall. Sunken ships form coral reefs where wildlife thrives. Typically, though, when we talk about the law of unintended consequences, we're talking about negative consequences.

“Any endeavor has unintended consequences. Any ill-conceived endeavor has more.”

— Stephen Tobolowsky, The Dangerous Animals Club

The Causes of Unintended Consequences

By their nature, unintended consequences can be a mystery. I’m not a fan of the term “unintended consequences,” though, as it’s too often a scapegoat for poor thinking. There are always consequences, whether you see them or not.

When we reflect on the roots of consequences that we failed to see but could have, we are liable to build a narrative that packages a series of chaotic events into a neat chain of cause and effect. A chain that means we don’t have to reflect on our decisions to see where we went wrong. A chain that keeps our egos intact.

Sociologist Robert K. Merton identified five potential causes of consequences we failed to see:

  1. Our ignorance of the precise manner in which systems work.
  2. Analytical errors or a failure to use Bayesian thinking (not updating our beliefs in light of new information).
  3. Focusing on short-term gain while forgetting long-term consequences.
  4. The requirement for or prohibition of certain actions, despite the potential long-term results.
  5. The creation of self-defeating prophecies (for example, due to worry about inflation, a central bank announces that it will take drastic action, thereby accidentally causing crippling deflation amidst the panic).

Most unintended consequences are just unanticipated consequences.

Drawing on logical fallacies and mental models, and keeping Schieffelin’s starlings in mind, we can identify several more possible causes of consequences that we likely should have seen in advance but didn’t:

Over-reliance on models and predictions — mistaking the map for the territory. Schieffelin could have made a predictive model of how his starlings would breed and would affect their new habitat. The issue is that models are not gospel and the outcomes they predict do not represent the real world. All models are wrong, but that doesn’t mean they’re not useful sometimes. You have to understand the model and the terrain it’s based on. Schieffelin’s predictive model might have told him that the starlings’ breeding habits would have a very minor impact on their new habitat. But in reality, the factors involved were too diverse and complex to take into account. Schieffelin’s starlings bred faster and interacted with their new environment in ways that would have been hard to predict. We can assume that he based his estimations of the starlings’ future on their behavior in their native European range.

Survivorship bias. Unintended consequences can also occur when we fail to take into account all of the available information. When predicting an outcome, we have an inherent tendency to search for other instances in which the desired result occurred. Nowadays, when anyone considers introducing a species to a new area, they are likely to hear about Schieffelin’s starlings. And Schieffelin was likely influenced by stories about, perhaps even personal experiences with, successfully introducing birds into new habitats, unaware of the many ecosystem-tampering experiments that had gone horribly wrong.

The compounding effect of consequences. Unintended results do not progress in a linear manner. Just as untouched money in a savings account compounds, the population of Schieffelin’s starlings compounded over the following decades. Each new bird that was hatched meant more hatchlings in future generations. At some point, the bird populations reached critical mass and no attempts to check their growth could be successful. As people in one area shot or poisoned the starlings, the breeding of those elsewhere continued.
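
A toy compound-growth model makes the point (the 30 percent annual growth rate is entirely invented; real population dynamics are messier and eventually hit resource limits):

    population = 100        # Schieffelin's initial releases
    growth_rate = 0.30      # hypothetical net annual growth

    for year in range(60):  # roughly 1890 to 1950
        population *= 1 + growth_rate

    print(f"{population:,.0f}")  # on the order of hundreds of millions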

Denial. Just as we seek out confirmatory evidence, we are inclined to deny the existence of disconfirming information. We may be in denial about the true implications of our actions. Governments in particular tend to focus on the positive consequences of legislation while ignoring the costs, and negative unintended consequences do not always result in changes being made. Open-plan offices are another instance; they were first designed to encourage collaboration and creativity. Even though research has shown that they have the opposite effect, many companies continue to opt for open offices. They sound like a good idea, and airy offices with beanbags and potted plants might look nice, but those who continue building them are in obvious denial.

Failure to account for base rates. When we neglect to consider how the past will affect the future, we are failing to account for base rates. Schieffelin likely failed to consider the base rates of successful species introduction.

Curiosity. We sometimes perform actions out of curiosity, without any idea of the potential consequences. The problem is that our curiosity can lead us to behave in reckless, unplanned, or poorly thought-through ways. The release of Schieffelin’s starlings was in part the result of widespread curiosity about the potential for introducing European species to America.

The tendency to want to do something. We are all biased towards action. We don’t want to sit around — we want to act and make changes. The problem is that sometimes doing nothing is the best route to take. In the case of Schieffelin’s starlings, he was biased towards making alterations to the wildlife around him to bring Shakespeare’s world to life, even though leaving nature alone is usually preferable.

Mental Models for Avoiding or Minimizing Unintended Consequences

We cannot eliminate unintended consequences, but we can become more aware of them through rational thinking techniques. In this section, we will examine some ways of working with and understanding the unexpected. Note that the examples provided here are simplifications of complex issues. The observations made about them are those of armchair critics, not those involved in the actual decision making.

Inversion. When we invert our thinking, we consider what we want to avoid, not what we want to cause. Rather than seeking perfection, we should avoid stupidity. By considering potential unintended consequences, we can then work backwards. For example, the implementation of laws which required cyclists to wear helmets at all times led to a rise in fatalities. (People who feel safer behave in a more risky manner.) If we use inversion, we know we do not want any change in road safety laws to cause more injuries or deaths. So, we could consider creating stricter laws surrounding risky cycling and enforcing penalties for those who fail to follow them.

Another example is laws which aim to protect endangered animals by preventing new developments on land where rare species live. Imagine that you are a landowner, about to close a lucrative deal. You look out at your land and notice a smattering of endangered wildflowers. Do you cancel the sale and leave the land to the flowers? Of course not. Unless you are exceptionally honest, you grab a spade, dig up the flowers, and keep them a secret. Many people shoot, poison, remove, or otherwise harm endangered animals and plants. If lawmakers used inversion, they would recognize that they want to avoid those consequences, and work backwards.

We have to focus on avoiding the worst unintended consequences, rather than on controlling everything.

Looking for disconfirming evidence. Instead of looking for information that confirms that our actions will have the desired consequences, we should rigorously search for evidence that they will not. How did this go in the past? Take the example of laws regarding the minimum wage and worker rights. Every country has people pushing for a higher minimum wage and for more protection of workers. If we search for disconfirming evidence, we see that these laws can do more harm than good. The French appear to have perfected labor laws. All employees are, on the face of it, blessed with a minimum wage of 17,764 euros per year, a 35-hour work week, five weeks paid holiday, and strict protection against redundancy (layoffs). So, why don’t we all just move to France? Because these measures result in a lot of negative unintended consequences. Unemployment rates are high, as many businesses cannot afford to hire many employees. Foreign companies are reluctant to hire French workers, as they can’t fire them during tough economic times. Everyone deserves a fair minimum wage and protection from abuse of their rights, but France illustrates how taking this principle too far can have negative unintended consequences.

Understanding our circle of competence. Each of us has areas we understand well and are familiar with. When we act outside our circle of competence, we increase the risk of unintended consequences. If you decide to fix your boiler without consulting a plumber, you are acting outside of your circle of competence and have a good chance of making the problem worse. When the British government implemented bounties for dead cobras in India, their circle of competence did not include an understanding of the locals. Perhaps if they had consulted some Indian people and asked how they would react to such a law, they could have avoided causing a rise in the cobra population.

Second-order thinking. We often forget that our actions can have two layers of consequences, of which the first might be intended and the second unintended. With Schieffelin’s starlings, the first layer of consequences was positive and as intended. The birds survived and bred, and Shakespeare fans living in New York got to feel a bit closer to the iconic playwright. But the negative second layer of consequences dwarfed the first layer. For the parents of a child who grows up to invent a life-saving drug, the first layer of consequences is that those parents (presumably) have a fulfilling experience. The second layer of consequences is that lives are saved. When we use second-order thinking, we ask: what could happen? What if the opposite of what I expect happens? What might the results be a year, five years, or a decade from now?


Most unintended consequences are just unanticipated consequences. And in the world of consequences, intentions often don't matter. Intentions, after all, apply only to positive anticipated consequences. Only in rare circumstances would someone intend to cause negative consequences.

So when we make decisions, we must ask: what will the consequences be? This is where having a toolbox of mental models becomes helpful.


Members can discuss this post on the Learning Community Forum.