These 12 cognitive biases will fool your mind


Clear thinking is an art.

That’s why so many people fail at it. 

We are all prone to cognitive biases: systematic errors we make in our everyday thinking.

Welcome to this collection of 12 cognitive biases that can mess up your decision-making, cloud your judgment, and even make you look like an idiot unless you become aware of them.

The dirty dozen cognitive biases we are about to explore are:  

  1. Loss Aversion
  2. Scarcity Bias
  3. Dunning-Kruger Effect
  4. Status Quo Bias
  5. Framing Effect 
  6. Priming Effect
  7. Focusing Illusion
  8. Outcome Bias
  9. Hindsight Bias 
  10. Survivorship Bias
  11. Availability Bias
  12. Confirmation Bias 

By knowing what these cognitive biases are and how to identify them, you gain at least two advantages: First, you can avoid most of the pain and regret caused by cognitive biases. Second, you can make better-informed decisions and get an edge in life. 

1. Loss Aversion

The pain of giving up an object is greater than the value associated with acquiring it

Imagine you’re sitting on $500,000. 

With an average return of 7 percent per year on the stock market, you could expect to double your money in about ten years. Not bad. 
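To sanity-check that claim (assuming the 7 percent average actually holds, which is far from guaranteed), here is a tiny sketch in Python:

```python
# Quick sanity check: how many years does $500,000 take to double
# at an assumed 7% average annual return? (Real returns vary a lot.)
balance = 500_000
years = 0
while balance < 1_000_000:
    balance *= 1.07
    years += 1

print(years)  # 11 -- about a decade, in line with the rule of 72 (72 / 7 ≈ 10.3)
```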

You could also leave the money in your bank account and throw away the key. Not counting inflation, this choice eliminates the risk of losing your money, but it also keeps you from potential gains. 

Even though you’re still young, you fear the impact of a stock market crash. And so, you decide to leave your money in the bank.

Although this choice is irrational, it isn’t surprising. 

Psychologists Amos Tversky and Daniel Kahneman showed that losing $100 feels roughly twice as painful as gaining $100 feels good. Loss aversion is a strong emotion, and we don’t like to lose things we own. 
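One way to see where that “twice as painful” figure comes from is Kahneman and Tversky’s prospect-theory value function. The sketch below uses their rough 1992 parameter estimates purely for illustration:

```python
# Sketch of the prospect-theory value function (Tversky & Kahneman, 1992).
# ALPHA and LAM are their rough median estimates, used here only for illustration.
ALPHA = 0.88  # diminishing sensitivity to the size of gains and losses
LAM = 2.25    # loss-aversion coefficient: losses loom larger than gains

def subjective_value(x: float) -> float:
    """Perceived value of gaining (x > 0) or losing (x < 0) an amount of money."""
    if x >= 0:
        return x ** ALPHA
    return -LAM * ((-x) ** ALPHA)

print(subjective_value(100))   # pleasure of a $100 gain: ~57.5
print(subjective_value(-100))  # pain of a $100 loss: ~-129.4, more than twice as intense
```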

Another example is the fear of missing out. FOMO is the worry that there are better things you could be doing and the feeling that others are experiencing something great without you. They win. You lose. We get itchy when we’re about to miss happy hour, show up late for a store discount, or forget to check our social feeds.

We tend to forget that not all opportunities are created equal. We also forget that FOMO is triggered by others and rarely driven by our own goals. Usually, there’s no real loss in missing out, even if we momentarily feel that way.

2. Scarcity Bias 

The increased desire for things we might lose or that are in limited supply

Scarcity bias is the deceiving sister of loss aversion. 

Our minds associate limited access with higher quality. And this is exploited in business everywhere. Moreover, we generally believe that what’s hard to own is more valuable than something easy to get.

Just think of a 2016 Château Mouton Rothschild – one of the world’s most hyped wines. Clocking in at $600 per bottle, this is not for everyone to enjoy. The high price alone creates scarcity. And great storytelling does the rest for these squeezed grapes. 

Other times, less scarce products are presented as if they were scarce. Take airlines and hotels, where booking sites create false urgency by showing you “7 people are looking at this hotel” while another message warns you “only 2 rooms left at this price”.

Perception always beats reality. And both are effective techniques that use scarcity to generate more sales.

But scarcity bias doesn’t only apply to possessions. It also applies to the people in your life. 

You might take your spouse or best friend for granted. They’ll always be there — until they are not. If a relationship is threatened, that person quickly becomes scarce to you.

The simple reason is that you care more about something when you know you can lose it.

3. Dunning-Kruger Effect 

When people with lower skills overestimate their abilities, while highly skilled people often underestimate theirs

Of all cognitive biases, this is most definitely one you want to avoid. At least if you don’t want to look like a schmuck. 

The Dunning-Kruger Effect is the paradox in which less skilled people overestimate their abilities, while people with above-average skills tend to underestimate theirs.

Sometimes you feel utterly sure of what you know. The problem is that you don’t know what you don’t know. With age, you realize how small your circle of competence is and why big mouths belong to idiots. 

Check out this article about the dangers of the Dunning-Kruger Effect and how to save yourself from the embarrassing visit to Mount Stupid. 

4. Status Quo Bias 

When a proposed change from the current situation is perceived as a loss

Change is hard. That’s why people look a bit offended anytime you propose new ideas. 

You have probably felt the toll of change, too: The new strategy at work. New updates on your computer. Or the feeling of wearing new pants. 

The truth is that most people like things just the way they are (even if they say the opposite). And loads of research explains just how much people hate change.

For example, psychologists Kahneman and Tversky found that people feel more regret over the negative outcomes of new actions than over the negative results of inaction. 

Change requires choice, which can be overwhelming and hard work. Psychologist Barry Schwartz warns that too much choice not only limits your freedom; it can lead to clinical depression. That’s why people prefer to go with the default choice (the status quo), where someone else has decided for them. 

Now you also know why your new year’s resolution is pretty much doomed. But there’s hope, my friend.

These two strategies can limit status quo bias: 

  1. When you want to make a change, make fewer changes at a time 
  2. If you’re going to make big changes, go for small but consistent efforts.

By the way, if you improve something by just 1 percent each day, you will be 37 times better after a year!
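That number is just compounding at work, assuming the daily improvements actually stack multiplicatively:

```python
# 1% better every day, compounded over a year
print(round(1.01 ** 365, 1))  # 37.8 -- roughly 37 times better than where you started
```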

5. Framing Effect

When reaching different conclusions from the same information, depending on how it’s presented

Imagine yourself sitting in the doctor’s office, presented with these two treatment options: 

  • Medical procedure A has a 90 percent chance of survival
  • Medical procedure B has a 10 percent chance of mortality

Even though the two procedures have the same success rate, you would hardly choose option B, simply because of the negative framing. 

The framing effect is a cognitive bias you’ll meet everywhere, especially in sales and business.

Let’s say you go out to buy a used car. The salesman might point to the excellent mileage, which he has picked up on as important to you. Thus, you might overlook the state of the engine and brakes because ‘mileage’ is framed as the vital selling point. 

Research indicates that people respond better to positive framing. So be careful about accepting the first positive offer or message you hear. It is most likely framed to nudge your decision. 

6. Priming Effect 

When initial exposure to a stimulus influences response to a later stimulus

You turn on the news and see a story about a shark attack at a beach in Spain.

A 10-year-old boy is badly injured and hospitalized. The reporter interviews terrified eyewitnesses and roasts a startled Head of Tourism. Finally, the news segment reminds you of a similar shark attack at a Greek beach three years ago. 

Now, answer this: which animal is more deadly – a shark or a deer? 

Well, despite the shark’s bad press, Bambi is 300 times more lethal. Deer collisions on the road are much more common than people paddling into Jaws in the waves. 

So, if you care about your safety, you might be better off at this Spanish beach than on a road trip in your car. 

It’s very human to be influenced by scary, sensational news stories designed to steal your attention. Most people forget to think of probability and exposure when evaluating the real risk of an outcome.

Thus, the Priming Effect can mislead you to draw conclusions based on fear and emotions instead of facts.

7. Focusing Illusion

When you attach too much importance to one factor instead of the sum of factors

For just a moment, let’s dream about moving to the nice weather in Miami. 

You can almost sense the balmy temperatures, gentle sea breeze, and all the beautiful people at the beach. Wouldn’t life be much better if you lived there compared to, for example, Portland, Maine? 

Probably not.

If you consider all everyday life activities – sleeping, commuting, working, paying bills, grocery shopping, cooking, cleaning, etc. – the weather is just a tiny part of life in Miami.

Focusing Illusion is our tendency to compare things on just one attribute instead of looking at the grand scheme of things. 

Get an in-depth understanding of this cognitive bias right here

8. Outcome Bias

Judging the quality of a decision by its outcome once the outcome is already known

Let’s run a thought experiment.

You put together one million monkeys and let them pick stocks. They get a computer and a brokerage account. And off they go, buying and selling stocks in a frenzy that, of course, is entirely random. 

After a week, half the monkeys have made a profit, and the other half a loss. You keep the ones that made a profit and send the other half home. 

A week later, the same happens with the remaining monkeys. One half makes money and stays, and the other half loses money and is sent packing. And so forth.

After ten weeks, about a thousand successful monkey investors remain. After twenty weeks, you should expect just one super monkey to remain. This little guy is now a billionaire!
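If you want to check those numbers, here is a minimal simulation of the thought experiment, with a coin flip standing in for each monkey’s random weekly profit or loss:

```python
import random

# Minimal sketch of the monkey-investor thought experiment: each week every
# remaining monkey randomly makes either a profit or a loss, and only the
# profitable (lucky) half stays for the next round.
monkeys = 1_000_000
for week in range(1, 21):
    monkeys = sum(1 for _ in range(monkeys) if random.random() < 0.5)
    if week in (10, 20):
        print(f"after week {week}: {monkeys} monkeys remain")

# Expect roughly 1,000,000 / 2**10 ≈ 977 monkeys after week 10,
# and around one lone "super monkey" (sometimes zero, sometimes a few) after week 20.
```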

Now, let’s invite more creatures to the story – the media. 

How will they react? 

Most likely, they will throw themselves at this super investor monkey and write about every possible angle on how it became so successful in such a short time. 

What’s his secret? It could be that the ‘chosen monkey’ eats more bananas than the others. Or maybe he meditates while grooming. There has to be something different. Otherwise, how does he stay so brilliantly consistent? 

The monkey investor example shows how we tend to evaluate decisions by their outcome instead of by the decision process. Outcome bias is especially prevalent in situations where luck, randomness, and external factors play a role.

Let me finish this part with a few encouraging and grounding things for you to remember: 

A bad result does not automatically indicate a wrong decision. Likewise, a good result does not always point to a good decision. 

In other words, you need to look at the process, not just the result. 

It’s extremely difficult to pick the winning monkey. 

9. Hindsight Bias 

Filtering past events through present knowledge so that the event looks more predictable than it was

History is random. But did you ever notice how predictable events seem after they have already happened? 

Hindsight Bias can cause temporary relief to those overwhelmed by complexity, but it can’t teach you how the world works.  

Today, it seems very plausible that a housing bubble set off the big 2008 financial crisis. It also makes sense to us today that a single shot in Sarajevo in 1914 would upend the world for the next 30 years. 

But to most experts in 2007 and 1913, respectively, this would have sounded ridiculous. No one saw the devastation lurking around the corner – before it happened. 

If you keep a journal, you know how poor a forecaster you are. If you don’t, just assume that you generally misremember information and distort most of your earlier predictions about any event. 

10. Survivorship Bias 

Overestimating the odds of success while failing to consider the base rate of failure

Did you ever dream of starting the new Amazon or Google? 

Or maybe you wanted to form the next great rock band like Guns N’ Roses or Metallica with your high school mates. 

(Admittedly, the rock group reference is based on my worship of the 1980s scene and disregard of most new music.) 

Excessive media coverage of the rich and famous makes it difficult not to get caught up in the hype.

And if you haven’t given up the pursuit of making it big, you most likely see plenty of examples of similar successful companies and rock stars everywhere. 

Let’s skip to the part you’ll hate.

Chances are that you’ll end up in the bottom 99 percent that makes the top 1 percent possible. 

The media doesn’t cover the humongous graveyard of companies and people who didn’t make it. Don’t believe me? When you Google people who failed in business, you mostly find those who succeeded.

Sometimes you hear that “you need to want it badly enough.” But wanting it is just a small part of the whole. We tend to ignore timing, luck, opportunity, connections, preparation, skills, and, not to forget, all the outliers: the 99-year-old chain smoker, or the high-school dropout who started a successful business. 

These are anything but the norm and a good reminder of survivorship bias. But we love to hear stories of people who defied the odds. 

Let’s end with a historical example. 

During World War II, statistician Abraham Wald investigated how to better protect airplanes from damage. 

The first method was to study the planes that came back, see which areas suffered the most hits, and then reinforce those areas. 

Wald did something much more efficient on the second try: He studied the planes that didn’t make it back. 

You’ll often learn more if you study failure. 

11. Availability Bias

The tendency to think that examples that come readily to mind are more representative than they are

Consider these two statements: 

“Our neighborhood is safe. I’ve never locked my door, not even when I go on vacation.”

“Regular people win the lottery all the time. I just saw a special on TV. I’ve never won, but I could be next!”

When we use these kinds of examples to prove our point, we succumb to availability bias. 

We view the world based on our own experience and whatever quickly comes to mind. You could argue that we prefer a wrong map to no map at all, which, of course, is foolish. Things don’t happen more frequently just because we can think of them. 

Because of availability bias, we overestimate the risk of dying in a plane crash or getting shot on the streets. At the same time, we underestimate dying from less spectacular causes like diabetes or cancer. Drama tends to beat reason. 

You should be extra conscious of availability bias when:

  1. Examples that come to mind happened just recently
  2. You can’t think of at least a few different examples
  3. The narrative is dramatic or vivid.

Furthermore, when something gets repeated enough times, you start to believe it or form false expectations of its risks, such as when you watch the same news headlines over and over.

As your mom told you: “Not everything you see on TV is true.” And mamma is right. That’s why I argue that heavy news consumption makes you anything but informed.

We practice what we know. But when you visit a foreign city, you can’t just pull out a map of your hometown. Make sure you don’t live your life that way. 

12. Confirmation Bias

Interpreting new information so that it becomes compatible with our existing theories, beliefs, and convictions

You and I have to make our minds up on many things to save time and energy. We accumulate shortcuts for decisions and beliefs to avoid a mental meltdown.

During this intuitive process, Confirmation Bias lends you a helping hand.

Confirmation Bias explains why you might be wrong even when you know you’re right. You often see it flourish when you desire an outcome, have deeply rooted beliefs, and are emotionally engaged. Or all three at once. 

Just open any news article and get vague prophecies from journalists and economists like this: “In the medium term, rates will go up,” or “The government’s new policy will fail in the coming years.”

A broken clock is right twice a day. Most so-called experts are not. 

We all like to be right and to have been right all along. But make sure to challenge your mind with counterarguments and contradicting evidence. It’s healthy to get a sanity check. 

You should also check out this article to get more in-depth knowledge about Confirmation Bias and how to cope with it. 

By Kristian Magnus
