16 Common Cognitive Biases That Lead to Bad Decisions

We all want to believe that we are rational, logical beings who make wise decisions. But the truth is, our thinking is often clouded by cognitive biases – mental blind spots that cause us to make poor judgments and ill-advised choices.

Cognitive biases are errors in thinking that arise from faulty reasoning, inaccurate perceptions, and emotional influences. They’re mental shortcuts that our brains take to simplify information processing. While sometimes useful, these biases can also lead us astray.

Understanding cognitive biases is key to better decision making and clearer thinking. In this post, we’ll explore what cognitive biases are, why they matter, and take a look at some of the most common culprits that muddy our thought processes.

What is a Cognitive Bias?

A cognitive bias is a systematic error in thinking that occurs when people process and interpret information incorrectly. Our brains rely on mental shortcuts (called heuristics) to make judgments quickly. While useful at times, these shortcuts can also lead to irrational, illogical decisions and distorted thinking patterns.

There are over 100 identified cognitive biases that impact how we think and act. Some cause us to seek out or favor information that confirms our existing beliefs (confirmation bias). Others lead us to place too much weight on anecdotal evidence (availability heuristic). And some cause the same facts to produce different judgments depending on how those facts are presented (framing effect).

Why Do Cognitive Biases Matter?

Cognitive biases matter because they lead to poor decisions, faulty reasoning, and bad judgments across all domains of life – from business to relationships to public policy. When our thinking is distorted by mental blind spots, we make sub-optimal choices that can have negative real-world consequences.

For example, confirmation bias (the tendency to seek out information that confirms our preexisting beliefs) is thought to contribute to things like political polarization, discrimination, and flawed analyses. The sunk cost fallacy can cause companies to perpetuate failing projects. And overconfidence from the Dunning-Kruger effect can lead to disastrous management decisions.

Being aware of cognitive biases – and working to overcome them – is crucial for effective decision making, clear reasoning, and productive self-reflection. Let’s look at some of the most pernicious offenders.

Common Cognitive Biases That Distort Our Thinking

1. Confirmation Bias

One of the most well-known cognitive biases, confirmation bias refers to our tendency to seek out, interpret, and remember information that aligns with our pre-existing beliefs and assumptions. We filter out contradictory data that doesn’t fit our mental models.

For example, if you believe a co-worker is incompetent, you’re more likely to pay attention to and remember instances where they make mistakes while overlooking examples of their good work.

Confirmation bias can lead us to make poor decisions by basing judgments on partial information and discounting valid counterevidence.

2. Anchoring Bias

Anchoring bias occurs when we rely too heavily on the first piece of information we receive about something (the “anchor”) when making decisions. Our judgments get anchored to that initial data point, even if it’s arbitrary or irrelevant.

For example, if you see a $1,000 price tag on a product, that number may irrationally influence how much you’re willing to pay, even if the real value is quite different.

This bias causes us to make decisions without adjusting sufficiently away from the anchor. It explains why starting prices and first offers carry so much sway in negotiations.

3. Availability Heuristic

The availability heuristic causes us to estimate the probability of something happening based on how easily examples come to mind. If something is more mentally available or memorable, we think it must be more probable.

For instance, you may irrationally fear being attacked by a shark after seeing sensationalized media coverage of a rare attack, rather than logically assessing the actual low risks.

This bias can lead us to misjudge risks and to over- or underestimate the probability of both good and bad events whenever anecdotes or vivid examples are front of mind.

4. Framing Effect

The framing effect refers to how the same information can lead to completely different decisions, depending on how it’s presented or “framed.” A single fact can be framed in multiple ways, and the frame powerfully shapes our perceptions and choices.

Framing Effect Examples

  • Cost Framing: a “$60 discount off the $100 price” sounds like a better deal than “pay $40 tonight,” even though the final cost is identical.
  • Attribute Framing: “92% fat-free” seems healthier than “contains 8% fat.”
  • Goal Framing: “Lives Saved” motivates differently than “Lives Lost.”
  • Risk Framing: People are more willing to take a risk framed as a potential gain rather than a possible loss.

We are unconsciously swayed by superficial differences in framing that don’t change the objective facts. Awareness of this effect helps us evaluate decisions from multiple perspectives.

5. Sunk Cost Fallacy

The sunk cost fallacy occurs when we justify continued investment of resources (time, money, effort) in an endeavor based on what we’ve already sunk into it rather than on its future prospects. Our reluctance to accept a loss causes us to “throw good money after bad.”

For example, companies may persist with a failing project because so much has already been invested, rather than cutting losses.

A rational decision would ignore sunk costs (which are irrecoverable) and only consider projected costs and benefits going forward. But humans struggle to make that judicious call.
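
To make that concrete, here’s a quick worked example with purely illustrative numbers. Suppose a project has already consumed $900k, finishing it would cost another $200k, and the finished product is expected to bring in $150k:

$$
\underbrace{\$900\text{k}}_{\text{sunk – ignore}} \qquad \$150\text{k (expected benefit)} - \$200\text{k (remaining cost)} = -\$50\text{k} \;\Rightarrow\; \text{stop}
$$

The $900k is painful but unrecoverable either way; only the forward-looking comparison matters, and here it says continuing destroys another $50k of value.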

6. Bandwagon Effect

The bandwagon effect is a psychological bias that causes people to think or act a certain way simply because others are doing so, regardless of their own beliefs. We tend to follow the crowd, adopt popular positions, and go along with majority behavior.

For example, you may vote for a certain political candidate mainly because “everyone else” seems to be supporting them based on polls or social media.

At its core, the bandwagon effect stems from the desire to fit in and “go with the flow.” But blindly embracing majority views circumvents independent, critical analysis.

7. Dunning-Kruger Effect

The Dunning-Kruger effect refers to the cognitive bias where unskilled people mistakenly assess their ability to be much higher than it really is. Essentially, the less you know, the more confident you feel (unaware of your own incompetence).

For instance, a student performing poorly on exams may rate their knowledge of the material very highly due to overconfidence.

Conversely, highly skilled individuals tend to underestimate their relative ability, assuming that tasks easy for them are easy for everyone else. This bias causes poor decisions based on overconfident ignorance or underconfident expertise.

8. Survivorship Bias

Survivorship bias is the logical error of focusing only on the things/people that “survived” some process and overlooking those that didn’t make it through or were unsuccessful. This skews perceptions and leads to overly optimistic conclusions.

For example, looking only at successful entrepreneurs and businesses while ignoring all the failed startups and companies that didn’t make it paints an inaccurately rosy picture.

By only analyzing the “survivors”, we risk failing to understand what differentiates success from failure. Accounting for non-survivors is key to gaining more complete and balanced insights.

9. Loss Aversion

Loss aversion refers to the tendency for people to strongly prefer avoiding losses over acquiring equivalent gains. The psychological impact of losses is more potent than the pleasure from gains.

For instance, someone may choose to keep $100 for certain rather than take a 50% chance of winning $200 or nothing. Losses “hurt” more than gains “help.”

Loss aversion was documented by Kahneman and Tversky as a core element of prospect theory, and studies consistently show people weighting the avoidance of losses more heavily than the achievement of equivalent gains. It explains behaviors like holding losing investments too long out of fear of realizing the loss.
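
A quick look at the arithmetic shows why that choice is revealing (a simple expected-value sketch; the roughly 2-to-1 loss weighting below is the commonly cited estimate from prospect theory):

$$
\mathbb{E}[\text{gamble}] = 0.5 \times \$200 + 0.5 \times \$0 = \$100
$$

On average the gamble is worth exactly the same as the sure thing, so consistently preferring the guaranteed $100 reflects how we weight outcomes, not the math. Prospect theory captures this by weighting losses roughly twice as heavily as equivalent gains – which is why losing $100 feels about as intense as winning $200 feels good.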

10. IKEA Effect

The IKEA effect describes people’s tendency to place a disproportionately high value on things they had a hand in creating, building, or putting effort into – even if the end result is mediocre or flawed.

For example, you may think a simple piece of IKEA furniture you assembled is higher quality than it really is simply because you constructed it yourself.

The psychological investment of labor and effort in the creation process overrides objective judgments of quality and worth, leading us to overvalue self-made products.

11. Hindsight Bias

Hindsight bias, also known as the “knew-it-all-along” effect, is the natural tendency to see past events as having been predictable all along, even when little in the information available at the time pointed to that outcome.

After a major event occurs (like 9/11 or the 2008 financial crisis), many people insist they knew it was going to happen, when in reality hardly anyone anticipated it accurately.

This cognitive blind spot prevents honest evaluation of the unpredictability or randomness of past events. It causes overconfidence about the ability to predict the future.

12. Halo Effect

The halo effect is a type of cognitive bias where one positive trait of a person, company, brand, or product influences how we view all other traits and characteristics – even if they’re unrelated.

For example, if a job candidate went to an elite college, we may wrongly assume they are highly intelligent, hardworking, and experienced despite a lack of evidence.

The halo effect causes us to make overall judgments based heavily on a single positive trait rather than evaluating all characteristics impartially and objectively. It clouds decision-making.

13. Decoy Effect

The decoy effect is a cognitive bias in which consumers’ preference between two options shifts when a third “decoy” option is introduced that is asymmetrically dominated – clearly inferior to one of the original options, but not clearly inferior to the other.

For example, imagine you’re buying a subscription. There’s a $60 plan with basic features and a $100 premium plan, and most people choose the $60 basic. But when the seller adds a $90 decoy with fewer features than the $100 plan, more people opt for the $100 premium – next to the decoy, it suddenly looks like a bargain.

Even though the decoy isn’t meant to be chosen, its presence influences the perceived value and preference between the other two. It’s an irrational distortion of decision-making.
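
Laying the three options side by side (same illustrative prices as above) makes the manipulation visible:

  • $60 basic: cheap, few features – attractive on price alone.
  • $90 decoy: nearly premium-priced, yet fewer features than the premium plan – inferior to the $100 option on every count that matters.
  • $100 premium: just $10 more than the decoy for strictly more features – suddenly the “obvious” deal.

The decoy gives the premium plan something to visibly dominate, and our comparison-driven judgment does the rest.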

14. Cognitive Dissonance

Cognitive dissonance refers to the mental stress and discomfort experienced when we hold two or more contradictory beliefs, ideas, or values simultaneously, or when our beliefs don’t match our behaviors.

The classic example is a smoker who knows that smoking causes cancer yet continues to smoke despite the dissonance between belief and action.

To reduce this psychological tension, people have a motivational drive to change either their discordant beliefs, attitudes, or behaviors through rationalization, justification, or denial. Dissonance is an uncomfortable state that the mind seeks to eliminate.

15. Authority Bias

Authority bias is the tendency to attribute greater accuracy and truthfulness to the opinions and statements of an authority figure, and to be more influenced by that source, compared to non-authorities on the same matter.

For example, being more persuaded by the conclusions of a study because it was conducted by a prestigious university rather than critically evaluating the actual evidence.

While deferring to authority can be a valid decision-making shortcut, taken too far it leads to excessive, automatic deference to authorities, experts, and dogma over one’s own powers of observation and reasoning.

16. Action Bias

Action bias refers to the tendency for people to prefer taking action over no action, even when the consequences of the two choices are equal or the no-action alternative is better.

A manager may choose to implement an unnecessary new policy simply because taking “action” feels more productive than maintaining the status quo, even if there’s no clear benefit.

At its core, action bias stems from the innate human desire to control outcomes, with inaction perceived as unacceptable or irresponsible in many situations. But in reality, disciplined non-action is sometimes the wisest choice.

Recognizing and Overcoming Cognitive Biases

Being aware that cognitive biases exist is step one. But actively recognizing them in our own thinking and decision-making processes takes conscious effort. Here are some tips:

  • Slow down and re-evaluate gut reactions
  • Seek out differing perspectives and contradictory information
  • Support claims with hard data rather than anecdotes
  • Question assumptions and re-frame issues in multiple ways
  • Have others critique your analysis and decision-making process

Ultimately, critical thinking, intellectual humility, and impartial self-reflection are key to overcoming cognitive biases. Our minds are prone to mistakes, so we must routinely check ourselves against objective evidence and logic.

Cognitive biases are pervasive – everyone exhibits them to some degree. But actively working to identify our mental blind spots can mitigate their impact and lead to improved reasoning, better decisions, and clearer thinking.

TL;DR

  • Cognitive biases are systematic errors in thinking that distort our reasoning and decision-making processes.
  • Common cognitive biases like confirmation bias, anchoring bias, the availability heuristic, the framing effect, the sunk cost fallacy, the bandwagon effect, and overconfidence cause poor judgments.
  • These mental blind spots stem from evolutionary mental shortcuts but often lead us astray in the modern world.
  • Recognizing cognitive biases through active questioning, seeking disconfirming evidence, and re-evaluating gut reactions is key to clearer thinking.
  • While difficult to eliminate, we can mitigate the impact of cognitive biases through critical thinking, intellectual humility, and objective self-analysis.

Q&A

Q: What’s the difference between a cognitive bias and a logical fallacy?

A: Cognitive biases and logical fallacies are related, but distinct concepts:

  • Cognitive biases are flaws in our thinking patterns and the ways we process information automatically. They operate at the subconscious level.
  • Logical fallacies are specific errors in the logical structure of an argument or rhetorical technique used to derive a conclusion. They are rooted in faulty reasoning.

So while cognitive biases can certainly contribute to and underlie logical fallacies, the former refers to mental blind spots in perception and judgment, while the latter pertains to faults in reasoning and argumentation.

Q: Do cognitive biases affect experts and highly intelligent people too?

A: Yes. Experts and highly intelligent individuals are susceptible too, including to some biases that affect them in particular:

  • Bias blind spot – readily spotting biases in others while failing to recognize them in oneself.
  • Curse of knowledge – difficulty communicating knowledge or skills because of unconsciously overestimating what others already understand.
  • Choice-supportive bias – retroactively bolstering confidence in a previous choice in order to appear (and feel) consistent.

In general, expertise in one area does not preclude mental blind spots in other domains, nor does it exempt anyone from general human cognitive limitations. So while intelligence may mitigate some biases, it does not make one immune – experts must remain vigilant too.

Q: How can we help children develop better critical thinking abilities?

A: Fostering strong critical thinking skills from an early age is key to combating cognitive biases down the road. Here are some tips for parents and educators:

  • Encourage questioning and intellectual curiosity by asking open-ended questions.
  • Discuss real-world examples of flawed logic or poor decisions.
  • Practice reframing issues from multiple perspectives.
  • Emphasize humility and openness to changing one’s mind in the face of evidence.
  • Model intellectual honesty by admitting mistakes and biases openly.
  • Reward analytical thinking over asserting absolutes based on intuition.

The earlier we instill habits of impartial analysis and skepticism of mental shortcuts, the better prepared children will be to make rational choices as adults.
