
How Cognitive Biases affect decision-making (and what we can do about it)

By Clive Lloyd
Principal Consultant at GYST Consulting Pty Ltd
Developer of the Care Factor Program



Cognitive Biases – An Introduction

As leaders, we are required to do a great deal of problem-solving, planning and decision-making in key areas such as:

• Performance Management
• Cost Forecasting
• Customer & Stakeholder Management
• Safe Operations

We tend to think of our planning and decision-making as conscious acts involving deep thinking and analysis. While this is true some of the time, a great deal of our thinking takes place at an unconscious level. Being aware of this tendency to operate on autopilot, and understanding what we can do about the brain's habit of switching to unconscious processing, is essential to objective decision-making.


The “Lazy” Brain

Our brains represent about 2% of our body weight yet use around 20% of our energy. To manage this cost, the brain seeks to conserve energy by automating movement and cognitive processes, including much of our thinking. The average human brain contains roughly 86 billion neurons (nerve cells), and the large majority of them are located in the cerebellum.

The cerebellum sits at the very base of the brain, and its main role is to help us acquire new skills and, once they are acquired, to make them automatic. This automation can also extend to our decision-making and planning.

Type 1 and Type 2 Thinking

Kahneman (2011) divides our thinking into two subsystems, which he calls System 1 and System 2 and which are often referred to as type 1 and type 2 thinking. Type 1 thinking is fast, intuitive, unconscious thought. Most everyday activities (driving, talking, cleaning and so on) rely heavily on the type 1 system.

The type 2 system is slow, calculating, conscious thought. When you work through a difficult maths problem or think carefully about a philosophical question, you are engaging the type 2 system. From Kahneman's perspective, the big difference between the two is that type 1 thinking is fast and effortless but very susceptible to bias, whereas type 2 is slow and requires conscious effort but is much more resistant to cognitive biases.

Traditionally, intelligence has been associated with type 2 thinking. It would therefore be reasonable to assume that people who are better at type 2 thinking would use it more and so be less vulnerable to bias. However, research shows that people who are very good at type 2 thinking can be even more vulnerable to cognitive biases. This is a deeply counter-intuitive result: why would people with a greater capacity to overcome bias show a greater vulnerability to it?

A number of explanations have been put forward for this result. One relates to overconfidence: if you are accustomed to thinking of yourself as good at avoiding cognitive bias, you become confident in your own judgement, to the point where you (ironically and unconsciously) stop checking your thinking for the very biases you believe you are resistant to.

Too often we become over-confident in our own thinking. We believe we see reality exactly as it is, and that our minds could never be wrong or misjudge a person or situation. This simply isn't the case, and we need to accept these imperfections if we want to make an honest attempt to improve the objectivity of our decision-making.


What are Cognitive Biases?

A cognitive bias is a systematic error in thinking that affects the decisions and judgements people make. In psychology, these errors are usually traced back to heuristics (cognitive shortcuts), which are a product of type 1 thinking.

Some of these biases are broad, energy-saving thinking habits, while others relate to quite specific areas of unconscious processing. Examples of the more generalised cognitive biases include:

  • Black & White thinking
  • Catastrophising
  • Mind Reading
  • Overgeneralising
  • Filtering

All of these biases help the brain make quick (type 1) decisions; however, they can lead to major errors in critical thinking.

There are many examples of specific cognitive biases; in fact, around 100 such biases have been consistently shown to affect our decision-making, some more potently than others. The following four have been identified as particularly consistent, powerful and problematic:

1. Confirmation Bias
2. The Planning Fallacy
3. Anchoring Bias
4. The Fundamental Attribution Error


Confirmation Bias

Confirmation bias happens when you look for information that supports your existing beliefs, and reject data that go against what you believe. This can lead you to make biased decisions, because you don't factor in all of the relevant information.

A 2013 study found that confirmation bias could affect the way that people view statistics. Its authors report that people have a tendency to infer information from statistics that support their existing beliefs, even when the data support an opposing view. That makes confirmation bias a potentially serious problem to overcome when you need to make an objective decision.

Confirmation bias is a common and insidious problem that can keep us from making accurate judgements and decisions in our personal and professional lives. Because it is hardwired into human nature, it is difficult to see and to resist. It is far easier to spot confirmation bias at work in others than in ourselves.


What can we do about Confirmation Bias?

Look for ways to challenge what you think you see. Seek out information from a range of sources, and use an approach such as De Bono’s “Six Thinking Hats” technique to consider situations from multiple perspectives.

Alternatively, discuss your thoughts with others. Surround yourself with a diverse group of people, and don't be afraid to listen to dissenting views. You can also seek out people and information that challenge your opinions, or assign someone on your team to play "devil's advocate" for major decisions.

The primary defence against confirmation bias is a healthy sense of self-awareness coupled with humility. When making decisions and judgements, keep the following thoughts in mind:

  • Why do I hold my current beliefs?
  • What impact would there be on my ego and pride if I were to learn that my views were incorrect?
  • Have I genuinely sought out alternative viewpoints?
  • Is it possible that I am simply wrong?
  • Pretend that you are supporting an alternative viewpoint, and walk through a plausible explanation for that perspective.

Having a healthy understanding of confirmation bias can make you a better critical thinker and decision maker. A good starting point is to observe the bias in others, both in the workplace and in your personal life. When you are feeling passionate about an issue or person, stop yourself and run through the bulleted checklist above. See if you can observe yourself falling victim to confirmation bias. While it can be painful to admit that your beliefs were misguided, it can ultimately result in better decisions and improved relationships. 


The Planning Fallacy

The planning fallacy is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed.

This phenomenon occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than planned. The bias only affects predictions about one's own tasks; when outside observers predict task completion times, they show a pessimistic bias, overestimating the time needed.

Strictly speaking, the planning fallacy requires both that predictions of a current task's completion time are more optimistic than our beliefs about how long similar past tasks took, and that they are more optimistic than the time the task actually ends up taking. In 2003, Lovallo and Kahneman proposed an expanded definition: the tendency to underestimate the time, costs and risks of future actions while at the same time overestimating the benefits of those same actions. Under this definition, the planning fallacy results not only in time overruns but also in cost overruns.


What can we do about the Planning Fallacy?

The good news is that the planning fallacy mainly affects estimates of our own work. Pair people up and use group estimating techniques so that an outside perspective stops unrealistic optimism from creeping in.

Use past experience to guide future estimates. Hold meetings to go over lessons learned, and make sure that you record and manage that organisational knowledge so it isn't lost. Then use that knowledge to help plan similar tasks in the future.

Anchoring Bias

Anchoring is a cognitive bias that leads people to rely too heavily on an initial piece of information offered (known as the "anchor") when making decisions.

During decision-making, anchoring occurs when individuals use this initial piece of information to make subsequent judgements. Estimates close to the anchor tend to be assimilated toward it, while those further away tend to be adjusted in the opposite direction. Once the anchor is set, all subsequent negotiations, arguments and estimates are discussed in relation to it.

This bias affects how all later information is interpreted. For example, the initial price offered for a used car, whether set before or at the start of negotiations, becomes an arbitrary focal point for the rest of the discussion. Prices lower than the anchor may then seem reasonable, perhaps even cheap, to the buyer, even if they are still higher than the car's actual market value.

Put simply, this bias is the tendency to jump to conclusions: to base your final judgement on information gained early in the decision-making process. Think of it as a "first impression" bias. Once you form an initial picture of a situation, it's hard to see other possibilities.

What can we do about the Anchoring Bias?

Anchoring may happen if you feel under pressure to make a quick decision, or if you have a general tendency to act hastily. So, to avoid it, reflect on your decision-making history, and think about whether you've rushed to judgment in the past. 

Then make time to decide slowly (type 2 thinking), and be prepared to ask for more time if you feel pressured. (If someone is pressing aggressively for a decision, that can be a sign that what they are pushing for is not in your best interests.)

The Fundamental Attribution Error

This is the tendency to blame others when things go wrong, instead of looking objectively at the situation. In particular, you may blame or judge someone based on a stereotype or a perceived personality flaw.

For example, if you're in a car accident, and the other driver is at fault, you're more likely to assume that he or she is a bad driver than you are to consider whether bad weather played a role.

The flip side of this is the actor-observer bias: when judging our own behaviour, we tend to place the blame on external events rather than on ourselves.

For example, if you have a car accident that's your fault, you're more likely to blame the brakes or the wet road than your reaction time.

What can we do about the Fundamental Attribution Error?

It's essential to look at situations, and the people involved in them, non-judgementally. Use empathy and, where appropriate, cultural intelligence to understand why people behave the way they do. Also build emotional intelligence, so that you can reflect accurately on your own behaviour.

It's hard to spot psychological bias in ourselves, because it often stems from unconscious (type 1) thinking. For this reason, it can be unwise to make major decisions on your own. Kahneman et al. (2011) reflected on this in a Harvard Business Review article, in which they suggested that important decisions should be made as part of a group process.

In Summary

Psychological bias is the tendency to make decisions or take action in an unknowingly irrational way. To overcome it, look for ways to introduce objectivity into your decision-making, and allow more time for it.

Use tools that help you assess background information systematically, surround yourself with people who will challenge your opinions, and listen carefully and empathetically to their views – even when they tell you something you don't want to hear!


Avoiding Psychological Bias in Decision Making: How to Make Objective Decisions

Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

De Bono, E. (1999). Six Thinking Hats. Boston: Back Bay Books.

Clive Lloyd is an Australian psychologist specialising in Psychological Safety, well-being and mentally healthy workplaces. He is the director of GYST Consulting Pty Ltd and developer of the acclaimed Care Factor Program.


“This workshop very cleverly explains human nature and how it affects safety. I believe this course has completely changed my outlook on safety, as ultimately safety is your own responsibility. It has changed the way I manage myself in pressure situations and stopped me getting upset with things I cannot control. The course was excellent...”


"Clive is a highly motivated and dynamic trainer. He has a deep understanding of his subject and delivers a very powerful message in an incredibly short space of time. This is probably the best training session I have ever attended and I would highly recommend Clive and GYST Consulting."

Business Manager

"The Care Factor Program has enabled me to be conscious of taking responsibility and understanding the how/what/why my thinking is built on. Also, that to change culture, I must empower others by asking questions and not being afraid to intervene"

Mining Engineer  


Want to find out more?

Call GYST Consulting on 07 5533 2103 to discuss your enquiry.