How Cognitive Biases Arise and the Role Heuristics Play in This

I have already produced some content about cognitive biases – those shortcuts and automatisms in our thinking that can sometimes lead to serious misjudgements. Today I would like to take a closer look at how and why biases actually arise. What exactly happens in our brains, and what do heuristics have to do with it?

Where does the term “cognitive bias” come from?

The term “cognitive bias” comes from cognitive psychology, which deals with how we humans process information. It is a collective term for systematic, erroneous tendencies in perception, memory, thinking and judgement. Wow, that’s a lot of smart words. Let’s take a closer look:

  • Cognition comes from the Latin cognoscere and means “to recognise” or “to get to know”.
  • A bias is basically an error, a deviation or a distortion.
  • Cognitive biases can, therefore, also be described as “errors in recognition”.
  • Systematic in this context means that the errors do not happen by chance.

Researchers have been studying this phenomenon since as early as the 1950s. At the latest with Daniel Kahneman’s 2011 bestseller “Thinking, Fast and Slow”, the topic also reached a wider audience.

How we process information

To understand what exactly goes wrong when cognitive biases creep in, let’s first look at how our brains process information in the first place. As in all (scientific) fields, there are various theories and models on this subject, and the process of information processing is incredibly complex. I’ll try to explain the whole thing as simply as possible.

Basically, it begins with a so-called external stimulus. We could also simply say that it starts with the information we become attentive to. This information reaches our brain via various possible paths (our sense organs and sensory systems). This is called perception. Simply put, we can see, hear, feel, smell or taste the stimulus.

For a better understanding, let’s pretend that our brain is something like a computer. The information we have just perceived now enters the so-called sensory memory and is filtered there. Countless such stimuli pour in on us every second, and only the relevant part of them is transformed into a storable code. In this way, a cognitive/mental representation is created – an image in the brain, so to speak.

During this transformation, the information is reduced, processed and changed by processes of perception and attention as well as evaluation and memory operations. The goal is to select a reaction based on decision-making processes. Or, to put it more simply: the brain tries to decide how to deal with this information.
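
If the computer analogy helps you, here is a deliberately simplified toy sketch in Python of the pipeline just described – stimulus, sensory memory and filtering, mental representation, reaction. Everything in it (the salience values, the attention threshold, the reactions) is invented purely for illustration; it is not a model of how the brain actually works.

```python
# Toy sketch of the information-processing pipeline described above.
# All names, values and thresholds are invented for illustration only.

def process_stimulus(stimulus, attention_threshold=0.5):
    # 1. Perception: the stimulus arrives via one of our sensory systems.
    perceived = {"content": stimulus["content"], "salience": stimulus["salience"]}

    # 2. Sensory memory: most incoming stimuli are filtered out right away.
    if perceived["salience"] < attention_threshold:
        return None  # never becomes a conscious representation

    # 3. Encoding: the remaining information is reduced and transformed
    #    into a storable mental representation.
    representation = {"summary": perceived["content"], "evaluation": "relevant"}

    # 4. Decision: the brain selects a reaction based on that representation.
    return f"react to: {representation['summary']}"

print(process_stimulus({"content": "loud honking on the street", "salience": 0.9}))
print(process_stimulus({"content": "faint background hum", "salience": 0.1}))
```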

How does the brain decide?

In his highly simplified model, Daniel Kahneman (2011) distinguishes between two systems by which the brain processes information. System 1 is responsible for spontaneous, emotional and intuitive evaluations and thought processes. System 2 evaluates information thoughtfully, logically and rationally.

You can already imagine that System 1 – our intuitive system – is many times faster. In addition, it can work in parallel and thus evaluate more information simultaneously. Compared to this, our rational mind (System 2) is a lame snail.

System 1 was and is essential for our survival. In primaeval times, the rustling in the bush could be a sabre-toothed tiger, and a slow reaction could mean death. But today, too, we react immediately to a honking car when we step onto the road without having paid sufficient attention to the traffic. An extended, rational evaluation of this sound via System 2 could be fatal.

System 1 thus enables us to make decisions faster than we even realise we have to make one, even on the basis of very little, incomplete information.
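
For readers who like code, the speed difference between the two systems can be caricatured like this. The sketch is only an analogy with invented rules and numbers, not Kahneman’s actual model: System 1 answers instantly from stored associations, while System 2 weighs every piece of evidence before deciding.

```python
# Toy caricature of the two systems. The rules, the evidence and the
# 0.5 threshold are invented purely for illustration.

SYSTEM_1_ASSOCIATIONS = {
    "rustling in the bush": "jump back",
    "honking car": "step back onto the pavement",
}

def system_1(stimulus):
    # Fast and frugal: react from stored associations, even on little information.
    return SYSTEM_1_ASSOCIATIONS.get(stimulus, "no immediate reaction")

def system_2(evidence):
    # Slow and deliberate: weigh all available evidence before deciding.
    score = sum(evidence.values()) / len(evidence)
    return "act" if score > 0.5 else "gather more information first"

print(system_1("honking car"))
print(system_2({"speed of the car": 0.9, "distance": 0.8, "direction": 0.7}))
```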

At what points do mistakes happen?

But even in situations that are not life-threatening, we don’t always have the time to fire up System 2 for a decision. Sometimes we don’t think we have that time, are “too lazy” to do so, or the information at hand is simply too complex for us ever to penetrate it fully with System 2. And sometimes we simply do not have enough information at hand.

This is where heuristics – helpful rules of thumb, strategies or mental shortcuts – from System 1 come into play. They are also called judgement heuristics; we use all of them unconsciously, and they are based primarily on past experiences and our beliefs. Their advantage is that they let us reach judgements simply and quickly (saving resources) that are usually sufficiently good (“okayish”). However, the more complex the information or situation, the greater the risk of wrong conclusions and the lower the quality of the judgement made – and this is how distortions arise.

Why do judgement heuristics lead to errors?

At this point, I would like to clear up a misunderstanding: cognitive biases and heuristics are often mixed up, but there is a crucial difference. Biases fundamentally distort perception, making it difficult to understand reality accurately. They can be based on judgement heuristics, a form of System 1 thinking. Heuristics themselves are by no means erroneous, but they are prone to error.

Classically, a distinction is made between three central judgement heuristics; Kahneman and Tversky (1982) added a fourth. Let us take a closer look at them:

  • Availability heuristic: This heuristic is used when judging frequencies or probabilities. For this purpose, memories of similar or comparable events or pieces of information are consulted. The more readily available such a memory is in the individual’s mind, the more probable or frequent the information currently being considered is judged to be. Why can this be problematic? The ease with which the individual recalls similar or comparable events has no causal relationship with the actual probability or frequency of the current stimulus. The perceived correlation is, therefore, imaginary.
  • Representativeness heuristic: This heuristic leads to a single piece of information being judged as representative of a whole class of information. The judgement is thus based on prototypical knowledge. A straightforward translation for all these technical terms is: stereotyping. Why can this be problematic? Similarly to the availability heuristic, the fact that something appears representative has no causal connection with the actual probability. The way representativeness is assessed is also particularly problematic, because it often grossly neglects the sample size.
  • Anchoring heuristic: Here, a value picked up more or less arbitrarily from environmental information is set as a so-called anchor. All subsequent information is evaluated in relation to this anchor. In simpler words: we allow our decisions to be influenced by our environment, even when the environmental information is actually irrelevant to the decision. Why can this be problematic? Anchor effects are used extensively in marketing, and without realising it, we let our judgements be guided by them. “Only 50 euros will help” as the slogan of a fundraising campaign directly answers our question of what sum would be appropriate. Where we might have tended towards 10 € without this anchor, we are now primed for 50 € (see the small sketch after this list). In other situations, actually irrelevant environmental information may distract us from better judgements in this way.
  • Simulation heuristic: When no information is available for a judgement, the judgement is formed based on the imagination, which has been shaped by the individual’s past experiences. The unknown is simulated by recourse to the known. We use this heuristic, for example, to understand and predict the behaviour of others and to answer questions that involve counterfactuals. Why can this be problematic? Well, how much do you trust the weather forecast? And what is it based on? Many simulations, evaluated by powerful computers over long periods of time. Nevertheless, it is often only partially accurate. Yes, our brain is, of course, also a kind of supercomputer. But the simulations we run with this heuristic are based on far less data than a weather forecast, and we often make our decisions in milliseconds. The probability that our “calculation” is wrong because of faulty basic assumptions – i.e. the data on which the simulation is based – is relatively high.
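
To make the anchoring example a little more tangible, here is the small sketch promised in the list above. The 0.6 adjustment weight and the amounts are invented purely for illustration; the point is only that estimates drift towards the anchor instead of staying near the spontaneous value.

```python
import random

# Toy illustration of anchoring: estimates are pulled towards an anchor
# and only partially adjusted back. The 0.6 weight is invented.

def donation_estimate(anchor=None):
    # What we might have given spontaneously, somewhere around 10 euros.
    spontaneous = random.uniform(5, 15)
    if anchor is None:
        return spontaneous
    # With an anchor ("Only 50 euros will help"), the judgement drifts towards it.
    return spontaneous + 0.6 * (anchor - spontaneous)

random.seed(0)
without_anchor = sum(donation_estimate() for _ in range(1000)) / 1000
with_anchor = sum(donation_estimate(anchor=50) for _ in range(1000)) / 1000
print(f"average donation without anchor: {without_anchor:.2f} euros")
print(f"average donation with anchor:    {with_anchor:.2f} euros")
```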

I’ve lost the thread. So how does all this fit together?

So systems that fundamentally serve a good purpose can lead to cognitive distortions and thus to misjudgements. They are activated in different situations:

  • when we have little time to act or think long and hard about a decision
  • when we have little data on which to base our judgements
  • when we have too much data as a basis for our judgement
  • when the brain decides which information to store in memory

Based on the given data, the brain makes these decisions for us using specific rules of thumb – so we decide unconsciously. 

In the process 

  • we neglect specific details and form stereotypes
  • we prefer simpler over more complex information
  • we assume we know what other people think
  • we value things or people we like more than those we don’t like
  • we look for patterns in everything, even in sparse data
  • we notice when something has changed
  • we prefer information that corresponds to our beliefs
  • etc.

John Manoogian and TilmannR have created a very nice visual summary of cognitive biases based on Buster Benson’s category model:

Image credit: Jm3, CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0), via Wikimedia Commons

Conclusion

Have you now understood better why cognitive biases occur, or have I managed to confuse you even more? 

Feel free to write to me and let me know if you want me to explain particular aspects in more detail in another article. I’ll be happy to do that.

Relevant Resources

Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A proposed system and its control processes. Psychology of Learning and Motivation, 2, 89–195. https://doi.org/10.1016/S0079-7421(08)60422-3

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin Random House.


Subscribe to my mailing list and always stay up to date with my latest articles, videos and offers, plus exclusive mailing list content. Jump to subscription.