Cognitive Biases

A cognitive bias is a systematic tendency to make mistakes in the way that we think about something. In recent decades cognitive psychologists have uncovered dozens of different cognitive biases, which most people manifest to varying degrees in different situations. Cognitive biases cause us to make systematic mistakes in our reasoning, meaning that we are less likely to form correct beliefs about the world. Many cognitive biases are believed to be the result of subconscious ‘heuristics’: simple rules that we tend to follow in making inferences and judgements without even realising it. These heuristics can often be useful in helping us to make decisions while avoiding a very long and costly analysis of all the details, but they also lead us to make systematic errors in many settings. Unfortunately, even intelligent, highly educated people have been shown to be susceptible to these biases. Indeed, when informed that most people exhibit a certain bias, experimental subjects will often deny that they personally were subject to it, even when the experimental results indicate that they clearly were. Because of their pervasive influence on our reasoning, it is vital to be aware of the major cognitive biases and to make an effort to reduce their effects – it is seldom possible to eliminate them entirely. Some of the more well-established biases are discussed briefly below.

  • Anchoring: over-reliance on the first pieces of information that become available when making decisions, even if this ‘information’ has no real informative value. Charities and other groups use this to encourage people to donate more: they often have predefined donation categories that might start at fairly high values. Even if the person then chooses to donate a significantly lower amount, they have been anchored by the higher figures as to what an ‘appropriate’ sum might be, and thus are likely to give more than they otherwise would have. Similar principles are applied by salespeople in making initial offers.
  • Authority bias: many people tend to give undue weight to the views or opinions of a perceived ‘authority’, even if that person has no special experience or knowledge to warrant this deference. Examples include the tendency to believe celebrity endorsements for medicines and products, and the tendency to seek the views of experts even on subjects totally outside their field of expertise.
  • Availability heuristic: people tend to make judgements about how prevalent something is based on how easy it is for them to think of a relevant example. This is one reason why people systematically overestimate the risks of air travel, terrorism, and natural disasters: there are often prominent examples of such events that people can recall, so they tend to judge them as being much more common than they really are. On the other hand, more mundane risks like dying in a car crash or from heart disease are much less reported, so many people find specific examples harder to think of and thus judge them to be much less common than they really are.
  • Choice-supportive bias: the tendency to evaluate a choice more positively after it has been made. Post-purchase rationalisation is a very common example many of us will be familiar with. Another example found in research is that people tend to overestimate their college grades even many decades after the fact (a finding that may also be related to the overconfidence effect discussed below).
  • Confirmation bias: the tendency to seek out, interpret, and remember evidence in ways that support our existing beliefs. Many studies have found that evidence which disconfirms our existing beliefs is more likely to be challenged, critiqued, or forgotten than evidence which supports them, which is typically accepted much less critically. Many examples from business, law, academia, and politics have been discussed in the literature, but manifestations in everyday life will be more familiar. For example, if we decide for whatever reason that a particular person is grumpy or disagreeable, we are likely to interpret every subsequent encounter in line with this belief, thus reinforcing and confirming our initial belief about the person. We seldom stop to consider whether there might be other explanations for their behaviour, or whether we are being fair in our judgement.
  • Conjunction fallacy: people tend to rate specific, believable examples as more probable than less specific, more general cases. A classic study of this involved asking the question “Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable? 1) Linda is a bank teller, 2) Linda is a bank teller and is active in the feminist movement”. Most people choose the second option, even though two things together (feminist and bank teller) can never be more likely than either of those things alone (feminist, or bank teller, or both); a short worked sketch of the underlying probability rule is given after this list.
  • False consensus effect: a tendency to overestimate how many people agree with our opinions. This may be related to the availability heuristic, in that we tend to associate mostly with people like us, so we can think of many more examples of people expressing similar opinions to ours than of people expressing very different opinions.
  • Hindsight bias: the tendency to describe an event as easily predictable after the fact, even when it was not predicted beforehand and there are few objective indicators by which it could reasonably have been predicted. We like to think that we ‘knew it all along’, even if we really didn’t.
  • Illusory correlation: humans are very good at detecting patterns, and this can often lead us to detect patterns, correlations, and relationships where none in fact exist. Pareidolia is a special case of this, referring to cases where ambiguous visual stimuli are interpreted as having a very particular meaning. It is largely responsible for the phenomenon of people reporting seeing the faces of famous people in their toast, or patterns in the clouds and stars. Many geological features, including the famous ‘face on Mars’, are interpreted as showing faces or other familiar objects even though, viewed objectively, they contain very few clear features of a face and the image is really quite vague.
  • Mere exposure effect: the tendency to prefer things, or to be more likely to believe claims, purely because they are familiar to us. This is the basis for much commercial and political advertising – such advertisements often have little actual content; what matters is that we become familiar with the candidate or movie title or brand name, as we are then demonstrably much more likely to vote for them, see the movie, or purchase the product.
  • Naive realism: the tendency to believe that we perceive the world clearly, directly, and objectively the way it really is, and that those who disagree with us must therefore be ignorant, biased, or immoral. We tend to dramatically underestimate the degree to which our beliefs and perceptions are shaped by expectations, cultural background, desires, circumstances, our own biases, and many other such non-rational factors.
  • Neglect of probability: the tendency to disregard the probability of an outcome when making a decision under uncertainty, treating outcomes or situations as simply ‘safe’ or ‘unsafe’ without distinguishing degrees of risk.
  • Overconfidence effect: the subjective degree of confidence that people place in their own judgements is often grossly disproportionate to the actual accuracy of those judgements. One famous study found that 93% of American drivers rated themselves as better than the median driver, even though at most half of any group can be above the median, while other studies have found that judgements described as “99% certain” turn out to be wrong 40% of the time.
  • Planning fallacy: the tendency to underestimate the amount of time it will take to complete a task, even when past experience clearly indicates a history of such underestimation. Nearly every type of organisation suffers because of the planning fallacy – it is a major reason why projects and programmes run over budget and behind schedule. It contributes to people being late to appointments, students not leaving enough time to complete assignments, and construction projects taking years longer than expected.
  • Subjective validation: a tendency to judge statements to be true if they have a personal meaning or connection to us. This is the basis for much of the credence given to astrology and psychic readings, because people are inclined to believe that events in the world have a personal connection to and significance for their own lives.
  • Zero-risk bias: the preference for completely eliminating small risks over obtaining greater absolute reductions in larger risks (a worked numerical comparison is sketched after this list). Examples include demands for the recall of drugs that have only very rare side effects, or labelling of foods containing ingredients that are very unlikely to cause harm, while at the same time showing little concern for the much larger risks associated with (for instance) driving or not exercising.
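
To make the probability rule behind the conjunction fallacy explicit, here is a minimal worked sketch (writing T for ‘Linda is a bank teller’ and F for ‘Linda is active in the feminist movement’; these labels are introduced here purely for illustration):

    P(T \land F) = P(T) \cdot P(F \mid T) \le P(T), \quad \text{since } P(F \mid T) \le 1

In words: every feminist bank teller is also a bank teller, so the conjunction can never be more probable than either conjunct on its own, however plausible the added detail makes the description sound.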
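
The trade-off behind zero-risk bias can likewise be made concrete with a small worked comparison (the figures are illustrative only and not drawn from any particular study):

    Option A: eliminate a 1-in-100,000 risk entirely (0.001% → 0%); absolute risk reduction = 0.00001
    Option B: halve a 1-in-100 risk (1% → 0.5%); absolute risk reduction = 0.005

Option B prevents five hundred times as much expected harm as Option A, yet many people prefer Option A because it drives one risk all the way down to zero.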

Further Reading

20 cognitive biases graphic: a poster giving a brief outline of twenty common cognitive biases

Wikipedia list of cognitive biases: a comprehensive resource with links to further information

How scientists fool themselves – and how they can stop: a short but informative article from Nature which focuses on biases often affecting scientists