This guest post by
Leah Goldrick was originally published on her excellent blog, Common Sense Ethics. Leah acknowledges that confirmation bias is linked to pattern
recognition, which serves a useful purpose. The confirmation bias problem
arises when we seek out information to confirm what we believe and ignore
everything else.
The documentary that
Leah refers to in her first paragraph is worth watching. It illustrates how
easy it was for a group of people who did not appear likely to be particularly
gullible to acquire an unshakeable belief that the end of the world would occur
on 21 May 2011.
Why is it so hard for us to change our beliefs, or to get
other people to change their minds? A new documentary film, Right
Between Your Ears, examines the science and psychology of
how people form convictions. According to producer Kris De Meyer, a
neuroscientist, certain aspects of human psychology make it very hard for
us to be objective or unbiased.
People usually form beliefs by accepting what they've been told by someone they trust: parents, teachers, the media and so on. Our beliefs can change when we learn new things. But once we become convinced of something, our brain operates much as it does with a religious belief, and we may react with anger when challenged. This tendency often leads us to seek out information which confirms what we already believe and to ignore everything else - a cognitive bias called confirmation bias.
It seems obvious why confirmation bias can be a problem - it can prevent us from making good decisions and it makes us rigid thinkers. Someone can easily fool us simply by appealing to our established belief systems. The good news is that there are some practical strategies for overcoming this natural human shortsightedness, which I'll let you in on at the end of the post.
How We Form Beliefs
Let me back up for just a second. What led me to write this
post (besides my abiding interest in critical thinking) was the Shakespeare authorship course I recently took online
via the University of London. Besides covering just about the most
interesting topic ever, the instructor, Dr. Ros Barber, devoted the first
lesson to the science of how beliefs are formed, to cognitive bias, and to how
belief systems can crystallize into orthodoxies which cannot be questioned
without ridicule.
Dr. Barber interviews Kris De Meyer, a neuroscientist and documentary filmmaker currently working at the Department of Neuroimaging at King's College London's Institute of Psychiatry, Psychology and Neuroscience, about how we form our beliefs in the first place.
According to De Meyer, we form fairly rigid belief systems or perceptual frameworks out of necessity as we go through life in order to handle the information continually coming at us. Usually, our perceptual framework serves us quite well. But it can also be a major intellectual handicap when we are confronted with information which undercuts our established belief systems. De Meyer states:
"But beliefs become strongly held and particularly if we build our identity around them, they begin to act as perception filters. Indeed, it might be useful to think of a belief as a perceptual framework, something that helps us make sense of the world around us."
Confirmation Bias
The problem with our perceptions being filtered through our
belief structures is that this filtering can create something called confirmation
bias. We tend to interpret new information in a way that strengthens
our preexisting beliefs. When we are confronted with information
which conflicts with our beliefs, we will often find ways to discard
it. We also tend to search out information which confirms our beliefs
rather than looking for more neutral or contradictory information.
For our general functioning in the world, we must keep our perceptual frameworks fairly rigid. So even when our brain finds data that is anomalous, confirmation bias can lead us to explain it away as an error. Experiments in the 1960s hinted that people are biased towards their beliefs. Later experiments focused on our natural tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives.
Anyone can suffer from confirmation bias: teachers, Shakespeare scholars, even scientists. In one study on confirmation bias involving scientists, over half of laboratory experimental results were inconsistent with the scientists' original hypotheses. In these cases, the scientists were reluctant to consider that data as valid. The anomalous finding was usually classified as a mistake. Even after scientists had produced an anomaly more than once, they would often choose not to follow up.
When we perceive, we construct systems of beliefs inside our heads like a lawyer trying to prove a case. The more strongly we are engaged in a topic, the more likely we are to dismiss contradictory evidence. On both sides of any debate, we each have a system of beliefs telling us that we are right and the other side is wrong.
According to Ros Barber, "[When any conflict happens] it's been described as 'a dialog of the deaf' because people can't hear the other point of view. They just think it's totally invalid."
Cognitive Dissonance
So why does confirmation bias happen? It might be
because of wishful thinking, or because of our limited mental
capacity to process information. It could also have to do with a failure to
imagine alternate possibilities (more on this later). Another explanation
for confirmation bias is that people are afraid
of being wrong, and fail to ask the right probing questions about
their beliefs, instead reasoning from their already held conclusions.
When we are confronted with contradictory evidence, it causes something called cognitive dissonance - mental distress caused by information that doesn't fit in with our current understanding of the world. Cognitive dissonance is uncomfortable and people will sometimes go to great lengths to avoid it.
Cognitive dissonance was first theorized by psychologist Leon Festinger, who argued that we have an inner drive to hold consistent beliefs; holding inconsistent beliefs causes us to feel disharmonious. Festinger studied a cult whose members believed that the earth was going to be destroyed by a flood. He investigated what happened to the cult members, especially the committed ones who had given up their homes and jobs, after the flood did not happen on the predicted date.
The most committed cult members were more likely to rationalize their original beliefs (confirmation bias) even after experiencing cognitive dissonance in the face of the flood not happening. Loosely affiliated members were much more likely to admit that they had simply made a mistake and to move on. The more attached we are to a belief, the harder it is to change it.
How To Think (and Debate) With Less Bias
So what are the best strategies to overcome our natural
human shortsightedness and bias? The first is to keep emotional distance
in reasoning, and the second is to consider the other side (or sides) of any
debate, a technique called the "consider the opposite" strategy.
1. Keep Emotional Distance When Reasoning
Given the natural human tendency towards confirmation bias, it is important to be at least somewhat dispassionate when reasoning and debating. I like to call this emotional distance. Emotional distance is just as much a character trait of a reasonable person as it is a strategy for handling cognitive biases.
Confirmation bias may in part stem from our desire not to be wrong, so by keeping emotional distance you are essentially willing to admit to yourself that you could have some things wrong. Don't be too attached to any particular piece of evidence. In any difficult debate, we may all get certain parts of the puzzle incorrect.
Look out for signs of confirmation bias in yourself. Remember that the more strongly held your beliefs are, the more likely you are to refuse to consider alternative evidence - like the cult members who invested everything in their belief in an impending flood.
Emotional distance also involves viewing debate as dialog rather than as an angry fight. If your ego gets too caught up in defending a certain belief, you are more likely to get angry when contradicted. Angry people usually double down and become more extreme in their point of view rather than considering someone else's. Keep in mind that politeness might actually be the secret weapon for getting someone to overcome their bias. Kris De Meyer suggests:
"When we do feel a pang of anger at being challenged, rather than responding very quickly we can step away from it for maybe a few hours or a day, think carefully about where that person is coming from, and then see if we can give them more constructive comments that then doesn't spark his or her angry response. Because it's those feelings of anger at being misunderstood and of being misrepresented that really are the ones that drive us towards more certainty. And if the conversation becomes amicable, it can be heated and passionate without being acrimonious and negative. The way to use [your knowledge of confirmation bias] is to question yourself and to reflect on your own assumptions and your own interactions with other people."
Maintaining emotional distance is powerful, but it may not be enough to overcome biases, which is why we should also use this second strategy:
2. Consider the Opposite
Confirmation bias may in part be the result of our limited mental capacity to imagine alternative scenarios. The "consider the opposite" strategy helps us to envision how else things might be. In a recent study, this technique was shown to work better than simply attempting to remain objective.
Considering the opposite in everyday practice works like this: you take a look at a set of facts about something. Generally, you would try to discern whether the facts support your belief or not. If you are experiencing confirmation bias, you would probably imagine that the facts do actually support your belief. But when you force yourself to consider the opposite, you instead imagine that the facts point the opposite way, disproving your belief. This helps you to imagine alternatives to what you already believe.
The "consider the opposite" strategy works particularly well with diametrically opposed beliefs, but always bear in mind that there may be more than one alternate possibility. Be willing to entertain various possibilities rather than falling victim to false dichotomies.