Why your brain’s built-in biases insulate your beliefs from contradictory facts | Opinion
These psychological tendencies explain why an onslaught of facts won’t necessarily change anyone’s mind (Francesco Carta fotografo/Moment via Getty Images)
By Jay Maddock
A rumor started circulating back in 2008 that Barack Obama was not born in the United States. At the time, I was serving as chair of the Hawaii Board of Health. The director and deputy director of health, both appointed by a Republican governor, inspected Obama’s birth certificate in the state records and certified that it was real.
I would have thought that this evidence would settle the matter, but it didn’t. Many people thought the birth certificate was a fabricated document. Today, many people still believe that President Obama was not born in the U.S.
More recently, I was listening to a “Science Friday” podcast on the anti-vaccination movement. A woman called in who didn’t believe that vaccines were safe, despite overwhelming scientific evidence that they are. The host asked her how much proof she would need in order to believe that vaccines were safe. Her answer: No amount of scientific evidence could change her mind.
As a psychologist, I was bothered, but not shocked, by this exchange. There are several well-known mechanisms in human psychology that enable people to continue to hold tight to beliefs even in the face of contradictory information.
Cognitive shortcuts come with biases
In its early days, the science of psychology assumed that people make rational decisions. But over the decades, it has become clear that many of the decisions people make – about everything from romantic partners and finances to risky health behaviors like unsafe sex, as well as health-promoting ones – are not made rationally.
Instead, human minds have a tendency toward several cognitive biases. These are systematic errors in the way you think about the world. Given the complexity of the world around you, your brain cuts a few corners to help you process complex information quickly.
For example, the availability bias refers to the tendency to rely on information you can quickly recall. This is helpful when you're ordering ice cream at a place with 50 flavors: You don't need to think about all of them, just one you recently tried and liked. Unfortunately, these shortcuts can also lead you to a nonrational decision.
Closely related to these biases is cognitive dissonance. This is the feeling of discomfort you can experience when your beliefs are not in line with your actions or with new information. When in this state, people can reduce their dissonance in one of two ways: changing their beliefs to fit the new information, or interpreting the new information in a way that justifies their original beliefs. In many cases, people choose the latter, whether consciously or not.
For example, maybe you think of yourself as active, not at all a couch potato – but you spend all of Saturday lying on the couch bingeing reality TV. You can either start thinking about yourself in a new way or justify your behavior, maybe by saying you had a really busy week and need to rest up for your workout tomorrow.
The confirmation bias is another process that helps you justify your beliefs. It involves favoring information that supports your beliefs and downplaying or ignoring information to the contrary. Some researchers have called this "my side blindness" – people see the flaws in arguments that contradict their own but are unable to see weaknesses in their own side. Picture fans of a football team that went 7-9 for the season arguing that their team is actually really strong, spotting failings in other teams but not in their own.
With the decline of mass media over the past few decades and the increase in niche media and social media, it’s become easier to surround yourself with messages you already agree with while minimizing your exposure to messages you don’t. These information bubbles reduce cognitive dissonance but also make it harder to change your mind when you are wrong.
Shoring up beliefs about yourself
It can be especially hard to change beliefs that are central to your self-concept – that is, who you think you are. For example, if you believe you're a kind person and you cut someone off in traffic, instead of thinking that maybe you're not all that nice, it's easier to think the other person was driving like a jerk.

This relationship between beliefs and self-concept can be reinforced by affiliations with groups like political parties, cults or other like-minded thinkers. These groups are often belief bubbles, in which most members believe the same thing and repeat those beliefs to one another, strengthening the sense that their beliefs must be right.
Researchers have found that people generally think they are more knowledgeable about certain issues than they really are. This has been demonstrated across a variety of studies looking at vaccinations, Russia's invasion of Ukraine and even how toilets work. These ideas then get passed from person to person without being grounded in fact. For example, despite a lack of any evidence of widespread voter fraud, only 70 percent of Republicans say they believe the 2020 presidential election was free and fair.
Belief bubbles and the defenses against cognitive dissonance can be hard to break down. And they can have important downstream effects. For instance, these psychological mechanisms shaped whether people chose to follow public health guidance on social distancing and mask-wearing during the COVID-19 pandemic, sometimes with deadly consequences.
Changing people’s minds is difficult. Given the confirmation bias, evidence-based arguments counter to what someone already believes are likely to be discounted. The best way to change a mind is to start with yourself. With as open a mind as you can summon, think about why you believe what you do. Do you really understand the issue? Could you think about it in a different way?
As a professor, I like to have my students debate ideas from the side they personally disagree with. This tactic tends to lead to a deeper understanding of the issues and prompts them to question their beliefs. Give it an honest try yourself. You might be surprised by where you end up.
Jay Maddock is a professor of public health at Texas A&M University. He wrote this piece for The Conversation, where it first appeared.
Our stories may be republished online or in print under Creative Commons license CC BY-NC-ND 4.0. We ask that you edit only for style or to shorten, provide proper attribution and link to our web site. Please see our republishing guidelines for use of photos and graphics.