The Not So Fast Blog
A Filter Bubble is Born
4 ways our brains are inclined to bias and what we can do about it
Our brains deal with tons of information every day. One way we process it is by making assumptions, based on social cues, on which to base our thoughts, opinions, and actions. Sometimes these assumptions backfire and bias the way we consume information and media. Because of this information-labeling system, how open-minded and receptive we are to a piece of information often depends more on our bias toward its source than on the reliability of its content. Here are some of the ways our brains are inclined to these biases, and what you can do about them.
We automatically sort people into one of two categories: those we identify with (members of the ingroup) and those we don't identify with (members of the outgroup). These categorizations are based on numerous factors, including but not limited to appearance, opinions, beliefs, affiliations, interests, and even political leanings.
Our brains assume that people we identify with will be like-minded, which is often but not always true. Because of this, we are more likely to blindly believe information coming from people we identify with, regardless of its factual basis. We are more likely to be skeptical of information presented by people we don’t identify with and assume it is incorrect or malicious.
If you meet someone wearing a t-shirt of a band or artist you dislike, you're far more likely to be biased against them or write them off, because they're a member of the outgroup. If you meet someone wearing a shirt of a band or artist you enjoy, you're more likely to identify with them as a member of the ingroup and blindly assume that they're correct or know what they're talking about.
In addition to ingroup/outgroup bias, we are also more likely to believe information that is consistent with what we already think and believe. This tendency is called confirmation bias. It makes us both less likely to believe, and more likely to over-criticize or dismiss, information that challenges our existing beliefs or assumptions, regardless of its factual basis.
The majority of media coverage about dogs is positive, with only a small, sensationalized portion being negative. Confirmation bias can be seen in people who dislike or fear dogs, or who believe they are dangerous. These people tend to put more trust, faith, and weight in the negative stories they hear about dogs. They focus on stories of a dog attacking someone or acting viciously, while disregarding or giving little value to positive stories about dogs protecting people, acting as heroes, or just doing something really cute.
Our brains can sometimes mistake quantity (how widely a piece of information has been spread) for quality of information. When the people around us all share the same or similar opinions and statements, we will often shift our own statements and opinions to match, even if we personally disagree. This process is known as conformity, and it occurs because our brains take cues from others: when there is a noticeable majority, our brain assumes it must be “correct”. This becomes a problem when we consume popular or widely spread opinions or misinformation that aren't grounded in fact and adopt them as our own stance. The more people who hold the same perspective, the stronger the effect, so conformity can lead to a domino effect of spreading misinformation.
This effect can occur in almost any context, from personal circles and interactions to news media and politics, but the clearest examples are in popular online challenges, trends, and viral social media posts. In February 2020, the standing broom challenge became popular. It claimed that, “according to NASA,” the position of the moon meant it was the one day of the year you could get a broom to stand upright on its bristles. (NASA has never said this; a broom's ability to stand upright has nothing to do with the position of the moon, and it will balance on its bristles according to its center of gravity on any day.) People who would typically be skeptical of these viral challenges and hoaxes often get swept up in them because the sheer number of people sharing them tricks our brains into assuming there must be some validity to the claims. Claims like the broomstick challenge can usually be confirmed or debunked with a quick Google search.
When you are among a group of people who share a similar belief, no matter how strongly you hold that belief yourself, interaction with the group tends to intensify it. This tendency is called group polarization, and it is one way our beliefs and opinions can be swayed with little factual evidence as we consume information. It can be particularly relevant when discussing politics: just through discussion with like-minded people, you may come to believe and more strongly support facts, claims, and statements you were initially skeptical about, without any additional evidence. Something to keep in mind is that group polarization has been found to be even stronger online than in person.
Groups like anti-vaxxers capitalize on the effects of group polarization to recruit more supporters. Parents who are skeptical or have concerns about vaccines will naturally seek out more information. After consuming anti-vaccination content or engaging in discussion with anti-vaxxers, that initial mild concern can become more serious or extreme, and the escalation often turns these parents into supporters of the anti-vaccination movement. This tactic works despite the fact that the movement is built on a biased (and unethical) study falsely implying that vaccines cause autism, a study that has been retracted and disproven multiple times. Because a causal link between vaccines and autism has never been established, anti-vaxxers rely on unreliable and biased information, such as personal opinions and anecdotes, to further their case.
What can you do about this?
These biases mean it's important to actively evaluate the information we consume as we take it in, especially now that we are dealing with widespread media and political misinformation. It never hurts to be skeptical of the information we consume, and fact-checking is always a must before we form our own opinions from it or spread it further.
Understanding that our brains work in these ways also highlights the importance of exposing ourselves to information from multiple different and opposing viewpoints, and of genuinely engaging with it. This variety gives us more opportunities and a broader basis for developing our own ideas on a subject. Remember, the best way to avoid these biases is to stop, think, and check what we are seeing before we join in.
The Not So Fast Campaign thanks Tamsin Mahalingham for today’s post. She is currently completing a Bachelor of Psychology with Honors at Curtin University in Australia and is set to graduate at the end of 2020!
To learn more about the information in this post, check out these great resources.