Mindful Scrolling
A better understanding of cognitive biases can improve your social media habits
Instagram. Snapchat. Facebook. Twitter. YouTube. If you use any of these sites, you are one of the 7 in 10 Americans who use social media to connect with people around the world – these platforms have become a part of our lives and culture. And if you’ve spent more than five minutes on one of these sites, you have probably been subjected to some… interesting opinions. Maybe your weird aunt retweeted something about vaccines causing autism, or a random page with thousands of followers pops up, selling herbs and crystals to align your chakras and keep you healthy. Maybe it’s an account with “fact” in the name, but half of its posts explain that 5G causes COVID-19 symptoms. The comments under these posts are a vicious mix of devout believers and nonbelievers, each side citing articles to prove its case. How did we get here? What forces could lead somebody to advocate for “Flat Earth”? Hint: it happens at the intersection of web design and human psychology. As we move into the future, social media will only expand its influence on the world, for better or for worse. None of this is on anybody’s mind when they pull out their phone to scroll for a few minutes on the toilet, but maybe it should be.
In order to better understand how people arrive at these improbable conclusions, let’s look at the non-human part of the equation first. If you use social media, you are most likely aware that every user sees different posts when they log onto their “for you page.” Behind the scenes, an algorithm customizes each feed to maximize certain metrics. It’s a built-in part of the social media business model. One of those metrics is engagement – essentially, how much attention a post captures in likes, comments, and time on site. The longer someone spends on the site, the more ads the platform can show them, and the more revenue it generates. This seems like a straightforward business model, but it has some alarming consequences.
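To make that concrete, here is a minimal sketch of what an engagement-ranked feed boils down to. This is an illustration under simple assumptions, not any platform’s actual code: the Post fields, the predicted_engagement score, and the build_feed function are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch of an engagement-ranked feed, not any platform's real code.
# "predicted_engagement" stands in for a model's guess at the likes, comments,
# and watch time this user would give this post.
@dataclass
class Post:
    author: str
    text: str
    predicted_engagement: float

def build_feed(candidates: list[Post], feed_size: int = 10) -> list[Post]:
    """Show the user the posts they are predicted to engage with most."""
    ranked = sorted(candidates, key=lambda p: p.predicted_engagement, reverse=True)
    return ranked[:feed_size]
```

Notice that nothing in this loop asks whether a post is true, kind, or representative – only whether it will hold attention.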
The problem with current social media algorithms is that people are fundamentally biased. Negativity bias is the tendency to focus more of our attention on negative information. This is an example of attentional capture – an unconscious and automatic direction of attention toward some kind of stimulus. In the wild, where our brains evolved, this is a lifesaving mechanism (a bear is more pressing than a fish, and deserves our attention), but in the social media world it just leads to a “bad news sells” dynamic. Inside each of us, there is an unconscious pull toward posts that make us sad, scared, or angry. This explains why someone who is pro-life might keep clicking on posts about abortion, and consequently see even more of those posts the next time they log in. Now that everybody is on edge and glued to their screens, confirmation bias can take effect.
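The interaction between negativity bias and engagement ranking can be shown with a toy simulation. Everything here is assumed for illustration – the sentiment scores, the click-probability curve, and a ranker that simply learns from clicks – but the drift it produces is the point.

```python
import random

random.seed(0)

def click_probability(sentiment: float) -> float:
    # Negativity bias (assumed shape): sentiment runs from -1 (grim) to +1
    # (cheerful), and the grimmer the post, the more likely it is to be clicked.
    return 0.2 + 0.6 * max(0.0, -sentiment)

# 200 hypothetical posts with random sentiment and no click history yet.
posts = [{"sentiment": random.uniform(-1, 1), "clicks": 0} for _ in range(200)]

for _ in range(1000):  # many scrolling sessions
    exploit = sorted(posts, key=lambda p: p["clicks"], reverse=True)[:5]
    explore = random.sample(posts, 5)  # some randomness so new posts get seen
    for post in exploit + explore:
        if random.random() < click_probability(post["sentiment"]):
            post["clicks"] += 1

top = sorted(posts, key=lambda p: p["clicks"], reverse=True)[:10]
print(f"mean sentiment of the feed: {sum(p['sentiment'] for p in top) / 10:+.2f}")
```

Run it and the feed’s average sentiment ends up well below zero. Nobody programmed “show bad news”; the loop discovered it.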
In its most general form, confirmation bias can be defined as the tendency to seek confirmatory evidence while disregarding conflicting evidence based on prior convictions. Sometimes it is in our best interest to ignore extraneous information, and not all claims are worth our consideration. Occasionally, though, it can cause people to mislead themselves. Suppose a football fan believes that rainy weather causes their favorite team to lose. To them, it doesn’t matter how many times the team loses on sunny days or wins in the rain. If the team loses just a few times in the rain, they will be convinced. This is confirmation bias.
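To see why the fan’s tally is misleading, count all four possibilities rather than only the one that confirms the theory. The season record below is made up for illustration:

```python
# An invented season record: (weather, result) for each game.
games = ([("rain", "loss")] * 3 + [("rain", "win")] * 3
         + [("sun", "loss")] * 6 + [("sun", "win")] * 6)

# The fan only tallies the confirming cell: three rainy losses, case closed.
confirming = sum(1 for w, r in games if w == "rain" and r == "loss")
print("rainy losses the fan remembers:", confirming)  # 3

# Counting every cell tells a different story: the loss rate is identical
# in rain and in sun, so the weather explains nothing.
def loss_rate(weather: str) -> float:
    results = [r for w, r in games if w == weather]
    return results.count("loss") / len(results)

print("loss rate in rain:", loss_rate("rain"))  # 0.5
print("loss rate in sun:", loss_rate("sun"))    # 0.5
```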

In the scientific community, a lot of effort goes into counteracting confirmation bias specifically, and for good reason: if you ignore all of the bad data from a clinical trial, you might end up with deaths and lawsuits on your hands a few months down the line. Scientists are people, subject to the same biases as the rest of us, but they have created a culture that seeks to minimize mistakes. In that context, the moral imperative to reduce bias is clear, but it doesn’t follow that bias in the general population can be ignored. For example, the political landscape in America is becoming increasingly polarized in part due to confirmation bias, and we are only beginning to experience the consequences of this shift.
According to WordStream, 95,000,000 photos and videos are shared on Instagram every day – far too large a dataset for any one person to keep up with – hence the personalized feed. But remember, the algorithm is not trying to show you a representative sample; it’s trying to maximize engagement. What this means for your feed is that you see a lot of the posts you are most likely to engage with: posts that get you riled up and posts that support your viewpoint. The posts you disagree with are not selected for factual accuracy; they just pull you in with negativity. The posts you do agree with are all you need to see to know that you’re right. It is never easy to say, “I’m sorry, I was wrong,” and one way to avoid those negative emotions is to disregard frustrating evidence and convince yourself that you are right and everyone else is wrong (aka naïve realism). This view of confirmation bias as an emotional reflex also explains the satisfaction some people get from calling someone who disagrees with them online a “sheeple” and advocating for “doing your own research” instead of trusting authorities.
Calling somebody a “sheeple” implies that they do not think for themselves: they only follow the herd, complying with authority figures (the metaphorical shepherd).
No matter how wacky an opinion is, there is somebody out there who agrees. Flat earthers may seem few in number, but worldwide there are enough of them to feel like a community, and the algorithm is really good at showing them each other’s posts. Confirmation bias creates a positive feedback loop between source selection and growing assuredness. For some people, this path leads to beliefs like ‘flat earth theory’ or ‘vaccines cause autism’; for others, it leads to getting really into working out or becoming an advocate for veganism. Social media platforms are inadvertently designed to amplify human biases in a way that lets people develop ideologies, and then find others with the same ideas. There is a word for these insulated, ideologically homogenized communities: echo chambers.
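That feedback loop is easy to caricature in code. The numbers below are my own assumptions, not measurements; the only two ingredients are that confident users seek out agreeing sources, and that agreement is weighted more heavily than disagreement.

```python
import random

random.seed(1)
confidence = 0.5  # probability of deliberately seeking out an agreeing source

for _ in range(100):  # a hundred posts' worth of scrolling
    if random.random() < confidence:
        agrees = True                   # sought-out confirming source
    else:
        agrees = random.random() < 0.5  # a random source: 50/50 on this topic
    # Asymmetric updating: confirming posts boost confidence a lot, while
    # conflicting posts are mostly shrugged off.
    confidence += 0.02 if agrees else -0.005
    confidence = min(max(confidence, 0.0), 1.0)

print(f"confidence after 100 posts: {confidence:.2f}")  # creeps toward 1.0
```

Neither ingredient looks sinister on its own, but together they ratchet: the more convinced you are, the more confirming posts you select, and the more confirming posts you see, the more convinced you become.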
Social Media Induced Polarization (SMIP) is a new but very active area of study among social scientists, and the research (real research, not Facebook “research”) is illuminating. A 2020 study published in Scientific Reports (Sikder et al., 2020) used model social systems to explore the factors that create echo chambers. The authors found a sweet spot of confirmation bias where echo chambers were least common. This means that we might not need to eliminate confirmation bias entirely (it is a part of normal cognitive processing, after all), but it is worth keeping an eye on. They also found that networks with higher degrees of connection could support larger numbers of biased members without losing accuracy. Essentially, the more contacts each account has, the harder it is to form an echo chamber. The social media algorithms of today do this poorly, but apps can be updated and improved. Unfortunately, it is unlikely that huge companies like Meta (the corporation formerly known as Facebook) will change their business models any time soon.
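You can get a feel for this class of model with a short agent-based sketch. What follows is a classic bounded-confidence (Deffuant-style) update, not the paper’s actual model, and all the parameters are invented: agents only move toward contacts whose opinion already falls within a tolerance of their own, so a narrow tolerance plays the role of strong confirmation bias, and the neighborhood size plays the role of network connectivity.

```python
import random

random.seed(2)

def simulate(n_agents=100, n_neighbors=4, tolerance=0.2, steps=20000):
    """Count the opinion clusters left after agents interact on a ring."""
    opinions = [random.uniform(0, 1) for _ in range(n_agents)]
    for _ in range(steps):
        i = random.randrange(n_agents)
        # Pick a contact from i's neighborhood on the ring.
        j = (i + random.randint(1, n_neighbors)) % n_agents
        # Confirmation bias: interact only if the contact already agrees enough.
        if abs(opinions[i] - opinions[j]) < tolerance:
            middle = (opinions[i] + opinions[j]) / 2
            opinions[i] = opinions[j] = middle
    ops = sorted(opinions)
    clusters = 1
    for a, b in zip(ops, ops[1:]):
        if b - a > 0.05:  # a gap in sorted opinions marks a separate cluster
            clusters += 1
    return clusters

for k in (2, 10, 50):
    print(f"{k} neighbors per agent -> {simulate(n_neighbors=k)} opinion clusters")
```

Directionally, the sketch behaves the way the study describes: widening each agent’s circle of contacts makes it harder for small, self-confirming clusters of opinion to survive.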
A 2019 study published in Telematics and Informatics (Min, 2019) provides some hope. This study aimed to separate social media users into different groups, with the main differentiating factors being the users’ awareness of algorithms and the extent to which that knowledge influenced their usage. Some people were completely clueless and passive, while the ‘activist’ group would go out of their way to follow accounts they did not agree with to impede the algorithm, or make use of platform settings to limit the data that these media companies could collect from them. The most compelling aspect of this research is the shift it makes away from portraying users as helpless victims at the mercy of predatory algorithms. People can take measures to protect themselves by getting educated and making informed decisions about their accounts. However, improving the digital component only solves half the problem – it is equally important to have an ‘activist’ mindset about your own biases. When you see six posts in a row with the same take on the same topic, you might want to pause and evaluate. Are you only clicking on these posts because they make you angry? If you have a strong opinion on the subject, it might be worth taking a breath. Then ask yourself whether the things you’re reading actually make sense or whether they just make you feel secure. Next time you find yourself killing a few minutes in a stale waiting room, scroll mindfully.
References:
Chen, J. (2021). Basic psychosocial and biological contributors to confirmation bias. Proceedings of the 2021 International Conference on Public Relations and Social Sciences (ICPRSS 2021). https://doi.org/10.2991/assehr.k.211020.315
Lee, C., Shin, J., & Hong, A. (2018). Does social media use really make people politically polarized? Direct and indirect effects of social media use on political polarization in South Korea. Telematics and Informatics, 35(1), 245–254. https://doi.org/10.1016/j.tele.2017.11.005
Modgil, S., Singh, R. K., Gupta, S., & Dennehy, D. (2021). A confirmation bias view on social media induced polarisation during COVID-19. Information Systems Frontiers. https://doi.org/10.1007/s10796-021-10222-9
Min, S. J. (2019). From algorithmic disengagement to algorithmic activism: Charting social media users’ responses to news filtering algorithms. Telematics and Informatics, 43, 101251. https://doi.org/10.1016/j.tele.2019.101251
Sikder, O., Smith, R. E., Vivo, P., & Livan, G. (2020). A minimalistic model of bias, polarization and misinformation in social networks. Scientific Reports, 10(1). https://doi.org/10.1038/s41598-020-62085-w