As you read this, over 60% of the global population is active on social media. Yet, how many of them are in filter bubbles?
Are you one of them?
“Using Social Media on iPhone – Credit to https://www.lyncconf.com/” by nodstrum is licensed under CC BY 2.0.
The filter bubble
The digital age has transformed how we consume information, and social media platforms have emerged as primary sources. However, this transformation presents a significant challenge: the creation of filter bubbles, often referred to as echo chambers. A filter bubble is an insulated environment in which we encounter information or voices that consistently mirror our own ideas or opinions, largely devoid of diverse perspectives. A striking example is the YouTube algorithm—an advanced system designed to personalize content based on users’ video-watching behaviour, encouraging them to like or subscribe to similar content and extending their time on the platform.
With time, the algorithm customizes viewers’ feeds to cater to their interests, resulting in a stream of content that reinforces specific perspectives, thus forming filter bubbles. This not only restricts the diversity and quality of information individuals receive but also poses a threat to democratic society.
In this article, we will delve into the complexities surrounding filter bubbles, unveiling the fascinating allure that entraps us within them, and exploring the social-political challenges encountered when attempting to break free.
Unveiling the Enchantment of Social Media
Social media plays a vital role in sustaining filter bubbles. These platforms are designed to retain users, because revenue streams are intricately linked to user engagement. A prime example is the infinite scroll—a feature present in major platforms like Instagram, Facebook, and TikTok. This feature entices users to swipe endlessly through content, keeping them engaged without the need to click.
Shut up and just keep scrolling!
“If you don’t give your brain time to catch up with your impulse, you will simply keep scrolling,” emphasized Aza Raskin, the designer of this feature, underscoring its addictive nature.
However, captivating users isn’t solely about a single feature. Personalization systems also play a significant role, deepening engagement while reinforcing the filter bubble. Facebook’s personalized news feed is a prime example, tailoring content to align with users’ preferences and connect them with everyday stories they care about. Yet, inadvertently, this customization narrows their exposure to diverse perspectives. Similarly, TikTok, with its focus on short video content, contributes to filter bubbles by highly personalizing users’ content selection. The widely used #fyp hashtag (short for ‘For You Page’) is one example: it categorizes content and amplifies its discoverability. Machine learning identifies the kinds of content users interact with most, allowing the For You feed to be customized to maximize engagement and satisfaction, and so to prolong users’ time on the platform.
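The feedback loop described above can be illustrated with a toy sketch. This is not any platform’s actual code: the function names, topics, and scoring rule are assumptions chosen purely to show how repeated interactions with one kind of content push a feed toward that content.

```python
# Hypothetical sketch of an engagement-driven recommender.
# Every interaction boosts a topic's score, so the ranked feed
# drifts toward already-favoured topics -- the mechanics of a
# filter bubble in miniature.
from collections import Counter


def update_interests(interests: Counter, topic: str, weight: int = 1) -> None:
    """Record an interaction (like, share, rewatch) with a topic."""
    interests[topic] += weight


def rank_feed(interests: Counter, candidates: list[str]) -> list[str]:
    """Order candidate topics by accumulated engagement, highest first.

    Topics never interacted with score 0 and sink to the bottom.
    """
    return sorted(candidates, key=lambda t: interests[t], reverse=True)


interests: Counter = Counter()
for topic in ["flat-earth", "flat-earth", "cooking", "flat-earth"]:
    update_interests(interests, topic)

feed = rank_feed(interests, ["news", "cooking", "flat-earth"])
# After three flat-earth interactions, that topic tops the feed,
# while the never-clicked "news" topic falls to the bottom.
```

The point of the sketch is that no step is malicious: each update simply optimizes for engagement, yet the cumulative effect is a narrower feed.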
These purposeful designs, aimed at enticing users to spend more time on the platform and generate revenue through targeted advertising, inadvertently contribute to the formation of ever-tighter filter bubbles. The result is a vicious cycle.
Personal Bias in Content Selection
On the other side, we can’t help but find personalized information alluring. Confirmation bias is a well-documented part of human nature: we naturally seek out information that confirms our existing beliefs and sources that validate our worldview.
Part of social media’s allure, therefore, is the opportunity it provides to connect with like-minded individuals, creating a sense of belonging and fulfilling a fundamental human need for connection and affiliation. When we’re on social media, we tend to look for news that supports our point of view, watch reels that pique our interest, and subscribe to people with like-minded perspectives. Over time, this habitual consumption of reinforcing information solidifies our worldview, shaping the reality we perceive. An illustrative example of a filter bubble is the flat-earth theory.
Shockingly, more than one-third of Americans aged between 18 and 24 do not firmly believe that the world is round. Although the idea of a flat earth has been scientifically discredited, the belief seems to grow within a solidified like-minded community.
“I have no problem with anybody that wants to believe we live on a ball. That’s their choice, it’s just not something I resonate with,” says David Weiss, an active flat-earther. He gathers with like-minded people who share this belief in a flat earth. In this community, events like presentations, forums, and awards are often held in many countries, such as Brazil, Britain, and Italy, providing opportunities for believers to meet and interact with influential figures.
Consequently, Weiss remains within a filter bubble, interacting mainly with content and individuals who reaffirm his conviction in a flat earth. His case embodies the nature of filter bubbles.
While personal preference in choosing to receive specific kinds of information from social media is a contributing factor, the sheer volume of information also forces users to make trade-offs between different pieces of information.
There are more than 2.5 quintillion bytes of data created every day. On various social media platforms every minute of the day:
- 46,740 photos are posted on Instagram
- 456,000 tweets are sent on X (formerly Twitter)
- 527,760 photos are shared on Snapchat
The influx of information leaves people with no choice but to make decisions about what to consume. Therefore, due to human nature, people tend to receive information they are interested in, often shared from like-minded perspectives.
So, why should we bother bursting the filter bubble when it seems to make our time on social media more entertaining?
The growth of the techlash has exposed how filter bubbles distort the original aims of social media. Ideally, the Silicon Valley internet serves as a transparent space for the free and interoperable flow of information. Yet ‘techlash’ is no longer a strange word to many, given the widespread use of social media platforms primarily owned by Silicon Valley giants like Facebook, Instagram, and YouTube. These platforms not only exhibit dominance over the digital realm but also wield significant geopolitical influence. They have the power to extract economic value and exercise strategic political influence through their control. The 2016 US presidential election is a prime example, where social media, particularly Facebook, played a crucial role. Through personalized advertising, Facebook tailored political messages to Trump supporters, arguably contributing to Trump’s success and yielding significant politico-economic profits.
Social media, instead of being a space for free expression and the open flow of information, can be shaped into an echo chamber in which only certain political voices gain access, are reinforced, and circulate.
Another notable instance highlighting the urgency of breaking the filter bubble can be seen in TikTok, a massively popular app owned by Beijing-based tech company ByteDance. TikTok has garnered immense popularity worldwide, primarily as an excellent source of entertainment. However, what raises concern is the advanced algorithm it employs, which not only tailors content but also reaches deep into users’ private data.
The success of TikTok is undeniable, boasting 1 billion users across 154 countries and holding the title of the most downloaded app in 2020.
Its entertainment value appeals to a vast audience. However, the algorithms behind TikTok contribute to the formation of filter bubbles, narrowing democratic discourse by reinforcing specific content and perspectives.
TikTok’s recommendation system significantly differs from traditional approaches. While conventional platforms rely primarily on active online actions like following, subscribing, and liking to understand preferences, TikTok takes it a step further by analysing subtle and passive behavioural cues. These cues include how frequently a video is replayed, the speed at which one scrolls past content, and preferences for effects and sounds. This highly responsive recommendation system caters even to passive users, presenting an engaging, personalized content feed more rapidly than other platforms.
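To make the distinction concrete, here is a minimal sketch of scoring content by weighting passive signals (rewatches, watch time, fast scroll-pasts) alongside active ones (likes, follows). The signal names and weights are illustrative assumptions, not TikTok’s actual model.

```python
# Hypothetical engagement scorer: passive behavioural cues count
# alongside explicit actions, so even a user who never likes or
# follows anything still feeds the recommender strong signals.
def engagement_score(signals: dict) -> float:
    weights = {
        "liked": 3.0,               # active signal
        "followed": 5.0,            # active signal
        "rewatches": 2.0,           # passive: replaying a video
        "watch_fraction": 4.0,      # passive: share of the video watched
        "scrolled_past_fast": -2.0, # passive negative signal
    }
    # Unknown signal names contribute nothing.
    return sum(weights.get(k, 0.0) * v for k, v in signals.items())


# A video silently rewatched twice and watched to the end (score 8.0)
# outranks one the user merely liked (score 3.0).
passive = engagement_score({"rewatches": 2, "watch_fraction": 1.0})
active = engagement_score({"liked": 1})
```

The design point the sketch captures is responsiveness: because passive cues arrive with every video, the system can personalize a brand-new user’s feed long before they click anything.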
The rapid advancement of machine learning enhances the appeal of social media platforms, enabling them to access extensive behavioural and private data from users. However, this advancement sparks worries that social media may be transformed into tailored environments that amplify certain political narratives rather than serving as forums for the open, free flow of information. This direction is deeply concerning and jeopardizes the fundamental principles of free speech and the diverse array of perspectives that should characterize social media. Despite its seductive allure, it is important to burst the bubble.
Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), 211–236. https://doi.org/10.1257/jep.31.2.211
Andersson, H. (2018, July 4). Social media apps are “deliberately” addictive to users. BBC News. https://www.bbc.com/news/technology-44640959
Bryant, L. (2020). The YouTube Algorithm and the Alt-Right Filter Bubble. Open Information Science, 4(1), 85-90. https://doi.org/10.1515/opis-2020-0007
Chaffey, D. (2023, June 7). Global Social Media Research Summary 2022. Smart Insights. https://www.smartinsights.com/social-media-marketing/social-media-strategy/new-global-social-media-research/
Furze, A. (2019, January 11). Why do some people believe the Earth is flat? Pursuit. https://pursuit.unimelb.edu.au/articles/why-do-some-people-believe-the-earth-is-flat
Gray, J. E. (2021). The geopolitics of “platforms”: the TikTok challenge. Internet Policy Review, 10(2). https://policyreview.info/articles/analysis/geopolitics-platforms-tiktok-challenge
How social media filter bubbles work. (2016). [Video]. YouTube. https://www.youtube.com/watch?v=doWZHFnVPQ8
iCapture Media. (2021). Woman scrolling social media [Video]. Pexels. https://www.pexels.com/video/woman-scrolling-social-media-10238038/
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220. https://doi.org/10.1037/1089-2680.2.2.175
O’Hara, K., & Hall, W. (2018, December 7). Four Internets: The Geopolitics of Digital Governance. Centre for International Governance Innovation. https://www.cigionline.org/publications/four-internets-geopolitics-digital-governance/
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Viking. https://hci.stanford.edu/courses/cs047n/readings/The_Filter_Bubble.pdf
Griffith Sciences. (2015, June 29). Big data – how much of your information is out there? Griffith Sciences Impact. https://impact.griffith.edu.au/how-much-of-your-information/
TikTok – Make Your Day. (n.d.). www.tiktok.com. https://www.tiktok.com/@bellapoarch/video/6862153058223197445
TikTok. (2020, June 18). How TikTok recommends videos #ForYou. Newsroom | TikTok. https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you
Zha, X. (2020, October 7). The unique power of TikTok’s algorithm. www.lowyinstitute.org. https://www.lowyinstitute.org/the-interpreter/unique-power-tiktok-s-algorithm