Invisible but Mighty: Are Filter Bubbles and Echo Chambers Responsible for Information Polarisation?

Research reveals Facebook echo chambers.
“Echo chambers” by Luke O’Brien is licensed under CC BY-NC 2.0.


Pick any of the big topics of the day (a referendum, climate change, or even Trump’s re-election) and wander online. What you are likely to find is radical polarisation: different groups of people living in different worlds, populated with utterly different facts (Nguyen, 2019).

Enter filter bubbles and echo chambers: powerful but often overlooked phenomena that influence not only what we see but also what we believe and trust.

In today’s digitally driven world, navigating a sea of information has become a challenge of unprecedented proportions, with discussions of pivotal topics inevitably encountering radical polarisation. This phenomenon is propelled by filter bubbles and echo chambers. As we dive into this intricate web of digital information, it is therefore imperative to understand how these forces operate and what far-reaching implications they hold for our collective consciousness.

Defining and Clarifying

Let’s begin by clarifying what ‘filter bubbles’ and ‘echo chambers’ refer to.

Filter bubbles, a potent metaphor coined by Pariser (2011), are personalised information ecosystems created by algorithms. These cocoon individuals in a world tailored to their preferences, shielding them from dissenting opinions. Social media platforms, employing advanced algorithms, play a pivotal role in facilitating this phenomenon: by carefully curating content based on user behaviour, they create an environment where users are exposed primarily to information that aligns with their existing beliefs, which in turn makes users more inclined to spend time on these sites (Mims, 2017).

But it doesn’t stop there. Echo chambers, as articulated by Sunstein (2017), take this concept further, suggesting the creation of bounded spaces that magnify messages and insulate them from rebuttal. The concept of echo chambers therefore suggests that the underlying issue may lie not in what people hear but in whom they choose to believe. In echo chambers, insiders come to distrust everybody on the outside, creating a closed loop of information that reinforces pre-existing beliefs.


So, the next question is how do these ‘bubbles’ and ‘chambers’ form?

Take TikTok.

A close-up of the TikTok app on a smartphone
“TikTok II” by Focal Foto is licensed under CC BY-NC 2.0.

It’s 2 am. You are lying in bed, phone in hand, scouring the depths of TikTok. Post after post, video after video, eyes almost shut, you continue to scroll through hundreds of videos.

To us this seems mindless, almost habitual. Yet all the while, TikTok’s algorithm, a veritable mastermind, employs a multi-phase approach to create the perfect echo chamber.

Let’s break it down:

The first phase, monitoring app activity, starts from the moment a user downloads the app. Every like, comment, follow, and interaction is meticulously analysed to build a personalised ‘For You’ page. This page becomes a customised stream of content, carefully selected to align with all previous interactions.

The second phase delves into content information. In addition to user interactions, TikTok’s algorithm scrutinises the captions and hashtags associated with each post. This analysis significantly shapes the echo chamber in which individuals are embedded, particularly in a political context.

The third phase considers device settings. Factors such as location, language, device type and repeated viewership play a crucial role in determining the type of videos that populate a user’s ‘For You’ page. This level of customisation ensures that users are continuously exposed to content that not only aligns with their preferences but further confirms them, deepening their immersion in the chamber.
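The three phases above can be sketched as a toy recommender. This is an illustrative model only: TikTok’s real algorithm is proprietary, and every name, weight and rule here (the `score` function, the language boost, the hashtag counts) is a hypothetical stand-in for the kinds of signals described above.

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    hashtags: set       # phase 2: content information (captions/hashtags)
    language: str       # phase 3: a device/settings signal

@dataclass
class UserProfile:
    language: str = "en"
    liked_hashtags: dict = field(default_factory=dict)  # phase 1: interaction history

    def record_like(self, video: Video) -> None:
        # Phase 1: every like, comment and follow updates the profile.
        for tag in video.hashtags:
            self.liked_hashtags[tag] = self.liked_hashtags.get(tag, 0) + 1

def score(user: UserProfile, video: Video) -> float:
    # Phase 2: reward overlap between a video's hashtags and past likes.
    tag_score = sum(user.liked_hashtags.get(t, 0) for t in video.hashtags)
    # Phase 3: boost content that matches the device's language setting.
    language_boost = 1.5 if video.language == user.language else 1.0
    return tag_score * language_boost

def for_you_page(user: UserProfile, candidates: list, k: int = 3) -> list:
    # The 'For You' page: the top-k candidates by personalised score.
    return sorted(candidates, key=lambda v: score(user, v), reverse=True)[:k]
```

Even in this stripped-down sketch the feedback loop is visible: each like raises the scores of similar videos, which makes them more likely to be shown and liked again, narrowing the feed over time.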

Hence TikTok, characterised by its unique and astute algorithm, has emerged as a prominent example and formidable force in discussions about filter bubbles and echo chambers. It has evolved into a vital platform for political messaging and information consumption, particularly among younger generations. Politicians have swiftly recognised this potential, using the platform’s distinct features to bolster their influence and reinforce trust among like-minded followers. This marks a significant shift in how politicians engage with audiences, capitalising on the power of echo chambers (Valenzuela, 2022).

But are they responsible?

In the current digital age, the phenomenon of information polarisation has gained significant attention. The question that looms largest is: to what extent are filter bubbles and echo chambers responsible for this polarisation?

While the existence of filter bubbles and echo chambers is undeniable, their role in societal polarisation remains a topic of rigorous debate among scholars.

So, let’s break down the debate.

Filter Bubbles and Echo Chambers: A Closer Look

Advocates of the view that filter bubbles and echo chambers primarily drive information polarisation emphasise the transformative impact of digital algorithms on society’s information consumption. Eli Pariser, a prominent voice in this discourse, argues that personalised content algorithms cocoon users within an environment that reinforces existing beliefs, effectively sheltering individuals from diverse perspectives.

Pariser’s work, notably his book “The Filter Bubble: What the Internet Is Hiding from You” and his TED talk “Beware Online Filter Bubbles”, highlights how these algorithms subtly edit the web, so that users are exposed only to information that aligns with their preconceived notions. He contends that this isolation prevents meaningful connections and fosters passive consumption of information, likening it to ‘information junk food’.

Furthermore, he draws on Eric Schmidt’s acknowledgement that tailored content has become a dominant mode of consumption, blurring the line between unbiased information and curated content. Netflix researchers add that this customisation represents an ongoing struggle between one’s present and future self, perpetuating the cycle of reinforcement.

“Beware online filter bubbles” by Eli Pariser (2011)
As web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: We get trapped in a “filter bubble” and don’t get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.

Selective Exposure: The Psychological Underpinnings

Affirming Pariser’s perspective, Beam et al. (2018) delve into the psychological construct of selective exposure, asserting that individuals naturally seek information that affirms their existing beliefs. This inclination stems from the avoidance of cognitive dissonance, an innate protective mechanism that shields individuals from the discomfort of challenging established viewpoints. Algorithms employed by news aggregators and social media platforms capitalise on this tendency, perpetuating a cycle in which individuals are predominantly exposed to information that mirrors their existing beliefs.
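The selective-exposure dynamic can be illustrated with a minimal simulation. This is a sketch under strong assumptions (a one-dimensional belief axis from -1.0 to +1.0, a “pick the closest article” rule, and an arbitrary drift rate); none of these parameters come from Beam et al., they simply make the mechanism concrete.

```python
import random

def choose_article(belief: float, articles: list) -> float:
    # Selective exposure: the reader picks the article whose stance is
    # closest to their current belief, avoiding dissonant viewpoints.
    return min(articles, key=lambda a: abs(a - belief))

def simulate(belief: float, steps: int = 50, drift: float = 0.1,
             seed: int = 0) -> float:
    # Each step, ten articles appear with stances uniform on [-1, 1];
    # reading the congenial one nudges the belief toward its stance.
    rng = random.Random(seed)
    for _ in range(steps):
        articles = [rng.uniform(-1.0, 1.0) for _ in range(10)]
        chosen = choose_article(belief, articles)
        belief += drift * (chosen - belief)
    return belief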

Bruns’ Perspective: A Nuanced View

However, not everyone is convinced by Pariser. Axel Bruns, another prominent figure in this field, brings a nuanced perspective to the debate, cautioning against oversimplifying the issue. He contends that assigning exclusive blame to filter bubbles and echo chambers fails to account for the agency of individuals in shaping their own information environment. Bruns suggests that these phenomena serve as amplifiers rather than originators of polarisation, intensifying existing divisions rooted in socio-political rifts.

Bruns also highlights the complexity of the digital information ecosystem, where various components, including search engines, social media platforms and news aggregators, contribute to content exposure. While algorithms undoubtedly play a role, user agency in selecting sources cannot be overlooked. Moreover, cognitive biases like confirmation bias and motivated reasoning operate independently of algorithmic recommendations. Garrett (2009) challenges the binary view of selective exposure, emphasising how the mix of opinions within a news story influences an individual’s use of that information.

Historical Precedents: Echo Chambers Through Time

Bruns also draws critical attention to historical precedents, underscoring that polarisation and ideological divisions are not unique to the digital age. Partisan newspapers of the 18th and 19th centuries serve as powerful examples of echo chambers long before algorithms shaped our information landscape.

Popping the Bubble and Escaping the Chamber

Whether or not you are convinced that filter bubbles and echo chambers are responsible for society’s polarisation, it is undeniable that these mechanisms are ever-present. Countering their influence is therefore critical.

Importance of popping your bubble
“Bubble pop” by KansasJayhawk17 is licensed under CC BY-NC-ND 2.0.

Diversifying information sources:

To counter the influence of filter bubbles and echo chambers, actively seeking diverse information sources is paramount (Sunstein, 2017). This deliberate exposure to varying viewpoints challenges entrenched beliefs, leading to a more comprehensive understanding of complex issues, from politics to news. Such practices foster intellectual growth and dismantle the insularity of echo chambers (Pariser, 2011).

Media literacy and critical thinking:

Media literacy and critical thinking skills are also indispensable tools for navigating today’s information-rich digital landscape (Wineburg & McGrew, 2017). The ability to discern reliable sources, fact-check information and critically analyse content is imperative. Educational initiatives and awareness campaigns promoting media literacy empower individuals to navigate the digital realm confidently. These skills collectively contribute to breaking down echo chambers and facilitating meaningful discourse (Grizzle et al., 2017).

“Challenge the Echo Chamber” by Adam Greenwood (2019)
In this talk, Adam Greenwood gives us an insight into what the future might look like, and the choices we can make right now to shape it.


In conclusion, echo chambers and filter bubbles subtly shape our information consumption in today’s digital landscape. While they contribute to polarisation, Bruns’ nuanced stance urges a broader perspective that considers the intricate interplay of elements in our information ecosystem. Filter bubbles and echo chambers are factors, but appreciating the complexity of the issue provides a more accurate assessment of polarisation. As we move forward, critical engagement with information is paramount, and dismantling division in favour of diverse perspectives becomes a shared responsibility.


Arguedas, A., Robertson, C., Fletcher, R., & Nielsen, R. (n.d.). Echo chambers, filter bubbles, and polarisation: A literature review. Reuters Institute for the Study of Journalism. Retrieved September 27, 2023.

Bruns, A. (n.d.). Echo chambers? Filter bubbles? The misleading metaphors that obscure t. Retrieved September 29, 2023, from

Bruns, A. (2019, July 31). Filter Bubbles and Echo Chambers: Debunking the Myths. DMRC at Large.

Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter Bubbles, Echo Chambers, and Online News Consumption. Public Opinion Quarterly, 80(S1), 298–320.

Gao, Y., Lui, F., & Gao, L. (n.d.). Echo chamber effects on short video platforms | Scientific Reports. Retrieved September 26, 2023, from

Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users1. Journal of Computer-Mediated Communication, 14(2), 265–285.

Grimes, D. R. (2017, December 4). Echo chambers are dangerous –  we must try to break free of our online bubbles. The Guardian.

Hu, T. (2017). Opportunities for Media and Information Literacy in the Middle East and North Africa. European Journal of Communication, 32(1), 79–81.

Interian, R., G. Marzo, R., Mendoza, I., & Ribeiro, C. C. (2023). Network polarization, filter bubbles, and echo chambers: An annotated review of measures and reduction methods. International Transactions in Operational Research, 30(6), 3122–3158.

Jackson, J., & @JaspJackson. (2017, January 8). Eli Pariser: Activist whose filter bubble warnings presaged Trump and Brexit. The Guardian.

Kitchens, B., Johnson, S. L., & Gray, P. (2020). Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption. MIS Quarterly, 44(4), 1619–1649.

Nguyen, C. T. (2019, September 11). The problem of living inside echo chambers. The Conversation.

Pariser, E. (n.d.). Eli Pariser: Beware online “filter bubbles” | TED Talk. Retrieved October 3, 2023, from

Pariser, E. (2011). The Filter Bubble: What The Internet Is Hiding From You. Penguin UK.

Parmelee, J. H., & Roman, N. (2020). Insta-echoes: Selective exposure and selective avoidance on Instagram. Telematics and Informatics, 52, 101432.

Pérez-Escolar, M. (n.d.). Hate Speech and Polarization in Participatory Society.

Rhodes, S. C. (2022). Filter Bubbles, Echo Chambers, and Fake News: How Social Media Conditions Individuals to Be Less Critical of Political Misinformation. Political Communication, 39(1), 1–22.

Sunstein, C. R. (2017). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press.

Valenzuela, B. (2022, January 10). The Weapon of the Century: Contemporary Politics Through the TikTok Algorithm. Harvard Political Review.

Wineburg, S., & McGrew, S. (2017). Lateral Reading: Reading Less and Learning More When Evaluating Digital Information (SSRN Scholarly Paper 3048994).