“Information Cocoons” – The lack of true information diversity on the Internet

Digital information background by Fishmen via torange and licensed for reuse under this Creative Commons Licence 

In today’s digital age, the Internet has become essential to people’s lives. Information online flows freely and is easy to access, and people assume that a diverse range of perspectives can be reached effortlessly (Pariser, 2011). It may therefore seem counterintuitive to talk about a lack of true information diversity. Yet the concept of “information cocoons” has emerged, highlighting an alarming trend: individuals increasingly find themselves surrounded by a limited range of perspectives and opinions.

The concept of “information cocoons” (ICs) refers to the phenomenon in which individuals are enclosed by a restricted range of information that matches their preexisting beliefs and viewpoints, resulting in a lack of genuine information diversity (Hou et al., 2023). Several factors drive this phenomenon, including media consolidation, algorithmic bias, and echo chambers. The resulting lack of exposure to diverse perspectives limits intellectual growth, deepens polarization, and helps misinformation spread.

What are “Information Cocoons”?

The concept was first proposed as a hypothesis by Harvard Law School professor Cass Sunstein in his 2006 book Infotopia: How Many Minds Produce Knowledge.

According to Sunstein (2017), when individuals selectively attend only to information that meets their specific needs or brings them pleasure, they gradually lose exposure to other topics and the opportunities to engage with them. Consequently, they become confined within an insular “cocoon.”

Matrix style binary code digital falling numbers blue background by Starline via Freepik and licensed for reuse under this Creative Commons Licence 

With the development of digital technology, platforms emerged that focus on providing users with personalized, targeted information services based on their interests. Social media platforms distribute content through algorithmic filtering and feedback processing, forming a unique content category for each user (Fu & Dai, 2023).

You have probably had the experience of watching a particular type of TikTok video and then seeing more and more similar videos. Over time, the information you receive from the Internet becomes increasingly uniform. This kind of personalized recommendation is everywhere: social media suggest connections through filtering mechanisms, search engines personalize results, music platforms recommend tracks based on your “music taste,” and news media offer customized services. When this happens, you are already living inside a cocoon without realizing it.
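To make this feedback loop concrete, here is a minimal Python sketch of how repeated engagement with one kind of content can narrow a feed. It is an illustration only, not any platform’s actual recommender: the category names, the weight update, and the sampling rule are invented assumptions.

```python
import random
from collections import Counter

# Toy feedback loop: the categories, weight update, and sampling rule are
# illustrative assumptions, not any real platform's algorithm.
CATEGORIES = ["sports", "cooking", "news", "music", "comedy"]

def recommend(weights, k=10):
    """Sample k videos; each category's chance is proportional to its weight."""
    return random.choices(CATEGORIES, weights=[weights[c] for c in CATEGORIES], k=k)

weights = {c: 1.0 for c in CATEGORIES}   # start with no preference
for day in range(30):
    for video in recommend(weights):
        if video == "sports":            # the user only finishes sports videos
            weights[video] += 0.5        # engagement feeds back into the weights

share = Counter(recommend(weights, k=1000))["sports"] / 1000
print(f"Share of sports videos in the feed after 30 days: {share:.0%}")
```

Even with this crude rule, the simulated feed converges toward the single category the user engages with, which is the “more and more similar videos” effect described above.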

Necessity and inevitability of “Information Cocoons”

“In the new world of the Internet, it is easier than ever to find people who agree with us. Indeed, it’s almost too easy, and when it happens, it may be a problem. Social media can make it simple for us to insulate ourselves inside our own echo chambers, hearing only from those who are like us and reinforcing what we believe.” (Sunstein, 2017)

One of the main reasons information cocoons exist is people’s own subjective choices. The Internet’s extensive reach gives everyone access to the information they prefer, and the sheer variety of available content forces users to choose. From a psychological perspective, individuals tend to seek out information that matches their existing beliefs, a tendency commonly referred to as confirmation bias (Klayman, 1994). In an era where information is readily available at our fingertips, people can curate their own reality and surround themselves with like-minded individuals and content. Research from the Pew Research Center (2014) indicates that individuals tend to opt for information aligned with their personal preferences, implying that people subjectively determine the knowledge they want. For example, suppose you follow the Lakers and the Warriors in the NBA on YouTube. As a Lakers fan, you will be recommended Lakers game analysis on your YouTube homepage, perhaps follow Lakers hashtags on Instagram and Twitter, and even join Lakers fan communities, while rarely seeing season analysis of other teams. This creates an echo chamber effect in which diverse perspectives are overlooked and alternative viewpoints are dismissed, such as information about Warriors games being subjectively filtered out.

Furthermore, the algorithmic nature of social media platforms plays a significant role in creating information cocoons. These platforms use complex algorithms to personalize the content shown to users based on their past behavior and interests (Zhang et al., 2023). Popular social media platforms and search engines design these algorithms to cater to our preferences and enhance the user experience by surfacing relevant content, but the same algorithms limit exposure to diverse opinions and viewpoints. Individuals end up in the company of others who share similar opinions, and may find themselves confined within an information bubble that only reinforces their preexisting beliefs.
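The same narrowing can be quantified as a loss of topical diversity. The toy example below (the catalogue, similarity scores, and use of Shannon entropy are assumptions made for illustration, not a description of any real platform’s ranking) sorts items purely by similarity to a user’s interest profile and compares the topic diversity of the resulting feed with that of the full catalogue.

```python
import math

# Hypothetical catalogue: (item, topic, similarity to the user's interest profile).
# All names and scores are invented for illustration.
catalogue = [
    ("lakers_highlights",  "basketball", 0.95),
    ("lakers_trade_rumor", "basketball", 0.90),
    ("warriors_preview",   "basketball", 0.40),
    ("election_explainer", "politics",   0.20),
    ("indie_playlist",     "music",      0.15),
    ("pasta_recipe",       "cooking",    0.10),
]

def topic_entropy(items):
    """Shannon entropy of the topic mix; lower means a narrower feed."""
    counts = {}
    for _, topic, _ in items:
        counts[topic] = counts.get(topic, 0) + 1
    total = len(items)
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# Rank purely by similarity to the profile and keep the top 3.
feed = sorted(catalogue, key=lambda item: item[2], reverse=True)[:3]

print("Personalized feed:", [name for name, _, _ in feed])
print("Topic entropy of full catalogue:", round(topic_entropy(catalogue), 2))
print("Topic entropy of personalized feed:", round(topic_entropy(feed), 2))
```

In this example the personalized feed collapses to a single topic (entropy 0), even though the catalogue covers several: relevance ranking alone is enough to filter alternative viewpoints out of sight.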

The Hazards of ICs

Data filter © Copyright freesvg and licensed for reuse under this Creative Commons Licence 

While a personalized “bubble” of information that aligns with one’s own beliefs may seem appealing, it can lead to a severe lack of diversity in perspectives and opinions. When individuals are exposed only to information that conforms to their beliefs, they become less likely to consider alternative perspectives or to participate in constructive discussion (Guess et al., 2018). This can erode empathy and understanding, as well as critical thinking skills.

Additionally, in the digital age, contact and connection between individuals and communities have been diminishing (Gossart, 2014). Coupled with the information cocoon, this can lead to a breakdown in social bonds, as communities become divided and individuals lose the ability to empathize with and understand perspectives different from their own.

Another hazard is the potential for manipulation and exploitation. Information cocoons and echo chambers help platforms understand their users in great detail, and when platforms prioritize engagement and content that aligns with individuals’ beliefs, these algorithms can be maliciously exploited for gain (Guess et al., 2018). This can include the spread of propaganda, fake news, and targeted disinformation campaigns. By leveraging the information cocoon and echo chamber effect, such actors can manipulate public opinion and steer societal discourse to serve their own agendas (Hou et al., 2023). A related issue is algorithmic discrimination: the algorithms platforms use to filter and display content can silence the voices of certain groups while amplifying others.

Break the “Cocoons”

Break on through the other side by Jasper via Flickr.

As discussed above, the “information cocooning” caused by personalized algorithms carries many potential hazards. A 2018 study by Matthew Gentzkow and Jesse M. Shapiro found that individuals who relied on a diverse set of news sources had a more accurate understanding of political facts than those who relied solely on ideologically aligned sources. This underscores that breaking free from the cocoon is crucial for personal growth, for fostering critical thinking skills, and for promoting a more inclusive society. Fortunately, users are gradually becoming aware of the information cocoon and are consciously trying to break out of it. By recognizing their own biases and questioning the information they receive, users can actively seek out diverse perspectives and challenge their own beliefs.

References:

Carr, P. R., Hoechsmann, M., & Thésée, G. (2018). Democracy 2.0: Media, Political Literacy, and Critical Engagement. BRILL.

Fishmen, V. (2012). Digital information background №173007. Retrieved from https://torange.biz/fx/digital-information-background-173007

Fu, S., & Dai, H. (2023). Promote identification or prevent expansion? The effect of uploader-viewer similarity on viewer inspiration in the context of short video community. Frontiers in Psychology, 14, 1120641. https://doi.org/10.3389/fpsyg.2023.1120641

Gossart, C. (2014). Can Digital Technologies Threaten Democracy by Creating Information Cocoons? In J. Bishop (Ed.), Transforming Politics and Policy in the Digital Age (pp. 145-154). IGI Global. https://doi.org/10.4018/978-1-4666-6038-0.ch010

Guess, A., Nyhan, B., Lyons, B., & Reifler, J. (2018). Avoiding the echo chamber about echo chambers. Knight Foundation, 2(1), 1-25.

Hou, L., Pan, X., Liu, K., Yang, Z., Liu, J., & Zhou, T. (2023). Information cocoons in online navigation. iScience, 26(1), 105893. https://doi.org/10.1016/j.isci.2022.105893

Klayman, J. (1994). Varieties of Confirmation Bias. Psychology of Learning and Motivation, 32, 385-418. https://doi.org/10.1016/S0079-7421(08)60315-1

Nyhan, B. (2014). Americans Don’t Live in Information Cocoons. Retrieved from https://www.nytimes.com/2014/10/25/upshot/americans-dont-live-in-information-cocoons.html

Pariser, E. (2011). The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think.

Sunstein, C. (2017). #Republic: Divided Democracy in the Age of Social Media.

Zhang, X., Cai, Y., Zhao, M., & Zhou, Y. (2023). Generation Mechanism of “Information Cocoons” of Network Users: An Evolutionary Game Approach. Systems, 11(8), 414. MDPI AG. http://dx.doi.org/10.3390/systems11080414