The inevitability and defects of the information cocoon in the digital age

Introduction

The concept of the “information cocoon” was first proposed by Professor Cass Sunstein of Harvard Law School in his 2006 book “Infotopia: How Many Minds Produce Knowledge”. The term describes how people, faced with the massive amount of information on the Internet, extract only what they want and reject everything else, gradually wrapping themselves in a cocoon of their own preferences (Sunstein, 2006). The rapid development of the Internet and social media in the digital age is seen as a technological revolution: it is now possible to connect with almost anyone who wishes to be connected. Yet the reality of social media increasingly moves in the opposite direction of that original vision. People are enclosed in cocoons of personalized information, and opposing views are automatically screened out by big-data algorithms. In the age of big data, information overload forces us to find efficient ways to filter information, and we tend to ignore content that differs too much from our own views (Gossart, 2014). The information cocoon does provide convenience, but its shortcomings are equally obvious: it reinforces pre-existing personal beliefs and makes it easy to mistake them for truth, and it fragments public discourse and undermines genuine democracy.

Why information cocoons exist

Digital technology allows us to filter out unwanted messages and contacts very efficiently. With that comes a risk: we may end up exchanging information only with like-minded people (Gossart, 2014). The sheer volume of information on the Internet creates information overload, which in turn drives the development of screening techniques and psychological protection mechanisms, and these contribute to the formation of information cocoons. An information cocoon means that the information people receive is filtered and adjusted to each person’s tastes and prevailing views (Bishop, 2014). Digital technology lets people decide, based on their interests, which information and which people they want to hear from. Applications push content according to the user’s preferences, so the range of information the user actually encounters is narrow, and big-data analysis infers what users are interested in from their behaviour. Digital technology is therefore not only a tool: it also shapes the economy and organizes people’s daily lives, and the Internet sets its own agenda for the world around people (David, 2018).
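To make this mechanism concrete, here is a minimal, hypothetical sketch in Python of interest-based filtering. The interest tags, feed items, and matching rule are invented purely for illustration; they do not describe any real platform’s data or algorithm.

# Hypothetical sketch: filtering a feed down to items that match the user's interests.
# All tags and items below are invented for illustration.

user_interests = {"technology", "finance"}

feed = [
    {"title": "New smartphone released", "tags": {"technology"}},
    {"title": "Election debate highlights", "tags": {"politics"}},
    {"title": "Stock market update", "tags": {"finance"}},
]

def matches_interests(item, interests):
    # Keep an item only if it shares at least one tag with the user's interests.
    return bool(item["tags"] & interests)

filtered_feed = [item for item in feed if matches_interests(item, user_interests)]
print([item["title"] for item in filtered_feed])
# Only the technology and finance items survive; the politics item never reaches the user.

Even this trivial rule already produces a cocoon: whatever falls outside the declared interests simply disappears from view.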

Disadvantages of the information cocoon

Information cocoons lead the public to reinforce their existing personal values and make it difficult for them to interact with groups holding different values. They limit the free flow of information and shrink the space for exchange in the ideological sphere. This leads to the problem of cognitive dissonance: people tend to accept their own way of thinking as truth by ignoring anything that diverges too far from their own point of view. In emergencies especially, information cocoons can hinder the flow and dissemination of emergency information (Liu & Zhou, 2022). In addition, information cocoons encourage platforms to apply algorithms irresponsibly, accelerating practices such as big-data-enabled price discrimination against loyal customers and other forms of algorithmic discrimination.

Safiya Noble exposed the discrimination against people of color that is embedded in search engines. She showed that when people searched for “black girls” on Google, a series of sexually explicit results appeared, whereas the results for “white girls” were completely different (Harrison, 2021).

Reinforcing personal perceptions and prejudices

Algorithms use vast amounts of macro- and micro-level data to influence people’s decisions on everything from which movies to watch to how banks assess personal creditworthiness. Search engines and social media platforms personalize recommendations based on what people have browsed and searched for in the past; Instagram, Facebook and other platforms recommend similar content to your feed based on the posts you have already viewed. As a result, our existing interests and biases are reinforced (Turner Lee et al., 2019).
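As a rough illustration, the following hypothetical Python sketch scores candidate posts by how much their tags overlap with a user’s viewing history. The history, candidates, and scoring rule are assumptions made up for this example, not Instagram’s or Facebook’s actual ranking.

# Hypothetical sketch: content-based recommendation from viewing history.
from collections import Counter

viewing_history = [
    {"tags": {"fitness", "diet"}},
    {"tags": {"fitness", "running"}},
]

candidates = [
    {"title": "10-minute home workout", "tags": {"fitness"}},
    {"title": "Local council election results", "tags": {"politics"}},
    {"title": "Marathon training plan", "tags": {"running", "fitness"}},
]

# Build a simple interest profile: how often each tag appears in the history.
profile = Counter(tag for post in viewing_history for tag in post["tags"])

def score(item):
    # Score a candidate by how strongly its tags overlap with the profile.
    return sum(profile[tag] for tag in item["tags"])

recommended = sorted(candidates, key=score, reverse=True)
print([(item["title"], score(item)) for item in recommended])
# Fitness content ranks first; the politics item ranks last and may never be shown.

Because the profile is built only from past behaviour, the ranking can never surface interests the user has not already displayed.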


Video: Eli Pariser, “Beware online ‘filter bubbles’”, TED2011, March 2011.

The inevitability of the information cocoon

Network technology and platforms are important conditions for the growth of information cocoons, but in the digital age users themselves must also make trade-offs. Personal subjective choice is a necessary cause of information cocooning: users tend to choose information that matches their existing beliefs, and algorithms selectively provide content that matches those interests, thereby limiting exposure to opposing perspectives (Pariser, 2011). The sketch below illustrates this feedback loop in miniature.
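The cumulative effect can be shown with a small, entirely hypothetical simulation in Python: each round the system recommends the items whose topics the user’s profile favours most, and whatever is consumed feeds back into the profile. The topics, counts, and rules are invented for illustration only.

# Hypothetical simulation of the recommendation feedback loop behind an information cocoon.
from collections import Counter
import random

topics = ["sports", "politics", "science", "entertainment"]
profile = Counter({"sports": 3, "politics": 1})  # a slight initial preference

random.seed(0)  # make the illustration reproducible
for round_number in range(5):
    # Candidate pool: 20 new items, each tagged with a random topic.
    candidates = [random.choice(topics) for _ in range(20)]
    # Recommend the five candidates whose topics the profile currently favours most.
    recommended = sorted(candidates, key=lambda topic: profile[topic], reverse=True)[:5]
    # The user consumes what is recommended, which reinforces the profile.
    profile.update(recommended)
    print(f"round {round_number}: recommended topics = {sorted(set(recommended))}")
# After a few rounds the recommendations are dominated by the initially preferred topics,
# even though the candidate pool itself stays diverse.

The user’s slight initial lean toward one topic is amplified round after round, which is exactly the narrowing of exposure that Pariser and Sunstein describe.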

In his TED talk, Eli Pariser introduced the concept of the “filter bubble” and argued that, by changing what we read and watch online, filter bubbles could fragment public discourse. Google search is an obvious example: two people searching for the same thing at the same time may get very different results. One engineer told Pariser that even when you are signed out of your account, Google uses 57 signals, everything from the model of device and the browser you use to your location, to personalize your query results. Yet it is difficult for people to notice how their own results differ from anyone else’s.

Beyond Google and Facebook, such filter bubbles have spread across the Internet. A filter bubble is the unique information world the Internet customizes for each individual: the online world in which that person lives. The problem is that people cannot decide what information passes through the filter, and, more importantly, they cannot see what has been filtered out, the things the algorithms do not want them to see. These personalized filters mainly look at what people click on first, which makes a balanced information diet difficult to achieve. In a broadcast society, reviewers and editors controlled the flow of information; the Internet replaced that model and connected everyone, but things are no longer the same, because algorithms now create a new Internet for each person. And without adequate information, citizens cannot achieve effective democracy.

Effects on democracy

Technology that tailors information to our needs and disseminates only the content we find relevant may be one of the greatest threats to democratic freedoms. In a democratic society, people need to encounter opinions different from their own in order to fully develop their ideas; otherwise they risk falling into a spiral that merely reinforces existing views (Sunstein, 2002). The consequences of personalized information filtering therefore affect the health of democracy. When we are exposed only to news and opinions that confirm our existing biases, consensus fragments and compromise becomes harder. This polarization poses serious risks to democratic deliberation and decision-making.

Reference list

Bishop, J. (Ed.). (2014). Transforming politics and policy in the digital age. IGI Global.

https://ebookcentral-proquest-com.ezproxy.library.sydney.edu.au/lib/usyd/reader.action?docID=3312972&ppg=165

Liu, W., & Zhou, W. (2022). Research on solving path of negative effect of “information cocoon room” in emergency. Discrete Dynamics in Nature and Society, 2022, 1–12. https://doi.org/10.1155/2022/1326579

David, K. (2018). The Future Was So Delicious, I Ate It All. Wired, 26(10), 112.

https://link.gale.com/apps/doc/A555563379/ITOF?u=usyd&sid=bookmark-ITOF&xid=700c99eb

Gossart, C. (2014). Can digital technologies threaten democracy by creating information cocoons? In J. Bishop (Ed.), Transforming politics and policy in the digital age (pp. 145–154). IGI Global.

https://gossart.wp.imt.fr/files/2013/08/Gossart_V03.pdf

Sunstein, C. R. (2002). Republic.com. Princeton University Press.

https://jolt.law.harvard.edu/articles/pdf/v14/14HarvJLTech753.pdf

Turner Lee, N., Resnick, P., & Barton, G. (2019, May 22). Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms. Brookings. https://www.brookings.edu/articles/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/

Sunstein, C. R. (2006). Infotopia: How many minds produce knowledge. Oxford University Press.

https://ebookcentral-proquest-com.ezproxy.library.sydney.edu.au/lib/usyd/reader.action?docID=271677&ppg=2

Harrison, L. M. (2021). Algorithms of oppression: How search engines reinforce racism. College Student Affairs Journal, 39(1), 103–105.

http://ezproxy.library.usyd.edu.au/login?url=https://www-proquest-com.ezproxy.library.sydney.edu.au/scholarly-journals/algorithms-oppression-how-search-engines/docview/2504871702/se-2