The lack of diversity in information on the Internet: revealing its social and personal harms

With the continuous advance of technology, the Internet, once hailed as a beacon of information democracy, has become one of the indispensable products of the modern world. It was built on the promise of a steady, open flow of information, yet scholars now question the diversity and fairness of the information it delivers. Beyond the semi-monopolistic control of information by technology giants, the rise of individualism in democratic societies also appears to be narrowing what people receive. Although this phenomenon may seem harmless at the moment, it has profound consequences for society and individuals. In this blog post, we take a closer look at the lack of diversity on the Internet, exploring how it undermines social cohesion and promotes narrow-mindedness, and suggesting ways to avoid it.

Echo Chamber by Kevin Hodgson is marked with CC0 1.0 

Platform reliance on algorithms hinders the communication of diverse information

The most direct manifestation of the lack of information diversity is the narrowing of the information users receive, which stems from platforms' dependence on algorithms.

Algorithms typically filter and recommend content similar to what a user has already consumed, based on historical behavior, interests, and preferences, while screening out other, more diverse information. Over time, this programmed selection gives rise to the phenomena of the “information cocoon” and the “echo chamber”. The former means that people’s information field is habitually guided by their own interests, so that they encounter only the information they choose and find comfortable, while other perspectives are filtered out, and their lives become shackled in a cocoon-like “room” (Sunstein, 2006). This reduced exposure to different points of view leaves people surrounded by like-minded others who share only their beliefs and values, forming an “information echo chamber” (Möller et al., 2022). The formation of both phenomena is considered imperceptible, hard to avoid, and consequential: every independent choice, even the tiniest action a user takes on a platform, is captured and collected by that platform.
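In code terms, this “more of the same” dynamic can be illustrated with a toy sketch. All item names, topics, and the scoring rule below are hypothetical, for illustration only, and do not represent any real platform’s algorithm:

```python
# A minimal sketch of history-based filtering: each click shifts the
# user profile toward the clicked topic, so recommendations converge
# on one topic -- the feedback loop behind the "information cocoon".

from collections import Counter

# Hypothetical catalog mapping item id -> topic.
CATALOG = {
    "politics_a": "politics", "politics_b": "politics",
    "sports_a": "sports", "sports_b": "sports",
    "science_a": "science", "science_b": "science",
}

def recommend(history, k=2):
    """Rank unseen items by how often their topic appears in the click history."""
    topic_counts = Counter(CATALOG[item] for item in history)
    unseen = [item for item in CATALOG if item not in history]
    # Items from frequently clicked topics rank first: "more of the same".
    return sorted(unseen, key=lambda item: -topic_counts[CATALOG[item]])[:k]

history = ["politics_a"]      # a single click on a politics item...
print(recommend(history))     # ...already pushes more politics to the top
```

A single interaction is enough to tilt the ranking; with every further click on the recommended items, the tilt compounds, which is the mechanism the “cocoon” metaphor describes.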

Conditions for the formation of “echo chambers” and “information cocoons”

Driven by users’ subjective choices

When people acquire information, they are often influenced by their own psychological preferences and cognitive biases, so they choose information that meets their expectations and needs while avoiding information that contradicts them or causes discomfort. However, if each person selects only the news he or she likes, then each person’s world becomes what he or she wants it to be rather than what it is (Sunstein, 2006). This selectivity is also seen as inevitable: precisely because information is so abundant, free, and easily accessible, the public has to make choices and trade-offs in the face of countless news items (Sunstein, 2006).

Platform promotion

To keep users on the platform and consolidate the relationship with them, platforms continuously present content that users are interested in and provide profitable, convenient information services. As Karpf (2018) notes, the Internet is constantly mutating, yielding to commercial interests and pressures. The personalized recommendation and customization functions introduced by platforms serve, on the one hand, more precise advertising, which drives transactions and earns commissions. For example, when a user searches for beauty posts on Instagram, the platform keeps pushing similar advertising content, and the user’s own interest guides them, step by step, toward paying. In addition, it has been argued that algorithmic collection of user information and preferences does not violate personal privacy, while also helping platforms avoid certain forms of censorship (Rainie, 2017). As a result, platforms grow ever more dependent on algorithms, restricted information reception becomes more common and more harmful, and policy restrictions become necessary.

The harms of limited dissemination of information

The proliferation of algorithmic content curation often confines users to an information bubble in which their views and opinions are constantly reinforced rather than challenged.

Exacerbating the formation of personal bias

Although the Internet allows different voices to exist, an overly narrow and targeted flow of information may exacerbate polarization and bias (Jasrotia, 2023). To some extent, this may turn the Internet into a breeding ground for extremism and anger (Möller et al., 2022). For individuals, repeated exposure to information consistent with their own views can breed a blindly confident mindset in which they take their biases for truth. Critical thinking is thereby hindered, and users lose the ability to question, analyze, and adjust their opinions in light of new information (Cherry, 2022). The consequence may be that groups or individuals instinctively reject others and deny views contrary to their own, leading to conflict. This blind confidence is exemplified by climate-change skeptics who, exposed only to information supporting the idea that climate change is a hoax, instinctively reject any contrary evidence, even scientific evidence.

The loss of social solidarity

The lack of diversity in Internet information recommendation harms society even more than it harms individuals, deepening social inequality and weakening social cohesion. Since ancient times, humans, as social animals, have supported and helped one another on the basis of shared experience. Yet this situation can leave people from different backgrounds unable to connect, interact, and empathize with one another, which is a worrying problem (Liu & Zhou, 2022). This lack of empathy and connection can breed misunderstanding within societies, and the resulting divisions can ultimately undermine the fabric of community and cooperation.

Canadian COVID-19 protesters by michael_swan is marked with CC0 1.0 

For example, during the COVID-19 pandemic, a great deal of disinformation and conspiracy theories circulated widely on the Internet. Some people were misled into trusting only certain false sources rather than the advice of scientific and public health experts, which increased health risks and divided public opinion (Tagliabue et al., 2020). At the same time, the distrust bred by extremely one-sided thinking cost society its unity in the face of disaster, and diminished empathy left people indifferent to one another and unwilling to help.

How to regulate reasonably and break out of information closure:

Given the harm that the lack of information diversity, driven by platforms’ over-reliance on algorithms, poses to individuals and society, both changes in individual behavior and reasonable government regulation are urgently needed.

  • Personal breakthrough:

From the perspective of individuals, users themselves are the main drivers of the information echo chamber and possess the ability and power to break out of these information limits (Möller et al., 2022). We must therefore actively seek out different perspectives and try to challenge our own assumptions: for example, by using critical thinking to put ourselves in others’ shoes and by seeking out conversations with people who hold different views and beliefs.

  • Platform collaboration:

Media platforms can counter algorithmic bias by ensuring that their algorithms are designed to promote diverse and balanced content. At the same time, different viewpoints and opinions should be displayed as widely as possible, promoting a more open exchange of ideas and preventing the formation of “information echo chambers” (Yuan & Wang, 2022).
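One way such a design can work, sketched here with hypothetical items, relevance scores, and a made-up penalty value rather than any real platform’s method, is to re-rank the feed so that topics already shown are demoted, instead of simply serving the highest-scoring (and usually most similar) items:

```python
# A minimal sketch of diversity-aware re-ranking: greedily pick the next
# item, subtracting a fixed penalty from any item whose topic has already
# appeared in the feed, so other viewpoints get surfaced.

def rerank_with_diversity(items, k=3, penalty=0.5):
    """items: list of (item_id, relevance_score, topic) tuples."""
    chosen, shown_topics = [], set()
    pool = list(items)
    while pool and len(chosen) < k:
        # Effective score drops by `penalty` if the topic is already shown.
        best = max(pool, key=lambda it: it[1] - (penalty if it[2] in shown_topics else 0))
        chosen.append(best[0])
        shown_topics.add(best[2])
        pool.remove(best)
    return chosen

candidates = [
    ("p1", 0.9, "politics"), ("p2", 0.8, "politics"),
    ("s1", 0.7, "science"),  ("e1", 0.6, "economy"),
]
# Pure relevance would show p1, p2, s1; with the penalty, the feed
# surfaces science and economy items alongside the top politics item.
print(rerank_with_diversity(candidates))
```

The design choice here is the trade-off knob: a penalty of zero reproduces the pure “more of the same” ranking, while a larger penalty forces the feed to spread across topics even at some cost to raw relevance.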

  • Government and social supervision:

First, governments should attempt to regulate and address social media platforms’ over-reliance on algorithms. Second, investing in public media literacy, such as teaching citizens how to critically evaluate information and identify biased content, will help diverse information flow and connect within society.

Conclusion

As the Internet continues to play a central role in our lives, the chronic lack of information diversity fuels extremism and undermines solidarity among members of society, making regulation and solutions urgent. Beyond users seeking out diverse views and platforms reducing their reliance on algorithms, governments and society should bring algorithms within the scope of privacy regulation and popularize knowledge about identifying biased content, so as to break out of this grim situation.


References:

Cherry, K. (2022, November 8). Dunning-Kruger effect: Why incompetent people think they are superior. Verywell Mind. https://www.verywellmind.com/an-overview-of-the-dunning-kruger-effect-4160740

Jasrotia, A. (2023, March 1). The dark side of algorithmic curation: Bias and manipulation in content recommendations. BookJelly. https://bookjelly.com/dark-side-of-algorithmic-curation/

Karpf, D. (2018, October). The future was so delicious, I ate it all. Wired, 26, 112. http://ezproxy.library.usyd.edu.au/login?url=https://www.proquest.com/magazines/future-was-so-delicious-i-ate-all/docview/2111753908/se-2

Liu, W., & Zhou, W. (2022). Research on solving path of negative effect of “information cocoon room” in emergency. Discrete Dynamics in Nature and Society, 2022, 1–12. https://doi.org/10.1155/2022/1326579 

Möller, J., Gregorio, G. D., Cohen, A., & Sharon, A. (2022, March 4). What are filter bubbles and digital echo chambers? Heinrich-Böll-Stiftung Tel Aviv – Israel. https://il.boell.org/en/2022/03/04/what-are-filter-bubbles-and-digital-echo-chambers

Rainie, L. (2017, February 8). Code-dependent: Pros and cons of the algorithm age. Pew Research Center: Internet, Science & Tech. https://www.pewresearch.org/internet/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/ 

Sunstein, C. R. (2006). Infotopia: How many minds produce knowledge. Oxford University Press.

Tagliabue, F., Galassi, L., & Mariani, P. (2020). The “pandemic” of disinformation in COVID-19. SN Comprehensive Clinical Medicine, 2(9), 1287–1289. https://doi.org/10.1007/s42399-020-00439-1

Yuan, X., & Wang, C. (2022). Research on the formation mechanism of information cocoon and individual differences among researchers based on information ecology theory. Frontiers in Psychology, 13. https://doi.org/10.3389/fpsyg.2022.1055798