
The sheer number of Internet users and the volume of information the Internet carries make online diversity critically important. Discussion on networked platforms marks the emergence of democratic debate and encourages more people to inform themselves and express their opinions freely (Popiel, 2018). Over time, however, the information people receive has grown poorer rather than richer. Under the influence of the algorithm-driven filter bubble, the public’s unequal and unbalanced reception of information produces a lack of diversity. This article will first illustrate the gender diversity crisis on the Internet through survey data, then explore the influence of racial diversity in the online environment through a case study, and finally examine how the Internet shapes the political information visible to users by analysing a well-known recent event in China. From these three angles, it summarises how the lack of diversity harms both society and individuals as the Internet continues to evolve.
How the “Filter Bubble” Was Discovered
Pariser (2011) discovered that when two people used Google to search for the same term, the resulting pages could be completely different, and that people with different political positions browsing the same news event might see entirely different coverage. For example, after the 2010 BP oil spill in the Gulf of Mexico, Pariser asked two friends who lived in the north and had similar education levels to search Google for related news. One received information about the Deepwater Horizon spill; the other received information about BP’s investments. Pariser concluded that in the Internet age, search engines can track users’ preferences at any time, filter out heterogeneous information, and create a personalised information world for each user. At the same time, however, they build a “wall” of information and ideas that places users inside an isolated bubble and hinders the exchange of diverse views (Pariser, 2011). Pariser named this phenomenon the “filter bubble”.

Male Chauvinism and Anti-LGBT Sentiment
According to survey data from the World Wide Web Foundation, women are still 21% less likely than men to use the Internet, and this gap can rise to 52% depending on a country’s level of development (Iglesias, 2020). Male Internet users have no factual support for the view that women are inferior to men; rather, their views on women’s use of the Internet are shaped by limited exposure to women’s online behaviour, which produces a peculiar inverted logic: they assume women are not capable of using the Internet as freely as they are (Gumbus & Grodzinsky, 2004). This makes their bubble impenetrable. The exclusion of women from the Internet causes substantial social and economic losses; in 2020 alone, countries around the world lost nearly $126 billion (World Wide Web Foundation, n.d.). Gradually, this oppression led to women’s fear of negative online interactions and changed the online behaviour of women as a group, forming a vicious circle (Kanai & McGrane, 2021). Only information reflecting a positive attitude towards the gender imbalance appears before men enclosed in their bubbles, while critical voices are either actively withheld by the women affected or passively filtered out by preference-driven algorithms.

Similarly, the Internet has exerted pressure on LGBT people, creating a “digital closet” that expresses opposition to and restraint of this group. On social media these sexual minorities are marginalised, with the result that some users are no longer willing to express their identity and instead conform to the majority. Users who join anti-LGBT groups live for long periods inside the information cocoon created by the filter bubble, which easily fosters unhealthy tendencies such as blind self-confidence and narrow-mindedness. Such users inevitably come to treat their own prejudice as truth and to reject the intrusion of other reasonable viewpoints, especially when they are validated by like-minded people (Bruns, 2019); their views gradually evolve into extreme thoughts and extreme behaviours (Monea, 2022). The result is a lack of diversity across society, as well as the oppression of individuals’ freedom to express their sexual orientation.
Racial Discrimination
The impact of the digital divide on the inclusiveness of the Internet is also reflected in issues of race. In Australia, some non-governmental online groups have created communities full of radical content targeting specific races, whose participants make racist remarks and interact online without breaking the law, and these participants have gone unregulated and unpunished (Australian Human Rights Commission, n.d.). In March 2020, the Zoom-bombing and harassment of online lecture rooms organised by schools likewise targeted students of specific countries and races (Elmer et al., 2020). The freedom of the Internet makes it an extremely easy tool for spreading racist ideas.
The channels through which personal online racism circulates are mainly loosely constrained online spaces such as the forums, chat rooms, blogs, and videos mentioned above. The Internet thus spreads racism not only through static symbols such as text and images but also through downloadable interactive content such as videos and music (Klein, 2017); as a medium it is almost as rich as face-to-face communication. Racism denies a specific group of people their rightful place in society, and this negative attitude pushes discriminators and the discriminated into confrontation. Affected individuals are likely to become radicalised because they feel threatened or disgusted (Klein, 2017). This phenomenon, which endangers social balance and personal reputations, is undoubtedly a product of the lack of tolerance and diversity on the Internet.
Unbalanced Political Information

The objects, contents, methods, and effects of political communication are all contained within the framework of the algorithm, which acts as an auxiliary force of political communication. After gauging public opinion and collecting data-driven political information, the algorithm filters and selects the political information it pushes to users (Bellanova, 2017). The risk of algorithmic transmission is that one-sided information delivery may lead political audiences to blind acceptance and network users to political paranoia. Take China’s Internet censorship as an example: on the political information boards published by official government accounts, comments containing political discussion or visions of human rights are strictly controlled.
A protest in Hong Kong in 2020 attracted wide public attention. According to a report in The Guardian, in the months after the protest the Chinese government’s interference with residents’ Internet access included, but was not limited to, deleting protest-related results from search engines, tracking and monitoring the online behaviour of users in the Hong Kong administrative region, and pursuing legal responsibility against publishers of statements defined as harmful information about the protests, which caused dissatisfaction among the public. The protest was widely characterised as a challenge to national sovereignty and an act of splitting the country (Kuo, 2020). People cannot discern whether other reasons lay behind the protest. The public’s calls for information are drowned out by this large-scale regulatory trend, and inside the filter bubble people can only form opinions according to the official definition; they have no way of knowing how the protesters came to feel that their human rights were so suppressed that such radical action was warranted. Controlling the direction of politics by controlling individuals’ information sources may distort their understanding of the subject of communication (Bellanova, 2017), which is undoubtedly harmful to both society and individuals.
Conclusion
To sum up, the filter bubble controls the information that network users receive, which negatively affects the diversity of the Internet. Its manifestations include giving men more opportunities than women to access the Internet and marginalising sexual minorities; enabling the spread of racial hatred; and, to some extent, constraining the freedom of the public’s political thought.
Reference List
Bellanova, R. (2017). Digital, politics, and algorithms: Governing digital data through the lens of data protection. European Journal of Social Theory, 20(3), 329–347. https://doi.org/10.1177/1368431016679167
Bruns, A. (2019). Filter bubble. Internet Policy Review, 8(4). https://doi.org/10.14763/2019.4.1426
Iglesias, C. (2020). The gender gap in internet access: Using a women-centred method. World Wide Web Foundation. Retrieved October 12, 2022, from https://webfoundation.org/2020/03/the-gender-gap-in-internet-access-using-a-women-centred-method/
Costs of Exclusion Report. (n.d.). World Wide Web Foundation. Retrieved October 14, 2022, from https://webfoundation.org/research/costs-of-exclusion-report/
Deepwater Horizon – BP Gulf of Mexico Oil Spill. (2022). US EPA. Retrieved October 12, 2022, from https://www.epa.gov/enforcement/deepwater-horizon-bp-gulf-mexico-oil-spill
Elmer, G., Burton, A. G., & Neville, S. J. (2020, June 10). Zoom-bombings disrupt online events with racist and misogynist attacks. The Conversation. Retrieved October 15, 2022, from https://theconversation.com/zoom-bombings-disrupt-online-events-with-racist-and-misogynist-attacks-138389
Examples of Racist Material on the Internet | Australian Human Rights Commission. (n.d.). Retrieved October 15, 2022, from https://humanrights.gov.au/our-work/publications/examples-racist-material-internet
Gillespie, T. (2014). The relevance of algorithms. In Media technologies. MIT Press.
Gumbus, A., & Grodzinsky, F. (2004). Gender bias in internet employment: A study of career advancement opportunities for women in the field of ICT. Journal of Information, Communication & Ethics in Society (Online), 2(3), 133–142. https://doi.org/10.1108/14779960480000248
Haim, M., Graefe, A., & Brosius, H.-B. (2018). Burst of the Filter Bubble?: Effects of personalization on the diversity of Google News. Digital Journalism, 6(3), 330–343. https://doi.org/10.1080/21670811.2017.1338145
Kanai, A., & McGrane, C. (2021). Feminist filter bubbles: ambivalence, vigilance and labour. Information, Communication & Society, 24(15), 2307–2322. https://doi.org/10.1080/1369118X.2020.1760916
Klein, A. (2017). Fanaticism, Racism, and Rage Online Corrupting the Digital Sphere (1st ed. 2017.). Springer International Publishing. https://doi.org/10.1007/978-3-319-51424-6
Monea, A. (2022). The Digital Closet. The MIT Press. https://doi.org/10.7551/mitpress/12551.001.0001
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Viking.
Popiel, P. (2018). The tech lobby: Tracing the contours of new media elite lobbying power. Communication, Culture & Critique, 11(4), 566–585. https://doi.org/10.1093/ccc/tcy027