How the Lack of Diversity on the Internet Affects Society and Individuals: Social Inequality and the Spread of Fake News Through Filter Bubbles

“Group Hand Fist Bump – Credit to http://homedust.com/” by Homedust is licensed under CC BY 2.0.

Introduction

The Internet plays an indispensable role in our daily lives. It has changed the way we encode and decode information and has even shaped our worldview. However, there is a potential problem behind the screen: a lack of diversity. This blog discusses how the lack of diversity on the Internet affects society and individuals. It perpetuates social inequalities, filters information in a biased way, and isolates groups. To support this view, we will draw on the theories of filter bubbles, information cocoons, and fake news. We will also question the supposed neutrality of artificial intelligence, argue against the biases that limit the growth of diversity on the Internet, and propose solutions to the lack of diversity.

Marginal Groups

The lack of diversity aggravates social inequality. Because power and resources are unequally distributed on the Internet, some groups are marginalized while others enjoy more opportunities. This inequality extends to representation, employment opportunities, and education opportunities. Social media platforms and cyberspace are often criticized for the lack of diversity among their employees, which affects how content is created and moderated. Gender imbalances still exist in professional environments. For example, a study by the Pew Research Center found that many women working in science, technology, engineering, or mathematics (STEM) have experienced gender discrimination at work (Funk & Parker, 2018, p. 6). This has entrenched stereotypes in the workplace.

Although women account for 50% of all workers in STEM occupations in the United States, they are underrepresented in individual STEM occupational clusters, especially computing and engineering (Funk & Parker, 2018, p. 24). Because of this, women in STEM fields pay closer attention to gender differences (Funk & Parker, 2018, p. 55). These women are better able to assess gender diversity through first-hand experience. For technology companies in particular, gender diversity brings valuable perspectives that enrich the diversity of the Internet.

In addition, racial and ethnic minorities are underrepresented in technology companies. Among employees in STEM fields, Black workers worry that too little attention is paid to growing racial and ethnic diversity in their workplaces (Funk & Parker, 2018, p. 71). This is because Black employees experience high rates of discrimination in the workplace; they report that Black workers are often not treated fairly in hiring decisions or promotion opportunities. This lack of diversity therefore perpetuates bias in algorithms and content recommendations, which blocks access to different viewpoints.

Filter Bubbles Drown Out Diverse Thinking

One consequence of the lack of diversity on the Internet is the emergence of filter bubbles, which trap users in information cocoons. The concept of the filter bubble explains how personalized algorithms can confine individuals to an information cocoon customized to their preferences: individuals are exposed only to information consistent with their existing views, not to different ones. For example, Facebook hopes to give people at all levels a voice through counter-speech, on the theory that participation by ordinary users can correct prejudice more effectively than top-down censorship, and so it encourages active speech and debate (Napoli, 2018, p. 65). This gives everyone the right to a fair conversation. However, filter bubbles reduce the likelihood that Facebook's civic initiative will actually be met with factual rebuttals: because people prefer to state and discuss ideas already considered ‘correct’ on the platform, counter-arguments to those ‘correct’ ideas rarely appear. Filter bubbles are a direct contributor to the lack of diversity on the Internet, as algorithms prioritize content that reinforces a user's preexisting ideas, limiting diverse thinking.
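The reinforcement loop described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not any platform's actual algorithm: the articles, topics, and scoring rule are invented for the example. The recommender ranks candidate articles by how often the user has already clicked their topic, so early preferences compound into a one-sided feed.

```python
# A toy filter-bubble recommender (hypothetical data, not a real platform's code).
# Articles are scored by how often the user has already clicked their topic,
# so the feed converges on whatever the user engaged with first.
from collections import Counter

articles = [
    {"id": 1, "topic": "politics_left"},
    {"id": 2, "topic": "politics_right"},
    {"id": 3, "topic": "politics_left"},
    {"id": 4, "topic": "sports"},
    {"id": 5, "topic": "politics_left"},
]

def recommend(click_history, candidates, k=3):
    """Rank candidates by the frequency of their topic in the user's history."""
    topic_counts = Counter(a["topic"] for a in click_history)
    ranked = sorted(candidates, key=lambda a: topic_counts[a["topic"]], reverse=True)
    return ranked[:k]

# A user who clicked two left-leaning articles now sees only more of the same.
history = [articles[0], articles[2]]
feed = recommend(history, articles)
print([a["topic"] for a in feed])  # ['politics_left', 'politics_left', 'politics_left']
```

Even in this tiny sketch, the opposing viewpoint never surfaces in the top results: the personalization signal that makes the feed feel relevant is the same mechanism that narrows it.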

“Facebook Screenshot” by codemastersnake is licensed under CC BY 2.0.

How Fake News Slips Through Information Filters

The recent announcement that Lisa, the Thai member of South Korean girl group BLACKPINK, would perform at the Crazy Horse Paris sparked heated debate among netizens, especially on China's popular media platform Weibo. In a group photo, netizens noticed that Lisa's hips appeared much wider than before, especially compared with other members such as Jennie, and speculated that she had undergone buttock augmentation to please European and American markets. Some claimed this showed Lisa's dedication, while others thought it was a matter of camera angle. No one bothered to ask whether the claim was true; people cared only about the content. As a result, such statements became controversial on the Internet, allowing fake news to spread widely. Only when there is partisan bias in a person's filter bubble does the likelihood of fake news passing through that bubble increase; this is the empirical relationship between partisan bias and fake news (Napoli, 2018, p. 70). At the same time, the chance that legitimate news will offset fake news decreases. This isolation from opposing views can lead to polarization on media platforms.

王元爱音乐. (2023a). 爆Lisa疑似为疯马秀丰臀,通体雪白不遮羞,疑LV的VIP称收到门票 [Lisa rumored to have enhanced her hips for the Crazy Horse show; alleged LV VIPs claim to have received tickets] [Online image]. 手机网易网 (Www.163.com). https://m.163.com/dy/article/IF5KE8B005563QW8.html?spss=sps_sem

Neutrality of AI

Opponents tend to think that although artificial intelligence controls most content recommendation and filtering on Internet platforms, it is always neutral and does not deliberately favor any group. Platform algorithms are designed to enhance the user experience by analyzing patterns and serving users content that matches their interests. Google's search engine is a good example. Google strives to be an objective transmitter of information, helping users find what they are looking for (Bilić, 2016). In other words, Google's algorithm returns the most relevant results for users based on their search queries and browsing history. The official version of Google's Search Quality Rating Guidelines (SQRGs) provides insight into the company's curation activities and its construction of the user, maintaining technological neutrality and objectivity in structuring working relationships, algorithmic relevance, and information search on the web (Bilić, 2016). On this view, platform algorithms make no subjective judgments; they create a diverse and open online world, leaving all decisions to users, who can speak freely without limits on their creative comments about content and events.

AI biases from programmers 

This technical reflection of users' preferences seems harmless, but it hinders the development of diversity on the Internet. Algorithms are ultimately created and programmed by humans. Programmers make choices about the data used, the algorithms developed, and the objectives assigned to an AI system, and these choices can introduce bias into AI decision-making. For example, if historical data contains biases, if some groups are underrepresented, or if narratives are skewed, then artificial intelligence may inadvertently reinforce those biases, because people who hold privileged positions along lines of racism, misogyny, and ableism are often overrepresented in language-model training data (Bender et al., 2021). The prejudice that artificial intelligence learns from humans therefore limits the diversity of the network. The Splinternet phenomenon, the gradual fragmentation of the Internet into separate ecosystems, creates further challenges for content review and filtering; subgrouping is evident (e.g. male vs. female, or native vs. non-native categorization). As Benjamin (2019) argues, technology may inadvertently reproduce and reshape existing inequalities. This aggravates polarization on the Internet and immerses users in filter bubbles that offer no diversity of discourse.
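How biased historical data propagates into model behavior can be shown with a toy example. The data, groups, and labels below are entirely hypothetical, and the "model" is deliberately naive: it simply learns the majority outcome for each group, which is enough to reproduce whatever imbalance the training data contains.

```python
# A toy demonstration of bias inherited from training data (hypothetical records).
from collections import Counter, defaultdict

# Invented hiring history: the past data skews in favor of group "A",
# even if both groups are in fact equally qualified.
training = [
    ("A", "hired"), ("A", "hired"), ("A", "hired"), ("A", "rejected"),
    ("B", "rejected"), ("B", "rejected"), ("B", "hired"),
]

def train(examples):
    """Learn the majority label per group, a stand-in for a naive classifier."""
    by_group = defaultdict(Counter)
    for group, label in examples:
        by_group[group][label] += 1
    return {g: counts.most_common(1)[0][0] for g, counts in by_group.items()}

model = train(training)
print(model)  # {'A': 'hired', 'B': 'rejected'}
```

Nothing in the code is malicious, yet the learned rule systematically disadvantages group B because the training data did. Debiasing therefore has to happen at the level of data and objectives, not only in the code itself.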

Solutions to the lack of diversity on the internet

https://youtu.be/BRRNeBKwvNM?si=vECPhKBJtAkvDMM9

1. Be aware of our own prejudices and the biases of machines.

  • Correct the prejudices that creators embed in programs, and recognize our own role online in maintaining fairness.

2. Make sure diverse teams are developing the technology.

  • Teams should include people of different genders, races, and backgrounds, to keep stereotypes and subjective assumptions from entering Internet platforms.

3. Give AI systems diverse experiences to draw on.

  • The Internet was created by humans and will be improved by humans, who together can shape a more egalitarian and inclusive environment.

Conclusion

The lack of diversity on the Internet has led to persistent disparities in representation and access, isolating and marginalizing certain groups. It also encourages filter bubbles, limiting access to different opinions and preventing meaningful dialogue. Today's society needs more perspectives from which to look at the world. The lack of diversity restricts the Internet's development in several ways: it limits platform content and technological innovation; it increases social inequality by limiting the diversity of information; and it perpetuates unfair employment stereotypes. Building diversity on the Internet requires the combined efforts of technology, society, and each of us, to create an Internet era of equality, diversity, and win-win outcomes.

Reference List

Adams, R. (2018, June 11). Group Hand Fist Bump – Credit to http://homedust.com/. Flickr. https://www.flickr.com/photos/159630537@N08/41834948455/in/photostream/

Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? FAccT ’21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. https://doi.org/10.1145/3442188.3445922

Benjamin, R. (2019). Engineered inequity: Are robots racist? In Race after Technology: Abolitionist Tools for the New Jim Code. http://ebookcentral.proquest.com/lib/gmul-ebooks/detail.action?docID=5820427

Bilić, P. (2016). Search algorithms, hidden labour and information control. Big Data & Society, 3(1), 205395171665215. https://doi.org/10.1177/2053951716652159

Borgesius, F., Trilling, D., Möller, J., De Vreese, C., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1). https://doi.org/10.14763/2016.1.401

Center, N. G. S. F. (2018, November 3). NASA Goddard Hosts Young Women for STEM Girls Night In. Flickr. https://www.flickr.com/photos/35278629@N08/30830491637

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1

Funk, C., & Parker, K. (2018, January 9). Women and Men in STEM Often at Odds Over Workplace Equity. Vtechworks.lib.vt.edu. http://hdl.handle.net/10919/92671

Kumar, N. (2010, November 12). Facebook Screenshot. Flickr. https://www.flickr.com/photos/50623029@N07/5169004822

Napoli, P. M. (2018). What If More Speech Is No Longer the Solution? First Amendment Theory Meets Fake News and the Filter Bubble. Federal Communications Law Journal, 70(1), 55–105. https://link.gale.com/apps/doc/A539774158/AONE?u=usyd&sid=bookmark-AONE&xid=1f0a3a5c

Soh, J. (2023, September 26). Crazy Horse Paris preview? Blackpink’s Lisa raises eyebrows with raunchy dance. The Straits Times. https://www.straitstimes.com/life/entertainment/crazy-horse-paris-preview-blackpink-s-lisa-raises-eyebrows-with-raunchy-dance-on-instagram

TED. (2019). How to keep human bias out of AI | Kriti Sharma. In www.youtube.com. https://youtu.be/BRRNeBKwvNM?si=vECPhKBJtAkvDMM9

王元爱音乐. (2023, September 21). 曝Lisa疑为疯马秀丰臀,通体雪白不遮羞,疑LV的VIP称收到门票 [Lisa rumored to have enhanced her hips for the Crazy Horse show; alleged LV VIPs claim to have received tickets]. 手机网易网 (Www.163.com). https://m.163.com/dy/article/IF5KE8B005563QW8.html?spss=sps_sem