The lack of diversity in the development of the internet
The lack of diversity on the internet is a widely discussed topic, driven by the internet's rapid growth and the increasing use of algorithms. The most frequently studied forms of prejudice are gender discrimination (Noble, 2018), racial discrimination (Matamoros-Fernández, 2017), and socio-economic discrimination (Walsh, 2020). This essay primarily explores sexism on the internet, more specifically the effects of algorithms on gender discrimination and vice versa. It is important to note that all of these forms of prejudice hold equal importance and that their consequences overlap in many scenarios.
Before exploring the concept of algorithmic bias, it is important to step back and analyse the development of the internet and how it may have contributed to the lack of diversity present today. Lusoli and Turner argue that the idyllic goal of a shared consciousness meant letting go of the very bureaucracy that acted against prejudice. Most of the people responsible for developing the internet came from a similar culture: usually white, middle- or upper-class males. This “bro-culture” embedded itself in the early phases of Silicon Valley and remains prevalent today (Lusoli & Turner, 2021). A study conducted by Reveal and the Center for Employment Equity showed that 58.7% of executive positions in Silicon Valley were held by white men, with similar trends in other leadership roles (Rangarajan, 2018). Web 2.0 is built on the idea of open content sharing, but if that content is generated and promoted by a homogeneous group of people, there is bound to be a lack of diversity.

The role of algorithmic bias in discrimination
At the core of Web 2.0 lie algorithms. They provide companies with ways to profit from the attention economy, unrivalled efficiency, and an easy way to match open information to our preferences. However, these very algorithms can perpetuate stereotypes and amplify bias against minorities. The concept of an algorithm is inherently neutral, but anything designed by humans carries some form of bias. To create algorithms, programmers must generate and label data and select variables, which inevitably introduces bias, and a lack of diversity means limited perspectives. For an algorithm to produce a reasonably encompassing result, people from multiple backgrounds need to contribute their ideas on which inputs are relevant. Deloitte revealed that women make up only 20% of professionals in AI, giving rise to an overwhelming lack of variability in data (Hupfer et al., 2021). On top of that, the lack of women in leadership positions means that their input is not as highly regarded. Beyond the development of the algorithms themselves, bias can stem from “bad data”:
Unrepresentative/incomplete data: Gild created AI software to recruit programmers based on their resumes and their contributions to platforms like GitHub. However, it failed to consider that women often dedicate hours after work to unpaid care, leaving them less time to participate online. Additionally, many women present themselves as men on these platforms to bypass safety concerns. Consequently, the algorithm ranked women much lower than their male counterparts (Smith & Rustagi, 2021).
Specifying a solution that excludes certain groups: Amazon abandoned its AI recruiting tool in 2018 after it was found to be discriminatory. The algorithm screened resumes against 10 years' worth of past resumes, which were predominantly male due to the gender balance of the workplace at the time. The algorithm therefore penalised female resumes, since its reference data contained few of them to begin with. This lack of data caused the tool to overlook more important aspects such as skills and experience, which would have further reduced the number of women employed at the company (Dastin, 2018).
Prioritising certain information over others: The UN Women campaign #womenshould revealed sexism in Google’s search results. Autocomplete suggestions discriminated against women (“women cannot be trusted”, “women cannot drive”) or outright denied their rights (“women shouldn’t vote”) (UN Women, 2013). These long-held stereotypes were the prioritised search results simply because they were the most popular. Popular information earns more profit, draws attention to certain parts of the web, and satiates the audience by playing to their preconceptions (Halavais, 2019).
Historically embedded bias: In 2019, Genevieve Smith found that her husband’s credit line was 20 times greater than hers, despite her holding the same card with a better credit history. Algorithms used to calculate credit often draw on marital and gender history; however, these data are historically biased, as women faced severe discrimination in the past and were often unable to obtain formal credit documents (Smith & Rustagi, 2021).
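The “bad data” mechanisms above can be made concrete with a toy sketch. The snippet below is a hypothetical illustration, not a reconstruction of any real recruiting system: it trains a simple word-scoring model on an invented, skewed set of historical hiring decisions. Because the past hires in the data are mostly men, a word that merely marks a resume as female-associated receives a negative weight, and an otherwise identical resume scores lower.

```python
import math
from collections import Counter

# Hypothetical toy data (invented for illustration): past resumes and
# whether the candidate was hired. Because historical hires were mostly
# men, the female-associated word "womens" appears mainly in rejections.
historical = [
    ("python java leadership", True),
    ("python sql chess club", True),
    ("java leadership sql", True),
    ("python womens chess club", False),
    ("sql womens coding society", False),
    ("java python", False),
]

def train(data):
    """Learn a Laplace-smoothed log-odds weight for every word."""
    hired, rejected = Counter(), Counter()
    for text, was_hired in data:
        (hired if was_hired else rejected).update(text.split())
    vocab = set(hired) | set(rejected)
    return {w: math.log((hired[w] + 1) / (rejected[w] + 1)) for w in vocab}

def score(weights, text):
    """Sum the learned weights of the words in a resume."""
    return sum(weights.get(w, 0.0) for w in text.split())

weights = train(historical)

# Two candidates with identical qualifications; one resume contains a
# gendered marker and is ranked lower purely because of it.
male_like = score(weights, "python sql chess club")
female_like = score(weights, "python sql womens chess club")
print(male_like > female_like)  # True: the gendered word alone lowers the score
```

Note that no variable in the model names gender explicitly; the bias enters entirely through correlations in the historical labels, which is exactly the dynamic described in the Gild and Amazon examples.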

The effects of gender discrimination on societies and individuals
The consequences of gender inequality have an immense impact not only on societies but on individuals as well. Gender inequality may hinder women’s empowerment or set up a feedback loop that is difficult to reverse. By making certain information more findable and other information less discoverable, stereotypes that are already embedded in societal behaviour are further perpetuated (Halavais, 2019). A study conducted at NYU found that pre-existing gender inequalities are exacerbated by search algorithms, which can turn neutral terms into gender-charged ones. When participants were exposed to search results generated from data with low gender inequality, they made more “egalitarian judgments” about gender job preferences; search results reflecting high gender inequality led to certain jobs being stereotyped as specifically for men or for women (NYU, 2022). Not only can these stereotypes influence future hiring decisions, they can also prevent diversity in any industry, as women will not be supported.
Search results are calculated in a highly commercialised environment; they are then normalised by the sheer influence and popularity of the internet and come to be believed as factual. This is harmful because minorities often lack the power to speak up against discrimination (Noble, 2018). The free and abundant nature of information on the internet leads individuals and societies to believe that it is inclusive of many perspectives. This glaring misunderstanding may eventually lead to the erasure or derogatory treatment of minorities. For example, the absence of non-binary individuals in gender analytics produces results that ignore an important group of people, leading to an unfair allocation of resources and lower-quality services for these abandoned groups (Smith & Rustagi, 2021).
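The normalisation dynamic described above, in which popular results attract more attention and thereby become still more popular, can be sketched as a minimal simulation. The snippet below is an assumption-laden toy model, not a description of any real search engine: two results start with a small popularity gap, ranking is by accumulated clicks, and the top position receives roughly three times the attention.

```python
import random

random.seed(0)  # deterministic toy run

# Hypothetical starting point: a stereotyped result with a small early
# lead in clicks over a neutral one (invented numbers for illustration).
clicks = {"stereotyped result": 55, "neutral result": 45}

for _ in range(1000):
    # Rank results by accumulated clicks, most-clicked first.
    ranked = sorted(clicks, key=clicks.get, reverse=True)
    # Position bias: the top slot draws ~3x the attention of the second.
    chosen = random.choices(ranked, weights=[3, 1])[0]
    clicks[chosen] += 1

share = clicks["stereotyped result"] / sum(clicks.values())
print(share > 0.6)  # True: the early lead compounds into dominance
```

Even this crude loop shows how a modest initial edge, once fed back through popularity-based ranking, hardens into a near-permanent ordering; reversing it would require the lower result to win against heavily skewed odds.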
The lack of varied data may also result in increased physical harm. Car safety measures such as seatbelts and airbags are designed using crash-test dummies based on the male physique. Pregnant women, and women’s bodies in general, do not fit these dimensions, making women 47% more likely to be injured in a car accident (Niethammer, 2020). The healthcare industry is also infamous for its lack of data on women: medical studies are conducted on women less often, owing to pregnancy or hormonal changes. The resulting gaps in medical data have prevented women from being diagnosed properly and have even led to incorrect prescriptions.
It is clear that algorithms play a significant role in not only perpetuating but also creating gender stereotypes. The US-centric ideology on which the internet was built has resulted in the neglect of cultural diversity and nuance (Halavais, 2019). I believe that a deeper understanding of these algorithms, along with a stronger effort to promote diversity within them, is at the core of solving the issue of algorithmic bias. Simply changing the mindset of some individuals is not enough, however. It will be vital for groups from different backgrounds to come together and create guidelines that promote equality and inclusivity. This matters not only for the safety of societies and individuals, but also to give them a more knowledgeable approach to gathering information that is true to minorities’ experiences.

References:
Dastin, J. (2018, October 11). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
Hupfer, S., Mazumder, S., Bucaille, A., & Crossan, G. (2021, December 1). Women in the tech industry: Gaining ground, but facing new headwinds. Deloitte. https://www2.deloitte.com/us/en/insights/industry/technology/technology-media-and-telecom-predictions/2022/statistics-show-women-in-technology-are-facing-new-headwinds.html
Halavais, A. (2019). How search shaped and was shaped by the web. In N. Brügger, & I. Milligan The SAGE handbook of web history (pp. 242-255). SAGE Publications Ltd, https://dx.doi.org/
Lusoli, A., & Turner, F. (2021). “It’s an Ongoing Bromance”: Counterculture and Cyberculture in Silicon Valley—An Interview with Fred Turner. Journal of Management Inquiry, 30(2), 235–242. https://doi-org.ezproxy.library.sydney.edu.au/10.1177/1056492620941075
Matamoros-Fernández, A. (2017). Platformed racism: the mediation and circulation of an Australian race-based controversy on Twitter, Facebook and YouTube. Information, Communication & Society, 20(6), 930–946. https://doi.org/10.1080/1369118X.2017.1293130
Niethammer, C. (2020, March 2). AI Bias Could Put Women’s Lives At Risk – A Challenge For Regulators. Forbes. https://www.forbes.com/sites/carmenniethammer/2020/03/02/ai-bias-could-put-womens-lives-at-riska-challenge-for-regulators/?sh=adc8e1d534f2
NYU. (2022, July 12). Gender Bias in Search Algorithms Has Effect on Users, New Study Finds. https://www.nyu.edu/about/news-publications/news/2022/july/gender-bias-in-search-algorithms-has-effect-on-users–new-study-.
Smith, G., & Rustagi, I. (2021, March 31). When good algorithms go sexist: Why and how to advance AI gender equity. Stanford Social Innovation Review. https://ssir.org/articles/entry/when_good_algorithms_go_sexist_why_and_how_to_advance_ai_gender_equity
UN Women. (2013, October 21). UN Women ad series reveals widespread sexism. https://www.unwomen.org/en/news/stories/2013/10/women-should-ads
Walsh, M. (2020, October 22). Algorithms Are Making Economic Inequality Worse. Harvard Business Review. https://hbr.org/2020/10/algorithms-are-making-economic-inequality-worse