The Internet Repeats Itself: When will developers finally recognise how the industry’s lack of diversity imbues harm?

RE: 05 Jenni.

Q2: To what extent has a lack of diversity influenced the development of the internet? How does this lack of diversity harm societies and individuals?

It seems each day a new, shiny technological tool headlines our feeds and promises utopia. These incessant efforts to advance technology are symbolic of the culture underpinning the development of the internet, often collapsed into the widely known ‘Silicon Valley culture’.

Silicon Valley’s Cultish Culture

Engineering culture in Silicon Valley secured its dominance by producing high-quality products, integrating incentives into organisational structures, specialising processes and treating innovation as inseparable from its operations (Lecuyer, 2001). These characteristics propelled Silicon Valley’s success and made it the infamous global technology hub it is today (Lecuyer, 2001).

However, this capitalistic drive, which rewards innovation, prioritises what is good for business (Castells, 2011), and that has historically not always translated into what is good for society.

This essay argues that, through the lack of diversity among the groups leading the development of the internet, societies and individuals are subject to mediatisation as a growing, inescapable issue.

History repeats itself

Ada Lovelace, 1838
“Ada Lovelace, 1838” by Nefi is licensed under CC BY-NC-SA 2.0.

Silicon Valley is not our first glimpse of privileged power play in technology but rather, a modern manifestation of this precedent.

Dating back to the 1800s, the female scientist Ada Lovelace revolutionised the computer science field by laying the foundations for how computers developed. Yet scholars in the field believed the work of Lovelace was overstated, a view emblematic of the reductive, normative gender roles within the industry.

Eventually Lovelace was recognised as the first computer programmer, yet the issue of representation persists. Women account for only 28% of the science and engineering workforce (Nwafor, 2021), leaving older, white men further situated as the face of the science, technology, engineering, and mathematics (STEM) industries.

But the case of Lovelace is not problematic solely because it solidifies historical untruths, nor is diversity merely a matter of profitability. Rather, diversifying the STEM industries develops better-functioning technologies and propels social progress.

Ultimately, a lack of diversity is self-sabotaging to a truly well-functioning internet and goes against the fundamental motive of the New Communalists (Lusoli & Turner, 2021); how can the internet be a space ‘for the people’ when an overwhelming proportion of citizens’ participation is discounted and overlooked?


Growing intimacy of the internet

Jenkins’ idea of participatory culture (2006, as cited in Jenkins, 2014) illustrates how content alone does not produce meaning; rather, social semiotics are conceived alongside the affordances of the medium, a process referred to as ‘mediatisation’ (Couldry & Hepp, 2013).

Further, the trajectory of the pandemic propelled an already deepening mediatisation, which muddles the ability to pinpoint the root of the issue as developments advance. Thus, distinguishing between interactivity and participation is critical to understanding digital ethnography (Jenkins, 2014).

Therefore, without acknowledging and critiquing the hegemonies leading these developments, efforts at governance are merely decorative.


Gatekeeping employment

We see this embellishment of technological development in the recent introduction of Artificial Intelligence (AI) to assist workplace interview processes.

Job Interviews
“Job Interviews” by World Relief Spokane is licensed under CC BY-NC-ND 2.0.

These interview systems are built on norms instilled by their developers; therefore, atypical behaviours are posited as unfavourable rather than recognised for what they really are: different from the hegemony.

Moreover, one-way interviews are increasingly prevalent, favoured by firms for their heightened efficiency. Yet these systems, built by neurotypical developers, fail to consider how neurodivergent candidates are disadvantaged.

For example, automatic speech recognition software is biased towards men and younger adults (Guo et al., 2020). Further, people with disabilities that prevent them from speaking must place blind faith in the system’s output. Similarly, those who are hard of hearing are likely to face consequences when their speech is wrongly interpreted by recognition software (Guo et al., 2020).

The array of disabilities, and the unpredictable and disparate ways they manifest, counters the very binary operations of AI. Neurodivergent individuals are therefore disadvantaged by systems that were not designed for them: again, a failure to satisfactorily consider diversity in technological development.

Similarly, gendered bias also seeps into job recruitment, with both Google and Facebook job ads being directed towards men (Ifeoma, 2021), echoing normative, discriminatory gender roles within the workforce.


The problem with AI’s feedback system 

Issues with a lack of diversity also manifest in the functioning of Artificial Intelligence (AI).

Microsoft’s Tay was introduced as a Twitter chatbot to showcase AI’s self-learning capabilities, shaped by its interactions with others. Initial tweets of “humans are super cool” quickly turned misogynistic, racist and antisemitic after the bot learned and repeated ideologies uttered by toxic technoculture.

The now-private Tay Twitter account, meant to showcase the utopian imaginaries of AI, only highlighted the toxic manifestations of the hegemonies building such systems. In essence, Tay’s failure can be attributed to developers’ negligence in overlooking the need to regulate their AI’s exposure to hate speech (Ifeoma, 2021).


Racist facial recognition software

Other incidents of the lack of diversity failing marginalised groups are revealed in instances of discriminatory facial analysis software.

In one instance, a black woman named Joy Buolamwini was confronted with the invisible technological prejudices against black people. Buolamwini found generic facial recognition software was unable to detect her facial features until she wore a white mask. This is reminiscent of the 2009 failure of HP’s face-tracking feature to recognise black faces, and signals the technology industry’s failure to acknowledge these entrenched biases, leading to repeated offences.

Similarly, Flickr’s issue with auto-tagging, which falsely labelled a black person as an ape, reappeared when Google Photos labelled a black man as a gorilla, showcasing that these embedded biases do not only exist on public platforms but seep into apps for private consumption.

Evidently, these instances of auto-labelling and functional ineptitude fail black communities by solidifying racist stereotypes into their infrastructures and outcasting people of colour from using such software.

Rampant sexism exposed by GamerGate

Further, it is crucial to consider the role a lack of diversity plays in guiding normative online behaviour.

Gamergate is one recent exposition of how online behaviour induces offline consequences. This culture war illuminates the disempowerment that female journalists and gamers face, which has evolved from pernicious internet norms.

The gaming industry is largely male-dominated among both developers and users, providing greater opportunity for gendered echo chambers to proliferate. Valve’s platform Steam initially approved a game, “Rape Day”, whose objective was raping women. Gamifying sexual assault, an issue disproportionately affecting women, exemplifies how women’s experiences and struggles are ill reflected in internet development. Moreover, it reflects the apparent toxicity threaded throughout internet norms.


Why platforms must be proactive, not reactive

However, it is not enough to react to technical issues as they arise; rather, we must actively address the root of the problem to avoid regressing into old rituals. Such reactions are often performative, driven by the rising expectation that larger companies subscribe to politically correct behaviour.

“Google” by Carlos Luna is licensed under CC BY 2.0.

Google introduced an AI ethics council in response to critiques, yet appointed an openly anti-LGBT figure to it. Google’s negligence demonstrates how performative decisions motivated by appealing to expectations only produce haphazard, temporary solutions. In this instance, the issue was not the absence of a council, but the absence of representation in the decision-making process.

Until the technology industry addresses its biases, it is inevitable that marginalised communities such as LGBTQ+ people, women, people of colour and disabled groups will continue to be disillusioned.

Therefore, we posit the harm emerging from the development of the internet as a symptom of a systemic issue, a lack of diversity amongst elites, and thus it should not be reduced to a mere technological flaw.

Jasmin Ozolins (520406555) RE:05 Jenni.

Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0).


“Ada Lovelace, 1838” by Nefi is licensed under CC BY-NC-SA 2.0.

Bulao, J. (2022). How Fast Is Technology Advancing in 2022? Tech Jury.

Castells, Manuel. (2011). The Culture of the Internet. The Internet Galaxy: Reflections on the Internet, Business, and Society (pp. 36-63). Oxford University Press.

Clement, J. (2022). Distribution of video gamers in the United States from 2006 to 2021, by gender. Statista.

Clement, J. (2022). Distribution of game developers worldwide from 2014 to 2021, by gender. Statista.

Cohn, J. (2019). Google’s algorithms discriminate against women and people of colour. The Conversation.

Couldry, N., & Hepp, A. (2013). Conceptualizing Mediatization. Communication Theory, 23(3), 191-202.

Drage, R. (n.d.). How Humanity Underpins Business Sustainability and Profitability. My Business.

“Google” by Carlos Luna is licensed under CC BY 2.0.

Guo, A., Kamar, E., Wortman Vaughan, J., Wallach, H., &  Ringel Morris, M. (2020). Toward Fairness in AI for People with Disabilities: A Research Roadmap. ACM SIGACCESS Accessibility and Computing, 125, 1–1.

Helfrich, T. (2021). Diversity and Technology Have the Power to Boost Business Revenues. Entrepreneur.

Jenkins, H. (2014). Rethinking “Rethinking Convergence/Culture.” Cultural Studies, 28(2), 267–297.

“Job Interviews” by World Relief Spokane is licensed under CC BY-NC-ND 2.0.

Koeze, E., & Popper, N. (2020). The Virus Changed the Way We Internet. The New York Times.

Lecuyer, C. (2001). Making Silicon Valley: Engineering Culture, Innovation, and Industrial Growth, 1930–1970. Enterprise & Society, 2(4), 666–672.

Levin, S. (2019). Google scraps AI ethics council after backlash: ‘Back to the drawing board’. The Guardian.

Lusoli, A., & Turner, F. (2021). “It’s an Ongoing Bromance”: Counterculture and Cyberculture in Silicon Valley—An Interview with Fred Turner. Journal of Management Inquiry, 30(2), 235–242.

McSorley, S. (2021). Why is Diversity Important in Technology? We are Crew.

Mellor, G. [@geraldmellor]. (2016, March 26). “Tay” went from “humans are super cool” to full nazi in <24 hrs and I’m not at all concerned about… [Image attached above] [Tweet]. Twitter.

Morais, B. (2013). Ada Lovelace, the first tech visionary. The New Yorker.

Naughton, J. (2021). Can big tech ever be reined in? The Guardian.

NowThisNews. (2021, June 12). Gamergate: The Sexist Side of Fandom | Bad Influence [Video]. YouTube.

Nouri, S. (2021). Diversity And Inclusion In AI. Forbes.

Nwafor, I. E. (2021). AI ethical bias: a case for AI vigilantism (AIlantism) in shaping the regulation of AI. International Journal of Law and Information Technology, 4(1).

Olson, P. (2018). The Algorithm That Helped Google Translate Become Sexist. Forbes.

Pariser, E. (2018, December 19). How news feed algorithms supercharge confirmation bias | Eli Pariser | Big Think [Video]. YouTube.

Paul, K. (2019). ‘Disastrous’ lack of diversity in AI industry perpetuates bias, study finds. The Guardian.

Romano, A. (2021). What we still haven’t learned from Gamergate. Vox.

SG Analytics. (2022). Is Silicon Valley Still Dominating Global Innovation?

Stanberry, K., Anderson, J., & Raine, L. (2019). Experts Optimistic About the Next 50 Years of Digital Life. Pew Research.


Spar, B., et al. (2018). Global Recruiting Trends. LinkedIn Talent Solutions.

Statista Research Department. (2021). Number of forcible rape and sexual assault victims in the United States from 1993 to 2020, by sex. Statista.

TED. (2017, March 13). How I’m fighting bias in algorithms | Joy Buolamwini [Video]. YouTube.

TEDxTalks. (2021, July 2). How Algorithms Spread Human Bias | Corey Patrick White | TEDxOklahomaCity [Video]. YouTube.

Vincent, J. (2016). Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day. The Verge.

Winfield, H. (2021). Apple and Google still have an LGBTQ problem. Wired.

wpengine. (2021). Increasing Women’s Participation in STEM: A Deep-dive into the Industry with Developers Institute. Candlefox.

wzamen01. (2009, December 11). HP computers are racist [Video]. YouTube.
