Q2: To what extent has a lack of diversity influenced the development of the internet? How does this lack of diversity harm societies and individuals?
It seems each day a new shiny technological tool headlines our feeds and promises utopia. These incessant efforts to advance technology are emblematic of the culture underpinning the development of the internet, often collapsed into the widely known ‘Silicon Valley culture’.
Silicon Valley’s Cultish Culture
Engineering culture in Silicon Valley secured its dominance by producing high-quality products, integrating incentives into its organisational structures, specialising its processes and treating innovation as inseparable from its operations (Lecuyer, 2001). These characteristics underpinned Silicon Valley’s success and helped it become the famous global technology hub it is today (Lecuyer, 2001).
However, this capitalistic drive, which rewards innovation, prioritises what is good for business (Castells, 2011), and what is good for business has historically not always translated into what is good for society.
This essay argues that, because of the lack of diversity among the groups leading the development of the internet, societies and individuals are subject to mediatisation as a growing, inescapable issue.
History repeats itself

Silicon Valley is not our first glimpse of privileged power play in technology but rather, a modern manifestation of this precedent.
Dating back to the 1800s, the scientist Ada Lovelace revolutionised the computer science field by laying the foundations of how computers would develop. Yet scholars in the field long dismissed Lovelace’s work as overstated, a dismissal emblematic of the reductive, normative gender roles within the industry.
Lovelace was eventually recognised as the first computer programmer, yet the issue of representation persists. Women make up only 28% of the science and engineering workforce (Nwafor, 2021), leaving further space to situate older, white men as the face of the science, technology, engineering, and mathematics (STEM) industries.
But the case of Lovelace is not problematic only because it solidifies historical untruths, nor is diversity merely a question of profitability. Rather, diversifying STEM industries develops better-functioning technologies and propels social progress.
Ultimately, a lack of diversity is self-sabotaging to a truly well-functioning internet and runs against the fundamental motive of the New Communalists (Lusoli & Turner, 2021): how can the internet be a space ‘for the people’ when the participation of an overwhelming proportion of citizens is discounted and overlooked?
Growing intimacy of the internet
Jenkins’ idea of participatory culture (2006, as cited in Jenkins, 2014) illustrates how content alone does not produce meaning; rather, social semiotics are conceived alongside the affordances of the medium, a process referred to as ‘mediatisation’ (Couldry & Hepp, 2013).
Further, the pandemic propelled an already deepening mediatisation, which further muddles the ability to pinpoint the root of the issue as developments advance. Thus, distinguishing between interactivity and participation is critical to understanding digital ethnography (Jenkins, 2014).
Therefore, without acknowledging and critiquing the hegemonies leading these developments, efforts at governance are merely decorative.
Gatekeeping employment
We see this dynamic in the recent introduction of Artificial Intelligence (AI) to assist workplace interview processes.

These interview systems are built on norms instilled by their developers; therefore, atypical behaviour is positioned as unfavourable rather than as what it really is: different from the hegemonic norm.
Moreover, one-way interviews are increasingly prevalent and favoured by firms for their efficiency. Yet these systems, built by neurotypical developers, fail to consider how neurodivergent candidates are disadvantaged.
For example, automatic speech recognition software is biased towards men and younger adults (Guo et al., 2020). Further, people with disabilities that prevent them from speaking must place blind faith in the system’s output. Similarly, those who are hard of hearing are likely to face consequences when their speech is wrongly interpreted by such software (Guo et al., 2020).
The array of disabilities, and the unpredictable and disparate ways they manifest, counters the very binary operations of AI. Neurodivergent individuals are therefore disadvantaged by systems that were not designed for them: again, a failure to satisfactorily consider diversity in technological development.
Similarly, gendered bias also seeps into job recruitment, with both Google and Facebook directing job ads towards men (Nwafor, 2021), echoing normative, discriminatory gender roles within the workforce.
The problem with AI’s feedback system
Issues with a lack of diversity also manifest in the functioning of Artificial Intelligence (AI).
Microsoft’s Tay was introduced as a Twitter chatbot to showcase AI’s self-learning capabilities, shaped by its interactions with others. Initial tweets of “humans are super cool” quickly turned misogynistic, racist and antisemitic after the bot learned and repeated ideologies uttered by a toxic technoculture.
The now-private Tay Twitter account, meant to showcase the utopian imaginaries of AI, only highlighted the toxic manifestations of the hegemonies building such systems. In essence, Tay’s failure can be attributed to developers’ negligence in overlooking the need to regulate their AI’s hate speech (Nwafor, 2021).
"Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A
— gerry (@geraldmellor) March 24, 2016
Racist facial recognition software
Other incidents of the lack of diversity failing marginalised groups are revealed in instances of discriminatory facial analysis software.
In one instance, a black woman named Joy Buolamwini was confronted with technology’s invisible prejudices against black people: she found that generic facial recognition software was unable to detect her facial features until she wore a white mask. This is reminiscent of the inability of HP’s face-tracking feature to recognise black faces in 2009, and signals a failure of technology industries to acknowledge these entrenched biases, leading to repeated offences.
Similarly, Flickr’s auto-tagging issue, which falsely labelled a black person as an ape, reappeared when Google Photos labelled a black man a gorilla, showcasing that these embedded biases do not only exist on public platforms but seep into apps for private consumption.
Evidently, these instances of auto-labelling and functional ineptitude fail black communities by solidifying racist stereotypes into their infrastructures and excluding people of colour from using such software.
Rampant sexism exposed by GamerGate
Further, it is crucial to consider the role a lack of diversity plays in guiding normative online behaviour.
Gamergate is one recent exposition of how online behaviour induces offline consequences. This culture war illuminates the disempowerment that female journalists and gamers face, which has evolved from pernicious internet norms.
The gaming industry is largely male-dominated among both developers and users, which provides greater opportunity for gendered echo chambers to proliferate. Valve’s distribution platform Steam approved a game, “Rape Day”, whose objective was raping women. Gamifying sexual assault, an issue disproportionately affecting women, exemplifies how women’s experiences and struggles are ill reflected in internet development. Moreover, it reflects the toxicity threaded throughout internet norms.
Why platforms must be proactive, not reactive
However, it is not enough to react to technical issues as they arise; rather, we must actively address the root of the problem to avoid regressing into old rituals. Reacting is often performative, driven by the rising expectation that larger companies subscribe to politically correct behaviour.

Google introduced an AI ethics council in response to critiques, yet appointed an openly anti-LGBT leader to it. Google’s negligence demonstrates how performative decisions motivated by appearances produce only haphazard, temporary solutions. In this instance, the issue was not the absence of the council, but the absence of representation in the decision-making process.
Until the technology industry addresses its biases, it is inevitable that marginalised communities such as LGBTQ+ people, women, people of colour and disabled groups will continue to be disillusioned.
Therefore, we posit the harm emerging from the development of the internet as a symptom of a systemic issue, a lack of diversity amongst elites, which should not be reduced to a mere technological flaw.
Jasmin Ozolins (520406555) RE:05 Jenni.
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0).
References.
“Ada Lovelace, 1838” by Nefi is licensed under CC BY-NC-SA 2.0.
Bulao, J. (2022). How Fast Is Technology Advancing in 2022? Tech Jury.
https://techjury.net/blog/how-fast-is-technology-growing/#gref
Castells, M. (2011). The Culture of the Internet. In The Internet Galaxy: Reflections on the Internet, Business, and Society (pp. 36-63). Oxford University Press.
Clement, J. (2022). Distribution of video gamers in the United States from 2006 to 2021, by gender.
Statista. https://www.statista.com/statistics/232383/gender-split-of-us-computer-and-video-gamers/
Clement, J. (2022). Distribution of game developers worldwide from 2014 to 2021, by gender. Statista.
https://www.statista.com/statistics/453634/game-developer-gender-distribution-worldwide/
Cohn, J. (2019). Google’s algorithms discriminate against women and people of colour. The Conversation.
https://theconversation.com/googles-algorithms-discriminate-against-women-and-people-of-colour-112516
Couldry, N., & Hepp, A. (2013). Conceptualizing Mediatization. Communication Theory, 23(3), 191-202. https://doi.org/10.1111/comt.12019
Drage, R. (n.d.). How Humanity Underpins Business Sustainability and Profitability. My Business.
https://www.mybusiness.com.au/how-we-help/be-more-efficient/work-smarter/how-humanity-underpins-business-sustainability-profitability
“Google” by Carlos Luna is licensed under CC BY 2.0.
Guo, A., Kamar, E., Wortman Vaughan, J., Wallach, H., & Ringel Morris, M. (2020). Toward Fairness in AI for People with Disabilities: A Research Roadmap. ACM SIGACCESS Accessibility and Computing, 125, 1–1. https://doi.org/10.1145/3386296.3386298
Helfrich, T. (2021). Diversity and Technology Have the Power to Boost Business Revenues. Entrepreneur.
https://www.entrepreneur.com/growing-a-business/diversity-and-technology-have-the-power-to-boost-business/396905
Jenkins, H. (2014). Rethinking “Rethinking Convergence/Culture.” Cultural Studies, 28(2), 267–297. https://doi.org/10.1080/09502386.2013.801579
“Job Interviews” by World Relief Spokane is licensed under CC BY-NC-ND 2.0.
Koeze, E., & Popper, N. (2020). The Virus Changed the Way We Internet. The New York Times. https://www.nytimes.com/interactive/2020/04/07/technology/coronavirus-internet-use.html
Lecuyer, C. (2001). Making Silicon Valley: Engineering Culture, Innovation, and Industrial Growth, 1930–1970. Enterprise & Society, 2(4), 666–672. https://doi.org/10.1017/S1467222700005310
Levin, S. (2019). Google scraps AI ethics council after backlash: ‘Back to the drawing board’. The Guardian. https://www.theguardian.com/technology/2019/apr/04/google-ai-ethics-council-backlash
Lusoli, A., & Turner, F. (2021). “It’s an Ongoing Bromance”: Counterculture and Cyberculture in Silicon Valley—An Interview with Fred Turner. Journal of Management Inquiry, 30(2), 235–242. https://doi.org/10.1177/1056492620941075
McSorley, S. (2021). Why is Diversity Important in Technology? We are Crew. https://wearecrew.io/blog/diversity-in-tech/
Mellor, G. [@geraldmellor]. (2016, March 26). “Tay” went from “humans are super cool” to full nazi in <24 hrs and I’m not at all concerned about… [Image attached above] [Tweet]. Twitter. https://twitter.com/geraldmellor/status/712880710328139776
Morais, B. (2013). Ada Lovelace, the first tech visionary. The New Yorker. https://www.newyorker.com/tech/annals-of-technology/ada-lovelace-the-first-tech-visionary
Naughton, J. (2021). Can big tech ever be reined in? The Guardian.
https://www.theguardian.com/technology/2021/nov/21/can-big-tech-ever-be-reined-in
NowThisNews. (2021, June 12). Gamergate: The Sexist Side of Fandom | Bad Influence [Video]. YouTube. https://www.youtube.com/watch?v=33u2JExlGwQ
Nouri, S. (2021). Diversity And Inclusion In AI. Forbes.
https://www.forbes.com/sites/forbestechcouncil/2021/03/16/diversity-and-inclusion-in-ai/?sh=783f10be5823
Nwafor, I. E. (2021). AI ethical bias: a case for AI vigilantism (AIlantism) in shaping the regulation of AI. International Journal of Law and Information Technology, 4(1). https://doi.org/10.1093/ijlit/eaab008
Olson, P. (2018). The Algorithm That Helped Google Translate Become Sexist. Forbes. https://www.forbes.com/sites/parmyolson/2018/02/15/the-algorithm-that-helped-google-translate-become-sexist/?sh=25b845cd7daa
Pariser, E. (2018, December 19). How news feed algorithms supercharge confirmation bias | Eli Pariser | Big Think [Video]. YouTube. https://www.youtube.com/watch?v=prx9bxzns3g
Paul, K. (2019). ‘Disastrous’ lack of diversity in AI industry perpetuates bias, study finds. The Guardian. https://www.theguardian.com/technology/2019/apr/16/artificial-intelligence-lack-diversity-new-york-university-study
Romano, A. (2021). What we still haven’t learned from Gamergate. Vox. https://www.vox.com/culture/2020/1/20/20808875/gamergate-lessons-cultural-impact-changes-harassment-laws
SG Analytics. (2022). Is Silicon Valley Still Dominating Global Innovation?
https://www.sganalytics.com/blog/is-silicon-valley-still-dominating-global-innovation-/
Stanberry, K., Anderson, J., & Rainie, L. (2019). Experts Optimistic About the Next 50 Years of Digital Life. Pew Research.
Spar, B., et al. (2018). Global Recruiting Trends. LinkedIn Talent Solutions. https://business.linkedin.com/content/dam/me/business/en-us/talent-solutions/resources/pdfs/linkedin-global-recruiting-trends-2018-en-us2.pdf
Statista Research Department. (2021). Number of forcible rape and sexual assault victims in the United States from 1993 to 2020, by sex. Statista. https://www.statista.com/statistics/251923/usa–reported-forcible-rape-cases-by-gender/
TED. (2017, March 13). How I’m fighting bias in algorithms | Joy Buolamwini [Video]. YouTube. https://www.youtube.com/watch?v=UG_X_7g63rY&ab_channel=TED
TEDxTalks. (2021, July 2). How Algorithms Spread Human Bias | Corey Patrick White | TEDxOklahomaCity [Video]. YouTube.
Vincent, J. (2016). Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day. The Verge. https://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
Winfield, H. (2021). Apple and Google still have an LGBTQ problem. Wired. https://www.wired.co.uk/article/apple-google-lgbtq-apps
wpengine. (2021). Increasing Women’s Participation in STEM: A Deep-dive into the Industry with Developers Institute. Candlefox. https://www.candlefox.com/blog/deep-dive-into-women-in-stem-with-developers-institute/
wzamen01. (2009, December 11). HP computers are racist [Video]. YouTube. https://www.youtube.com/watch?v=t4DT3tQqgRM