
Stereotypes originating in Silicon Valley, that white males or “geeks” dominate the tech world and its development, have persisted since the industry’s early days (Massanari, 2017). “The revenge fantasies of Silicon Valley founders, in which the so called geek or nerd gains power and moves from a marginal position to dominate their competitors, almost always valorises a white man” (Fan, 2014, in Massanari, 2017, p. 332). Technological development keeps introducing advanced innovations such as artificial intelligence (AI), machine learning, and new platforms, which have brought anxieties about automation and remind us to stay aware of the biases of Silicon Valley’s engineers (Zajko, 2022).
As the internet has developed, we encounter these biases across platforms, search engines, and even trusted content circulating online. Unfortunately, they continue to harm societies and the individuals who place their trust in the internet (Zajko, 2022). Today we need to be critical of the online spaces that have made our lives easier, and not be blinded by the range of free services they offer, a tactic these companies use to dominate the market that reveals their monopolistic behaviour.

Google search results

The internet depends on algorithms that emphasise popular and recent content (Massanari, 2017), with minimal effort to recognise what that content is or its effect on particular groups in society. Machine learning allows algorithms to be trained on datasets that reflect human judgement and priorities (Zajko, 2022), and it is predicated on the values of the most powerful institutions in society and of those who control them: the men of Silicon Valley (Noble, 2018). This bias extends beyond individual platforms’ algorithms and into an “authoritative mechanism that is trusted and highly used by the public: Google” (Noble, 2018, p. 32).
Safiya Noble conducted an experiment with the Google search engine and retrieved pornographic images simply by typing “Black Girls” (Noble, 2018). Noble argues these problematic representations are not new: “Google’s dominant narratives reflect hegemonic frameworks often resisted by women and people of colour” (p. 24). Google’s search engine prioritises the results that attract the most clicks from users, together with the commercial processes of paid advertising (Noble, 2018). The results retrieved for the Google search “Black Girls” are a “reflection of old media traditions into new media architecture” (p. 24), carrying into the present the ways women of colour have long been sexualised and objectified in society.
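To make the mechanism Noble describes concrete, the short Python sketch below simulates a simplified click-driven ranking loop. It is an illustrative toy using assumed numbers, not Google’s actual ranking system: results that already attract clicks are displayed higher, which earns them still more clicks, so whatever content starts out popular, however harmful its framing, stays entrenched at the top.

    import random

    random.seed(0)

    # Toy result set: each entry has a historical click count. The sexualised
    # result starts with more clicks, mirroring Noble's point that search
    # inherits the popularity of old, objectifying representations.
    results = {
        "sexualised content": 500,
        "community organisations": 120,
        "academic resources": 80,
    }

    def rank(click_counts):
        """Order results by click count alone - a crude stand-in for
        engagement-driven ranking (no judgement about harm is applied)."""
        return sorted(click_counts, key=click_counts.get, reverse=True)

    def simulate_round(click_counts, users=1000):
        """Each simulated user mostly clicks the top result, so visibility
        converts directly into more clicks."""
        ordering = rank(click_counts)
        for _ in range(users):
            # roughly 70% click position 1, 20% position 2, 10% position 3
            choice = random.choices(ordering, weights=[0.7, 0.2, 0.1])[0]
            click_counts[choice] += 1

    for round_no in range(1, 6):
        simulate_round(results)
        print(f"round {round_no}: {rank(results)}")

    # The initially popular result stays on top every round: the feedback
    # loop rewards whatever was already clicked, not what is fair or accurate.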
The Fappening on Reddit
These biases can also be found on a platform like Reddit, an open-source platform on which anyone can create their own community of interest, known as a subreddit (Massanari, 2017). Many of these spaces are devoted to geek culture and STEM interests, presume a white male centrality, and tend to see women as either objects of sexual desire or unwanted intrusions, creating an unwelcoming environment for women to enter (Varma, 2017, in Massanari, 2017). In her research on the male gaze and pornography on the web, Noble (2018) found that “the internet is a communications environment that privileges the male, pornographic gaze and marginalises women as objects” (p. 58), which supports the argument that this is a replication of “traditional misrepresentation in old media made real once again” (p. 24).

In late August 2014, stolen pornographic photographs of celebrities were posted to 4chan and continued to spread on the subreddit /r/thefappening, which acted as an unsettling hub of conversation about the pictures and the celebrities involved (Massanari, 2017). Few users expressed concern over the ethical questions the event raised, concentrating instead on what other photos might surface or who the next target might be. Furthermore, Reddit’s platform algorithm provides little support for discouraging material that might be objectionable or harassing (Massanari, 2017): “upvotes” give posts high visibility, so that content will most likely appear on the first page users see when they open Reddit. Earlier votes also count more heavily than later ones, “hence downvoting after something has become popular is likely to have little effect” (p. 338). This exemplifies how Reddit’s platform design tends to allow toxic technocultures, the term used to describe toxic cultures enabled and propagated through sociotechnical networks, to thrive (Massanari, 2017).
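Massanari’s point about early votes is easiest to see in the widely discussed, previously open-sourced version of Reddit’s “hot” ranking formula. The Python sketch below is a simplified reimplementation of that published formula, not necessarily the production algorithm: because the vote score passes through a logarithm, the first ten upvotes move a post roughly as much as the next ninety, and later downvotes barely dent a post that has already become popular.

    from datetime import datetime, timezone
    from math import log10

    # Reference epoch used in the open-sourced version of Reddit's ranking code.
    REDDIT_EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

    def hot(upvotes: int, downvotes: int, posted: datetime) -> float:
        """Simplified 'hot' score: log-scaled net votes plus a time bonus."""
        score = upvotes - downvotes
        order = log10(max(abs(score), 1))   # log10 compresses: 10 votes -> 1, 100 -> 2, 1000 -> 3
        sign = 1 if score > 0 else -1 if score < 0 else 0
        seconds = (posted - REDDIT_EPOCH).total_seconds()
        return round(sign * order + seconds / 45000, 7)

    posted = datetime(2014, 9, 1, tzinfo=timezone.utc)

    early_surge = hot(1000, 0, posted)      # post that surged early
    late_backlash = hot(1000, 500, posted)  # same post after heavy later downvoting

    print(early_surge, late_backlash)
    # The log term shrinks the difference: halving the net score moves the
    # rank only slightly, so downvoting after a post is popular has little effect.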
Why aren’t companies taking action?
Despite the various harms that algorithmic bias has caused different groups in society, and the increasing effort scholars put into raising awareness of the issue, Google’s monopoly status drives it towards “biasing information towards the interest of the neoliberal capital and social elites in the United States” (Noble, 2018, p. 36) and towards whatever reflects advertising interests. In many cases, racist and egregious content is highly profitable, so it continues to circulate online, as many tech platforms prioritise attracting the interest and attention of the majority in the United States to generate profit rather than serving racialised communities (Noble, 2018).
The same goes for Reddit: part of the reason administrators were reluctant to ban /r/thefappening, despite the obvious harm it was doing, was that its subscribers had purchased enough Reddit gold to run the entire site for a month (Greenberg, 2014, in Massanari, 2017). Both cases show that the monopolistic behaviour of these tech giants continues to harm minority groups, as the companies prioritise profit over the toxicity that continues to circulate on their platforms.

“Profits Key” by Got Credit is licensed under CC BY 2.0.
Algorithmic bias at the extreme: risking health, wellbeing and employability
Biases may also arise from missing information, “resulting in datasets that are not representative of the target population” (Pessach & Shmueli, 2021, p. 1). What is alarming is the extent to which this can put the health and well-being of certain groups in society at risk. Scholars have identified that Latinx people are underrepresented in online prostate cancer content, surfaced through automated algorithmic decision-making, compared to the general U.S. population (Loeb et al., 2022). “Latinx men have a higher risk of being diagnosed of Prostate Cancer at a later stage” (p. 561), and with a growing number of men seeking medical information online, the underrepresentation of their group in prostate cancer content denies them the opportunity to mitigate risks at an early stage (Loeb et al., 2022), as most of the content they encounter is neither readily understandable nor actionable for health consumers.
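A minimal way to picture the “missing information” problem is to compare how often each group appears in a content dataset against its share of the target population. The Python sketch below does exactly that, using hypothetical placeholder numbers rather than the figures reported by Loeb et al. (2022).

    # Hypothetical shares for illustration only - not the figures from Loeb et al. (2022).
    population_share = {   # assumed share of the U.S. male population
        "Black": 0.13,
        "Latinx": 0.19,
        "White": 0.59,
        "Other": 0.09,
    }
    content_share = {      # assumed share of prostate cancer web content featuring each group
        "Black": 0.07,
        "Latinx": 0.04,
        "White": 0.81,
        "Other": 0.08,
    }

    for group, expected in population_share.items():
        observed = content_share[group]
        ratio = observed / expected  # representation ratio: 1.0 means proportional
        flag = "UNDER-REPRESENTED" if ratio < 0.8 else ""
        print(f"{group:7s} population {expected:.0%}  content {observed:.0%}  ratio {ratio:.2f} {flag}")

    # A ratio well below 1 for Latinx men means the content a search surfaces
    # rarely reflects them, which is the gap the essay describes.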
At another extreme, a lack of gender equality is found in HR hiring processes, particularly favouring men in the tech sector. Amazon’s AI hiring system, used to screen job candidates, showed bias against women (Pessach & Shmueli, 2021). The resumes fed into the system over a ten-year period came mostly from men, reflecting their dominance in the tech sector, and the machine learning model duly adopted a pattern of bias towards male candidates (Pessach & Shmueli, 2021). Amazon’s system automatically penalised resumes that included the word “women’s” (Dastin, 2018), robbing women of career opportunities they deserve and are well qualified for.
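The Amazon case can be illustrated with a deliberately tiny model. The Python sketch below trains a naive word-frequency scorer on invented, male-skewed hiring decisions; it is a toy reconstruction of the reported failure mode, not Amazon’s actual system, and every resume and outcome in it is made up. Because past “hired” resumes rarely contain the token “women’s”, the learned weights drag down any new resume that does.

    from collections import Counter

    # Invented training data: (resume tokens, hired?) pairs skewed towards men,
    # standing in for ten years of historical decisions. Not real data.
    history = [
        (["software", "engineer", "chess", "club", "captain"], True),
        (["software", "engineer", "rugby", "team"], True),
        (["backend", "developer", "chess", "club"], True),
        (["software", "engineer", "women's", "chess", "club", "captain"], False),
        (["data", "engineer", "women's", "coding", "society"], False),
    ]

    def train(examples):
        """Score each word by how much more often it appears in hired resumes."""
        hired, rejected = Counter(), Counter()
        for tokens, was_hired in examples:
            (hired if was_hired else rejected).update(tokens)
        return {word: hired[word] - rejected[word]
                for word in set(hired) | set(rejected)}

    def score(resume_tokens, weights):
        return sum(weights.get(word, 0) for word in resume_tokens)

    weights = train(history)
    print(weights.get("women's"))  # negative weight learned from the skewed history
    print(score(["software", "engineer", "women's", "chess", "club", "captain"], weights))
    print(score(["software", "engineer", "chess", "club", "captain"], weights))
    # Identical qualifications, but the resume containing "women's" scores lower:
    # the model has simply memorised the bias present in the historical data.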

Conclusion
All in all, by analysing different events in the development of the internet, we can see that biases originating with Silicon Valley’s founders, favouring white males, continue to shape the internet we rely on today. This is evident at every stage of development, from search engine results and internet platforms through to the latest and most advanced technology, artificial intelligence, which looks set to be a large part of our future.
These biases have harmed society and individuals by sexually objectifying women across different parts of the internet, but what is even more concerning is the extent to which a lack of diversity can become life-threatening for certain groups through misrepresentation in online content, not to mention the biases in hiring processes that are stealing women’s futures of building a career and earning a good living. Unfortunately, the tech giants dominating the industry tend to look away from these problems as long as they are generating high profits. What a complete solution to these biases would look like requires further discussion; however, it is important that a society increasingly placing its trust in the internet, and delegating ever more of its life to it, understands the power at play across these new technologies and how that power bears on different groups in society.
Reference list:
Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. Retrieved 10 October 2022, from https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G
Loeb, S., Borno, H. T., Gomez, S., Ravenell, J., Myrie, A., Sanchez Nolasco, T., Byrne, N., Cole, R., Black, K., Stair, S., Macaluso, J. N., Walter, D., Siu, K., Samuels, C., Kazemi, A., Crocker, R., Sherman, R., Wilson, G., Griffith, D. M., & Langford, A. T. (2022). Representation in Online Prostate Cancer Content Lacks Racial and Ethnic Diversity: Implications for Black and Latinx Men. The Journal of Urology, 207(3), 559–564. https://doi.org/10.1097/JU.0000000000002257
Massanari, A. (2017). Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807
Noble, S. U. (2018). A society, searching. In Algorithms of oppression: How search engines reinforce racism (pp. 15–63). NYU Press.
Pessach, D., & Shmueli, E. (2021). Improving fairness of artificial intelligence algorithms in Privileged-Group Selection Bias data settings. Expert Systems with Applications, 185, 115667. https://doi.org/10.1016/j.eswa.2021.115667
Zajko, M. (2022). Artificial intelligence, algorithms, and social inequality: Sociological contributions to contemporary debates. Sociology Compass, 16(3). https://doi.org/10.1111/soc4.12962