Evaluating structural inequalities in the development of the internet: The case of gender and race

Does the structure of the internet reduce or increase gender and racial inequalities?

We unwittingly and unconsciously divide the world into Us Versus Them and show biased treatment to those we feel are outsiders. [Image: Keira McPhee, CC BY 2.0]

When we think about the internet, we imagine an ahistorical, unbiased, and autonomous technology that can solve almost all the cultural, social, political, economic, and environmental issues we face today. Today's internet is a public space, an online shop, a doctor's clinic, a school, an office, a bank, an entertainment platform, a socialization tool, and much more. It has given voice to marginalized people and provided opportunities for public discourse over rising gender-based and racial inequalities (Berners-Lee, 2020). But while the advantages of the internet are amply celebrated, few have examined how the design of tech products and services conforms to existing social inequalities and discrimination. The political and economic philosophies of tech giants support structures that contribute to social inequality and discrimination (Popiel, 2018). These philosophies, embedded in the design of internet technologies, reify the whims and desires of majority groups (often male, white, cis-gendered, and heterosexual) and thereby intensify societal inequalities against women and people of colour. In this essay I use the terms structure and design interchangeably to mean the assemblages of algorithms, bots, policies, scripts, and ethical standards that are integrated into tech companies' products and that encourage certain behaviours and cultural practices while discouraging others.

The early internet infrastructure

The contemporary practices of secluding and sexualizing women and discriminating against people of colour are believed to have been inherited from earlier internet technologies dominated by white men (Cohn, 2019). The community of early internet developers and users was predominantly homogeneous: participation in the development and use of the internet was (and to a large degree still is) dominated by white, middle-class men (Levinson & Christensen, 2003, p. 1459). The internet at this stage served mainly the scientific community, as the overwhelming majority of information on it was of a scientific and informational nature, and the participation of women and people of other races was likely minimal to none. Consequently, the dominance of white male homogeneity in the early stages of the internet's development shaped its infrastructure in a way that excluded counter-arguments and ideas about the rights of women and other minority groups. As a result, we find the contemporary internet highly centralized, with strong monopolies that subordinate women and other minorities.

Facebook is responsible for reducing our privacy and subjecting us to unwanted divisive and polarized content. Photo by Annie Spratt on Unsplash, Licensed under CC BY-SA 4.0.

Lobbying for a biased structure

The structure of internet technologies is driven by business models that accumulate profit at any cost. Since 2005, tech giants have spent nearly $247 million lobbying on privacy, online advertising, cyber-security, net neutrality legislation, and broadband (Popiel, 2018), all factors that directly affect their business practices. If they lost their monopoly over these factors, they would be unable to sustain their capitalist business models. According to Popiel (2018), tech giants like Facebook, Google, and Amazon tend to maintain a strong monopoly over the management of the internet, exert control over technical standards, and harness networks to lobby for patent protections. These tactics help the tech giants curb competition that would otherwise reduce their market power. Indeed, their dominance of the sector is stark: Facebook attracts 77% of mobile traffic; Google is the first search option for almost 88% of internet users and 97% of mobile searches; and Facebook and Google together hold 63% of the world's digital ad market (eMarketer, 2017). Governments often avoid meddling with the lobbying of tech giants, possibly because the tech sector can provide rich user data for government surveillance activities.

Designing the structure of internet technologies around financial and neoliberal ideals has grave consequences for minorities such as women and people of colour. Facebook whistleblower Frances Haugen noted in her 60 Minutes interview that during her time at Facebook she saw a conflict of interest between what was good for the public and what was good for Facebook's self-interest, namely making more money (60 Minutes, 2021). She found that Facebook consistently optimized its design for its own self-interest. The feeds of Facebook and other social networking platforms are algorithmically designed to surface up-voted or most-liked content, optimized to draw engagement or a reaction from you, such as a like, comment, or share (Massanari, 2017). Haugen further states that Facebook's own internal research shows that polarized, hateful, and divisive content gets more engagement from users than other content. If Facebook, or any other tech company for that matter, changed its algorithm to be safer, fewer people would visit, directly cutting into Facebook's advertising market share. This engagement-seeking design creates a herding mentality around particular content, which biases individuals to mirror the majority view. Such a design directly produces unequal and discriminatory treatment of women and people of colour on the internet.
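The engagement-optimized ranking described above can be illustrated in a few lines of Python. This is a hypothetical sketch, not Facebook's actual code or formula: the post fields and the weights given to likes, comments, and shares are invented for illustration.

```python
# Illustrative sketch of an engagement-ranked feed (hypothetical weights,
# not any real platform's formula).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Hypothetical assumption: reactions that take more effort count more.
    return post.likes + 2 * post.comments + 3 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # The most-engaged content rises to the top, regardless of its nature.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("calm news summary", likes=120, comments=5, shares=2),
    Post("divisive hot take", likes=90, comments=80, shares=40),
])
print(feed[0].text)  # prints "divisive hot take"
```

Even in this toy example, the post that provokes comments and shares outranks the more widely liked but calmer one; nothing in the ranking asks what the content is, only how much reaction it draws.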

It’s like something is always there to decide what you should know. Photo by Brett Jordan on Unsplash. Licensed under CC BY-SA 4.0.

Search Engines’ autosuggest completion

One of the most extensive works on the structural inequalities of the internet, and of search engines like Google in particular, is Professor Safiya Noble's book Algorithms of Oppression. According to Noble (2018), the history of search engines and their algorithms is fraught with structural problems that subject minority groups such as women and people of colour to the whims of the majority group and other commercial influences. Back in 2015, Google's search engine autosuggested a range of sexist ideas about women. For instance, when Noble typed 'women cannot' into Google's search box, the autosuggest offered completions like: drive, be bishops, be trusted, and speak in church (Noble, 2018, p. 15). If we run such a query today, however, Google does not autosuggest anything. It appears Google made considerable changes to its algorithm after Noble's (2018) extensive work on its biased and sexualized structure: the search box no longer autocompletes phrases that may lead to problematic results.

In a sense, Google is reducing its functionality rather than improving it. Because the problem of unequal treatment pertains to women and people of colour, Google is willing to reduce functionality, which makes the service less welcoming and less useful for those very groups. Tellingly, tech giants like Facebook and Google maintain structural designs that prioritize the search results that serve the companies' own self-interests (Noble, 2018, p. 24). Had the majority group and powerful elites encountered a problem in their experience of Google search, Google's response might have been different, since such a problem would affect its business model and monopoly over the internet. This preferential treatment in the design of tech companies' products contributes to growing inequality and discrimination against women and people of colour.

A flicker of hope

Access to information and social networks on the internet is everyone's right. The ability of tech companies to condition this right on their economic and neoliberal self-interests is problematic. These companies hold strong monopolies and a tremendous amount of power to affect the lives of women and minority groups through structural designs that favour the whims and desires of the majority and the powerful.

Without doubt, women and people of colour will benefit from becoming programmers and building alternative social and search platforms that are less biased and troubling. This would help prioritize a wider range of information and perspectives for users, which tech giants generally censor. A positive effect of such diversity in the tech industry would be that tech companies could no longer bias information towards the majority and the most powerful in society at the cost of unequal treatment of women and minority groups.


Word Count: 1263




60 Minutes. (2021, October 4). Facebook whistleblower Frances Haugen: The 60 Minutes interview. [Video]. YouTube.

Berners-Lee, T. (2020, March 12). 30 years on, what’s next #fortheweb. World Wide Web Foundation.

Cohn, J. (2019, February 1). Google’s algorithms discriminate against women and people of colour. The Conversation.

eMarketer. (2017, September 21). Google and Facebook tighten grip on US digital ad market.

Levinson, D., & Christensen, K. (2003). Encyclopedia of community: From the village to the virtual world. SAGE.

Massanari, A. (2017). Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.

Noble, S. U. (2018). Algorithms of Oppression: How search engines reinforce racism. New York University Press.

Popiel, P. (2018). The tech lobby: Tracing the contours of new media elite lobbying power. Communication Culture & Critique, 11(4), 566-585. doi:10.1093/ccc/tcy027



Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.