“oberlin gender stereotypes” by istolethetv is licensed under CC BY 2.0.
In today's information age, gender stereotypes are still with us. Gender stereotypes can be defined as oversimplified, fixed expectations and beliefs about males and females that are based on gender rather than individual traits. Google's search algorithm, while designed to improve the user experience, may inadvertently amplify gender stereotypes through its reactivity to user behavior, its overly personalized search results, and its incomplete transparency about how it works. This article discusses filter bubbles in Google Search, the unconscious reinforcement of social stereotypes caused by algorithmic opacity, and how algorithmic bias might be reduced.
Unconscious reinforcement of stereotypes
“Nursing Class 2006” by timefornurses is licensed under CC BY-ND 2.0.
The reactivity of the algorithm reinforces existing gender stereotypes. When users conduct searches, their queries and click behavior provide data to search engines. Algorithms use this data to make ranking decisions, relying heavily on user visit and click rates (Bar-Ilan, 2007). For example, when a user searches for a keyword related to "nurses," if a majority of users click on images or content depicting female nurses, the algorithm may interpret this behavior as a preference for female nurses. Noble (2018) points out that such biased search results are not isolated incidents but reflect broader social biases as well as the power dynamics of tech companies. In future searches, therefore, the algorithm may give more weight to content featuring female nurses. This feedback loop reinforces a stereotype already present in society, namely the widespread belief that nurses are women. The phenomenon highlights the data-driven nature of search algorithms: they rely on user data to personalize results, but in doing so they can inadvertently reinforce social divisions, further marginalize minorities, and reflect the lack of diversity in the tech industry itself. It is therefore the responsibility of algorithm designers and users alike to ensure the fairness of algorithms and to monitor and improve them continually.
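The feedback loop described above can be illustrated with a minimal sketch. This is a toy model, not Google's actual ranking system: the result labels, the one-point-per-click update rule, and the `ClickRanker` class are all illustrative assumptions. It shows only the general mechanism by which click-driven ranking can entrench whichever result users already favor.

```python
class ClickRanker:
    """Toy ranker: results that get clicked rise in future rankings."""

    def __init__(self, results):
        # Every result starts with the same neutral score.
        self.scores = {r: 1.0 for r in results}

    def record_click(self, result):
        # Each click nudges that result's score upward.
        self.scores[result] += 1.0

    def ranked(self):
        # Higher-scored results are shown first on the next search.
        return sorted(self.scores, key=self.scores.get, reverse=True)


ranker = ClickRanker(["female_nurse_image", "male_nurse_image"])

# If most users click the stereotype-consistent result...
for _ in range(10):
    ranker.record_click("female_nurse_image")

# ...the ranker learns to surface it first, and the stereotype
# becomes self-reinforcing.
print(ranker.ranked())
```

Even in this simplified form, the core problem is visible: the model has no notion of fairness or accuracy, only of past engagement, so whatever bias shapes the clicks also shapes the ranking.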
Information Filter Bubble
“Hillary Clinton” by Nrbelex is licensed under CC BY-SA 2.0.
In the digital age, it has become increasingly apparent that excessive personalization of Google's search engine can create filter bubbles. A filter bubble is an online phenomenon in which platforms such as search engines use algorithms to recommend content based on user behavior and preferences. As a result, users are mainly exposed to information consistent with their existing views and interests, rarely encounter conflicting perspectives, and may see their opinions harden. As Allen (2012) pointed out in discussing The Filter Bubble, the side effect of these personalization technologies is that they are so deeply embedded in our society that, although they are almost everywhere, they are becoming increasingly difficult to recognize when we analyze information critically. For example, in the 2016 US presidential election, Hillary Clinton became the first woman to win the nomination of a major political party. During that time, algorithms on search engines, social media, and other digital platforms often pushed news and content related to Clinton to users who had already shown an interest in women's rights, gender equality, or other topics related to women. For those who regularly searched for or viewed feminist and gender-equality content, feeds could be flooded with pro-Clinton articles and videos, while other election issues, such as the economy, foreign policy, or the views of other candidates, were relatively ignored. Bias in politically relevant searches is a significant concern when it may influence electoral decisions, because this filter bubble mainly reinforces users' existing positions (Bozdag, 2013). The filter bubble produced by search engine personalization algorithms may therefore increase political polarization: users are mostly exposed to information consistent with their existing views, and their knowledge of and exposure to other views are limited.
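The personalization mechanism behind filter bubbles can be sketched in a few lines. This is a hypothetical illustration: the `personalize` function, the topic labels, and the interest weights are all invented for the example and do not reflect any real platform's implementation. The point is simply that ranking by similarity to past interests crowds out everything else.

```python
def personalize(articles, user_interests, top_k=3):
    """Rank articles by how strongly their topic matches the user's
    past interests, and keep only the top_k for the feed."""
    def score(article):
        return user_interests.get(article["topic"], 0)
    return sorted(articles, key=score, reverse=True)[:top_k]


articles = [
    {"title": "Candidate speaks on gender equality", "topic": "gender_equality"},
    {"title": "Candidate's economic plan", "topic": "economy"},
    {"title": "Foreign policy debate recap", "topic": "foreign_policy"},
    {"title": "Equal pay rally coverage", "topic": "gender_equality"},
]

# A user whose history is dominated by one topic...
user_interests = {"gender_equality": 9, "economy": 1}

feed = personalize(articles, user_interests)
# ...receives a feed dominated by that same topic, while other
# election issues (here, foreign policy) drop out of view entirely.
for article in feed:
    print(article["title"])
```

Note that nothing in the code is "wrong" in an engineering sense; the bubble emerges as a side effect of optimizing for engagement with past preferences, which is exactly why it is so hard to notice from inside.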
Investigating the Transparency of Algorithms
Algorithmic opacity has made the issue of gender bias in Google's search engine particularly visible. Algorithmic opacity refers to the fact that the internal workings of algorithms are difficult for outside observers to understand, especially in complex machine learning models, making it hard to see how decisions are made and how results are generated (Burrell, 2016). Consider the #MeToo movement, a global effort to expose the prevalence of sexual harassment and assault, encourage victims to speak out, and promote social and cultural change (Gordon, 2023). On sensitive topics like #MeToo, algorithms can be biased toward historically popular or frequently cited viewpoints, so that important but smaller voices are marginalized. This not only misleads users to a certain extent but may also reinforce existing gender stereotypes. It shows that the opacity of Google's search engine, with its complex mechanisms for determining content ranking and presentation, makes it difficult for outsiders to fully understand and evaluate the fairness of search results. Search results are not an objective reflection of the world; they are selected and ranked by algorithms, and for queries related to Black women, those results have often been sexualized, demeaning, or stereotypical (Noble, 2018). To ensure the impartiality and accuracy of search engines, it is therefore necessary to review algorithms more deeply, increase their transparency, and account for the complex biases embedded in social and cultural contexts.
“de #metoo à #wetogether” by Jeanne Menjoulet is licensed under CC BY-ND 2.0.
Enhancing Algorithmic Transparency and Diversity in the Digital Age
While both the concerns about search engines and the criticisms of algorithms are well founded, we must also recognize that algorithms largely reflect vast amounts of user data, which is drawn from the real world. First, algorithms are generally not entities that actively formulate social norms or biases; they learn from data and predict what users are likely to find relevant. Criticized results, such as those for the keyword "nurse," reflect common beliefs in real life rather than bias originating in the algorithm itself. Second, users have discretion: they retain the right to decide what counts as trustworthy information. In addition, many tech companies are actively working to improve their algorithms to reduce bias and increase the diversity of recommended content. Increasing algorithmic transparency is an important issue in today's tech world, but that does not mean all search results are problematic. Building a more just and open digital society requires not only technical solutions but also educating and training users so they can sift and evaluate online information more intelligently.
In today’s digital age, gender stereotypes persist and impact our online experiences. Google’s algorithm may inadvertently reinforce these stereotypes due to its design and data sources. While tech companies are striving for algorithmic improvements and transparency, users also bear responsibility for discerning content. Tackling this issue demands a holistic approach, combining technological advancements, user education, and societal awareness to foster a more equitable digital landscape.
Allen, J. (2012). The Filter Bubble: What the Internet Is Hiding from You. Policy Perspectives (Washington, D.C. 1994), 19, 131–. https://doi.org/10.4079/pp.v19i0.10431
Bar-Ilan, J. (2007). Manipulating search engine algorithms: The case of Google. Journal of Information, Communication and Ethics in Society, 5(2/3), 155–166. https://doi.org/10.1108/14779960710837623
Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 205395171562251. https://doi.org/10.1177/2053951715622512
Gordon, S. (2023, April 28). This is why the #MeToo movement matters. Verywell Mind. https://www.verywellmind.com/what-is-the-metoo-movement-4774817
Noble, S. U. (2018). A society, searching. In Algorithms of oppression: How search engines reinforce racism (pp. 15–63). New York University Press.
By Jiajing Song
This work is licensed under Attribution-NonCommercial 4.0 International