Does Google’s search engine have harmful biases?

Anqi He, Jiayi Shi, Nora Wang, Lu Li

Example 1

Google users who searched the question “what do terrorists wear on their head?” noticed that the search engine answered “Keffiyeh,” the traditional Palestinian headscarf. This angered Palestinian and Arab activists, who said Google’s result ignored facts, obscured Palestinian voices, and distorted the image of Palestine, since the keffiyeh is a national symbol of Palestine. By associating the keffiyeh with terrorism, the search result misrepresents its cultural significance and can reasonably be viewed as problematic and offensive to Palestinian and Arab communities. The case highlights the responsibility of search engines and online platforms to provide accurate and culturally sensitive information to their users.

Example 2

The Guardian article “Google’s Autocomplete Shows a Rightwing Bias” raises concerns that the autocomplete feature of Google’s search engine could reflect right-wing bias and promote political propaganda. It discusses the allegations, provides examples, and highlights the challenges of addressing bias in algorithmic systems such as autocomplete. It also points to the wider social impact of biased search results and calls for greater transparency and accountability from Google.

Example 3: Occupational stereotypes due to search engine bias

New York University conducted an experiment on search engine bias. The researchers told participants they were looking at Google image search results for four different occupations. To mimic internet search results in different countries, they created two conditions: a low-inequality condition (gender composition close to 50% men and 50% women) and a high-inequality condition (approximately 90% men and 10% women). Before viewing the search results, participants provided prototypical judgments about each occupation, that is, which gender they believed was more likely to pursue it. In this baseline assessment, participants generally judged these occupations to be more suitable for men.

After viewing the search results, participants in the low-inequality condition reversed their male-biased prototypes relative to baseline assessments, becoming more likely to believe that these occupations were suitable for women as well. Participants in the high-inequality condition, on the other hand, continued to maintain their male bias.

The findings highlight a cycle of bias transmission between society, AI, and users. Levels of social inequality are reflected in internet search algorithms, and exposure to this algorithmic output can lead users to think and behave in ways that reinforce that inequality; as biased search output persists, more users grow accustomed to high-inequality conditions. This demonstrates the harmful nature of bias in internet search results.


Example 4: Racial bias in Google’s search engine: uncovering implicit biases in search results

A typical example reveals a bias in how search results are presented with respect to race and skin color. A keyword search for “unprofessional female hairstyles” returned mostly images of black women, whereas a search for “professional female hairstyles” predominantly returned images of white women. This suggests that search engines may exhibit some degree of racial bias in how they present results.

In this context, many hairstyles worn by black women are erroneously portrayed as styles that are unfairly restricted within various professional settings. Hairstyles such as cornrows, braids, long hair twists, African-style afros, and voluminous hair are stigmatized and wrongly considered unsuitable for presentation in professional environments. These hairstyles are unfairly associated with stereotypes like “rudeness” and “uncleanliness.” Conversely, hairstyles chosen by white women tend to align more easily with traditional, tidy, or professional appearance standards. This phenomenon highlights unfair treatment towards black women, reinforces racial stereotypes, and suggests that search engines may exacerbate these biases in their results presentation.

It is worth noting that this bias stems from multiple factors, including algorithm design, dataset selection, and user behavior. Addressing this issue requires comprehensive efforts to ensure the neutrality and fairness of search engine results, as well as broader reflection and education to reduce racial biases and inequalities within society.

Reference List:

Communications, N. W. (2022, July 12). Gender Bias in Search Algorithms Has Effect on Users, New Study Finds. NYU. https://www.nyu.edu/about/news-publications/news/2022/july/gender-bias-in-search-algorithms-has-effect-on-users–new-study-.html

Essa, A. (2021, May 25). Google search results suggest Palestinian keffiyeh a symbol of terrorism. Middle East Eye. https://www.middleeasteye.net/news/israel-palestine-google-criticised-keffiyeh-headscarf-terrorists

Pierre, C. (2016). What IS Professional Hair?: An Opinion Piece. [online] www.linkedin.com. Available at: https://www.linkedin.com/pulse/what-professional-hair-opinion-piece-courtney-showell/ [Accessed 13 Sep. 2023].

The Guardian. (2016, December 16). Google’s Autocomplete Shows a Rightwing Bias. The Guardian. https://www.theguardian.com/technology/2016/dec/16/google-autocomplete-rightwing-bias-algorithm-political-propaganda