The detection of biases in search engines and online platforms has received considerable attention in academic circles. Safiya Umoja Noble's book "Algorithms of Oppression," published in 2018, is a vital work that has significantly added to our understanding of these biases. Noble's thorough investigation digs into the many ways in which racial biases are deeply built into the algorithms that operate search engines, resulting in the misrepresentation or exclusion of marginalised groups from search results. She points out, for example, that searches for terms connected to marginalised communities frequently produce insulting or damaging content, reinforcing stereotypes and negative attitudes. This content may include insulting language and offensive pictures, as well as biased narratives and misleading information.
These search results are not only upsetting; they also entrench damaging preconceptions and negative attitudes towards these populations. One of the most troubling consequences of these biased search results is that they reinforce prejudices. When negative or unpleasant content appears frequently in response to queries about specific racial or ethnic groups, it perpetuates societal stereotypes and strengthens existing biases. Users who rely on search engines for information may unintentionally acquire and internalise these prejudices, contributing to the perpetuation of bias and discrimination.
Harmful pornographic content targeting women
Google changed and enhanced its search results in response to Noble's finding that typing "black girls" into Google returned pornographic results. For the time being, though, searching for "Asian girls" yields similarly pornographic and sexualised results. Google appears to have modified only the results for "black girls" and ignored those for "Asian girls." The question is why typing "____ girls" into Google still returns obscene results, including more results for "Asian girls." Even if a user has never searched for pornography, Google will automatically surface this content and rank it as the most popular and reputable. As a result, we cannot help but wonder about the explicit and implicit biases that people may encounter in various search engines, databases, and algorithms throughout their lifetimes.
Additionally, in algorithmic tools used by law enforcement to assess criminal risk, the risk assigned to Black individuals is significantly higher than that assigned to other ethnic groups. Discriminatory law enforcement in the United States is particularly evident here. For instance, the COMPAS system predicts a defendant's risk of reoffending after release from prison. Although many Black defendants do not go on to reoffend, they are labelled as having twice the risk of white defendants.
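The disparity described above is usually measured as a gap in false positive rates: the share of people who did not reoffend but were nonetheless scored as high risk. The sketch below illustrates that calculation on invented toy numbers (not real COMPAS data; the groups, labels, and scores are all hypothetical):

```python
# Illustrative sketch only: toy data, not real COMPAS records.
# A "false positive" here is a person who did NOT reoffend (label 0)
# but was still scored as high risk (prediction 1).

def false_positive_rate(labels, predictions):
    """Fraction of non-reoffenders who were scored high risk."""
    # Keep only the predictions for people who did not reoffend.
    negatives = [p for l, p in zip(labels, predictions) if l == 0]
    return sum(negatives) / len(negatives)

# label: 1 = reoffended, 0 = did not; pred: 1 = high risk, 0 = low risk
group_a = {"labels": [0, 0, 0, 0, 1, 1], "preds": [1, 1, 0, 0, 1, 0]}
group_b = {"labels": [0, 0, 0, 0, 1, 1], "preds": [1, 0, 0, 0, 1, 1]}

fpr_a = false_positive_rate(group_a["labels"], group_a["preds"])
fpr_b = false_positive_rate(group_b["labels"], group_b["preds"])
print(fpr_a, fpr_b)  # prints 0.5 0.25: group A is wrongly flagged twice as often
```

A system can be "accurate" overall yet still, as in this toy example, wrongly flag one group's non-reoffenders at double the rate of another's, which is exactly the pattern reported for Black versus white defendants.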
There is an enormous amount of information available, yet Google's search engine still leads users to misleading headlines that cause serious misunderstandings. Search engines now rely on algorithmically generated systems to produce their results. But the convenience of this overly smart system has a counter-effect: it produces indirect discrimination by race and gender. The biggest flaw of an algorithmically generated system is that it tends to reproduce and amplify the biases embedded in the data it learns from.
Big Lex. (2018, September 2). Racial Bias and Gender Bias Examples in AI systems. Medium; The Comuzi Journal. https://medium.com/thoughts-and-reflections/racial-bias-and-gender-bias-examples-in-ai-systems-7211e4c166a1
Illing, S. (2018, April 3). How Google and search engines are making us more racist. Vox. https://www.vox.com/2018/4/3/17168256/google-racism-algorithms-technology
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.