How Search Engines Are Manipulating YOU

Google prioritises making money over presenting objective information

Google Main Search" by MoneyBlogNewz is licensed under CC BY 2.0

This essay analyses the significant impact the search engine can have on society, with a focus on Google. Its initial function of providing data in a convenient and objective manner has been subverted by a capitalist agenda. Ultimately, despite its many advantages, the search engine in its current form poses more threats to society than benefits. This will be argued through analysis of advertising, filter bubbles, and autocomplete recommendations.

 

How it Works

Figure 2: Explanation of how a search engine works. ‘The Internet: How Search Works’, Code.org.

The Google search engine is known as a ‘Crawler-based search engine’ (Sullivan, 2002, p.1). Sullivan describes this type of search engine as having three main parts (2002, p.1), sketched in the example below the list:

  1. The Crawler or Spider: finds the relevant web pages, which are then passed on to the index
  2. The Index: acts as the database, storing all the web pages that the crawler finds
  3. Search engine software: goes through all of the indexed web pages and ranks them in the order it deems most relevant
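
To make these three parts concrete, here is a minimal, hypothetical Python sketch of the same pipeline. The pretend pages, URLs, and word-count scoring rule are invented for illustration only; this is not how Google actually implements crawling, indexing, or ranking.

```python
# Toy illustration of Sullivan's three parts: crawler, index, and ranking software.
# Hypothetical sketch only; not Google's actual implementation.
from collections import deque, defaultdict

# A pretend "web": each page has some text and a list of outgoing links.
WEB = {
    "a.com": ("search engines rank web pages", ["b.com"]),
    "b.com": ("filter bubbles and personalised search", ["a.com", "c.com"]),
    "c.com": ("advertising on search engines", []),
}

def crawl(seed):
    """The crawler/spider: follow links and collect every reachable page."""
    seen, queue, pages = set(), deque([seed]), {}
    while queue:
        url = queue.popleft()
        if url in seen or url not in WEB:
            continue
        seen.add(url)
        text, links = WEB[url]
        pages[url] = text
        queue.extend(links)
    return pages

def build_index(pages):
    """The index: a database mapping each word to the pages that contain it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)
    return index

def search(index, query):
    """The search software: rank pages by how many query words they contain."""
    scores = defaultdict(int)
    for word in query.split():
        for url in index.get(word, set()):
            scores[url] += 1
    return sorted(scores, key=lambda url: (-scores[url], url))

pages = crawl("a.com")
index = build_index(pages)
print(search(index, "personalised search engines"))  # pages ordered by crude relevance
```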

The Search Engine’s Genesis

The overarching concept that drove technology towards this innovation was the increasing difficulty of navigating large collections of files as information databases grew.

Libraries are the original settings where information could be stored and searched through. Historically, information overload has always been inherent to libraries, and so relevant technologies and techniques have evolved, and continue to evolve, to help society sort through the information (Halavais, 2013, p.11). The use of computers in libraries established the fundamental functions of a search engine (Halavais, 2013, p.11).

Moreover, the industrial revolution was influential in the search engine’s genesis, as the consequent changes to societal organisation caused a “flood of new paper files” (Halavais, 2013, p.12). A sort of domino effect of innovations began from this:

Stacking paper → pigeonholes → vertical filing

Figure 3: Pigeonhole Wooton Desk. Image: Maureen Wallenfang. All Rights Reserved.

At the inception of Google’s search engine, the algorithm utilised ‘PageRank’, which retrieved information and presented it to the user in a relatively objective way; Google’s founders expressed that it worked within the “democratic structure of the web” (Carson, 2015, p.3).
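
The published idea behind PageRank can be summarised as: a page is important if important pages link to it, and that importance can be computed iteratively. The sketch below illustrates that public description with made-up URLs and a standard damping factor; it is not Google's actual code.

```python
# Minimal sketch of the published PageRank idea; illustration only.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}          # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            targets = outgoing or pages                # dangling pages share evenly
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Hypothetical example: both other pages link to "a.com", so it ranks highest.
print(pagerank({"a.com": ["b.com"], "b.com": ["a.com"], "c.com": ["a.com"]}))
```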

However, in 2009 Google replaced ‘PageRank’ with personalisation (Carson, 2015, p.4). This change was intended to provide increasingly relevant information to users according to their wants and interests.

In today’s society, search engines have moved beyond the confines of the web and have become a key feature of media, implementing themselves into the entire “media ecosystem” (Halavais, 2013, p.9).

Filter Bubbles

The shift to a personalised search engine has caused numerous harmful effects on society.

Filter bubbles occur when users are presented only with information supporting their current beliefs and knowledge. Consequently, they are not exposed to the information Carson considers the “most strategically important” (2015, p.8): that which challenges or threatens existing beliefs and assumptions.

Filter bubbles can significantly hinder the growth of knowledge and can act to confirm false beliefs, distorting perceptions of the world.
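
To show how such a bubble can arise mechanically, the sketch below demonstrates a naive form of personalisation in which results matching topics the user has already clicked on are boosted. The headlines, topic labels, and scoring rule are hypothetical and greatly simplified; they only illustrate the feedback loop described above.

```python
# Hypothetical sketch of naive personalisation producing a filter bubble.
from collections import Counter

def personalised_rank(results, click_history):
    """results: list of (headline, topic); click_history: topics the user clicked before."""
    profile = Counter(click_history)                   # what the user already engages with
    return sorted(results, key=lambda r: profile[r[1]], reverse=True)

results = [
    ("Candidate X praised for economic plan", "pro-X"),
    ("Candidate X criticised over policy record", "anti-X"),
    ("Candidate X rally draws large crowds", "pro-X"),
]

# A user who has only ever clicked "pro-X" stories now sees them first,
# so the next click reinforces the same profile.
print(personalised_rank(results, ["pro-X", "pro-X", "pro-X"]))
```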

Figure 4: Twitter screenshot about filter bubbles

Advertising

The change to a personalised algorithm was motivated by profit. Personalisation was a way for Google to greatly increase revenue by selling advertising opportunities on its search engine (Carson, 2015, p.4). This made Google the most attractive and effective place for advertisers to reach their targeted users. Advertisers can pay to be the first website presented on the page, which is significant, as the top search result receives 75% of clickthroughs (Lazer, 2018, p.955).

Google’s initial democratic and objective delivery of websites built a reputation for trustworthy and reliable information. Therefore, when users see the top results, they believe these are the most relevant and accurate pieces of information and feel inclined to click through to those pages.

Figure 5: Example of Google ads. Image: ‘Google’s ads just look like search results now’. Porter, The Verge. All Rights Reserved.

Economic implications

The search engine benefits two main parties: the companies controlling and enforcing the algorithm, and the brands that receive far more targeted and effective advertising.

Google currently dominates the market for ‘search ads’, holding a market share of 90% (Ketchell, 2020). From search advertising, it generates over $20 billion per year (Noble, 2018, p.35).

Figure 6: Google dominating the market share. Image: This Chart Reveals Google’s True Dominance Over the Web. Jeff Desjardins, Visual Capitalist. All Rights Reserved.

How does this affect the political sphere?

Epstein and Robertson suggest the search engine obstructs the ability to reach rational conclusions (2015, p.5). For example, if a user is only exposed to positive articles about Trump, and only negative articles about Biden, they will unknowingly adopt the attitude that Trump is better suited to be president. This also reiterates the effects that the filter bubble can have, which, in combination with advertising ploys, can influence election outcomes and thus the wider political landscape.

A common method used by political candidates to win, identified by Epstein and Robertson, is the “swing-voter model” (2015, p.5), in which parties focus their efforts on forms of persuasion. If political parties invest money in search engine advertising (SEA), it has the potential to completely shift election results, as it is a proven form of effective persuasion; SEA is considered persuasive text (Haans, 2013, p.153).

Autocomplete Recommendations

Although autocomplete searches can be beneficial, as they attempt to guess what the user is planning to search, they can also distort perceptions. The user may feel inclined to search what they were recommended, despite their initial intention. Baker and Potts evidence this through the example of someone planning to type “why do black holes exist”; after beginning the sentence, the recommendation “why do black people have big lips” is suggested instead (2013, p.201), corrupting understandings.
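
One common way an autocomplete feature of this kind can work (a general assumption about the technique, not a description of Google's actual system) is to rank suggestions by how often previously logged queries complete the typed prefix. The sketch below uses invented query counts to show how majority search behaviour can push the biased suggestion cited by Baker and Potts above what the user intended to type.

```python
# Simplified prefix-based autocomplete over a hypothetical query log; illustration only.
from collections import Counter

QUERY_LOG = Counter({
    "why do black holes exist": 120,
    "why do black holes form": 90,
    "why do black people have big lips": 400,  # the more frequent, biased query
})

def autocomplete(prefix, log, n=3):
    """Return the n most frequently logged queries that start with the typed prefix."""
    matches = {query: count for query, count in log.items() if query.startswith(prefix)}
    return sorted(matches, key=matches.get, reverse=True)[:n]

# Typing only "why do black" surfaces the majority's query first,
# regardless of what this particular user intended to search.
print(autocomplete("why do black", QUERY_LOG))
```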

Figure 7: Example of biased autocomplete suggestion. Image: ‘How Google’s Instant Autocomplete Suggestions Work’. Danny Sullivan. All Rights Reserved.

Minority representation

Minority groups are subjected to the influence of the majority. When certain topics are searched for many times, the chance of them being recommended increases. As the majority shapes search engine results in this way, how would the minority ever affect their own representation (Noble, 2018, p.50)? Further, Noble evidences that the most vulnerable communities are discriminated against by search engine algorithms (2018, p.50).

How can it be beneficial?

Despite the negatives, the search engine’s main benefit is the access to information it provides. Whilst results are sometimes biased, if the individual understands the processes behind a search engine, it is possible to navigate safely and find a balance of information, which can “improve the overall quality of our lives” (Carrol, 2014, p.16).

Therefore, one could argue that for search engines such as Google to function, monetisation is required, meaning it is down to the user to understand and overcome this by seeking out challenges to their beliefs and seeing both sides of discussions.

How am I affected?

When a company holds so much power and influence, it is somewhat daunting to consider how I might be affected.

Arguably, the most prominent factor requiring scrutiny is the nature of the manipulation: the user is largely unaware that it is even occurring. Only 8% of users can distinguish between unpaid and paid search results (Noble, 2018, p.53). Consequently, this creates an environment of uncertainty; Epstein and Robertson express that individuals tend to believe they have come to certain conclusions voluntarily (2015, p.9).

This makes it harder for me to find authentic information and to discern what is real from what is fake.

For example, when searching for information for this essay, especially information critical of Google, how do I know it has not been restricted or filtered out by Google? Manipulations are very hard to detect (Epstein and Robertson, 2015, p.9), and what power would one even have when subjected to such forces?

Conclusion

The search engine initially presented information successfully in a democratic way. However, through the change from PageRank to personalisation, its priorities have shifted from presenting information in the most accessible manner to presenting it in the most profitable way.

Thus, this essay argues that the current environment the search engine has created is uncertain and highly constructed, with the negatives outweighing the positives.

The resulting landscape of Google search is ambiguous and biased. However, can Google really be blamed for acting in the interest of its business? Is that likely to ever change? Epstein and Robertson believe it will only get worse (2015, p.7).

I contend that older forms of information retrieval, such as libraries, involved less bias, as they were not as subject to the influence of profit. Therefore, I argue that trustworthy information in today’s environment is harder to attain.

Word Count: 1424

Reference List

Baker, P., & Potts, B. (2012). ‘Why do white people have thin lips?’ Google and the perception of stereotypes via auto-complete search forms. (P.195-205).

Carrol, N., 2014. In Search We Trust: Exploring How Search Engines Are Shaping Society. [online] ResearchGate. Available at: https://www.researchgate.net/publication/265508430_In_Search_We_Trust_Exploring_how_Search_Engines_are_Shaping_Society

Carson, A., 2015. Public Discourse In The Age Of Personalization: Psychological Explanations And Political Implications Of Search Engine Bias And The Filter Bubble. [online] Sciencepolicyjournal.org. Available at: https://www.sciencepolicyjournal.org/uploads/5/4/3/4/5434385/pa3finalformattedv2.pdf

Epstein, R. and Robertson, R., 2015. The Search Engine Manipulation Effect (SEME) And Its Possible Impact On The Outcomes Of Elections. [online] Pnas.org. Available at: https://www.pnas.org/content/pnas/112/33/E4512.full.pdf?with-ds=yes&source=post_page

Ghose, A., Ipeirotis, P. and Li, B., 2014. Examining the impact of ranking on consumer behaviour and search engine revenue. INFORMS PubsOnLine. Available at: https://pubsonline.informs.org/doi/pdf/10.1287/mnsc.2013.1828

Haans, H., Raassens N. and Van Hout, R., 2013. The impact of advertising statements on click-through and conversion rates. SpringerLink

Halavais, A. (2013). The engines. In Search engine society (pp. 5–31). Cambridge, UK ; Malden, MA: Polity.

Lazer, D., Robertson, R. and Wilson, C., 2018. Auditing the Personalisation and Composition of Politically-Related Search Engine Results Pages. Web and Society. Available at: https://dl.acm.org/doi/pdf/10.1145/3178876.3186143?casa_token=-ko2qKPiSt8AAAAA:W9aM7PGizThF9NUMfb8077kZcZ88yXIBurtZ3ZD4qxYgRUwR00IXnU6i6d1m6t0aHIMHRrf0UzDIMQ

Noble, S. U. (2018). A society, searching. In Algorithms of Oppression: How search engines reinforce racism (pp. 15–63). New York University

Preis, T., Reith, D. and Stanley, E., 2010. Complex Dynamics Of Our Economic Life On Different Scales: Insights From Search Engine Query Data. [online] Royalsocietypublishing.org. Available at: https://royalsocietypublishing.org/doi/pdf/10.1098/rsta.2010.0284

Sullivan, D., 2002. How Search Engines Work. [online] Didattica-2000.archived.uniroma2.it. Available at: https://didattica-2000.archived.uniroma2.it//prog_web/deposito/search_engine.pdf

Embedded content References

Figure 1: MoneyBlogNewz, 2010. Google Main Search. [online] Flickr. Available at: https://www.flickr.com/photos/22127803@N02/5267464508

Figure 2: Code.org, 2017. The Internet: How Search Works. [online] Available at: https://www.youtube.com/watch?v=LVV_93mBfSU

Figure 3: Wallenfang, M., 2018. ‘Pawn Stars’ Pays Big For Antique Desk From Fox Crossing’s Harp Gallery. [online] Postcrescent.com. Available at: https://www.postcrescent.com/story/news/2018/03/06/pawn-stars-pays-big-antique-desk-fox-crossings-harp-gallery/399576002/

Figure 5: Porter, J., 2020. Google’s Ads Just Look Like Search Results Now. [online] The Verge. Available at: https://www.theverge.com/tldr/2020/1/23/21078343/google-ad-desktop-design-change-favicon-icon-ftc-guidelines

Figure 6: Desjardins, J., 2020. This Chart Reveals Google’s True Dominance Over The Web. [online] Visual Capitalist. Available at: https://www.visualcapitalist.com/this-chart-reveals-googles-true-dominance-over-the-web/

Figure 7: Sullivan, D., 2011. How Google Instant’s Autocomplete Suggestions Work. [online] Search Engine Land. Available at: https://searchengineland.com/how-google-instant-autocomplete-suggestions-work-62592

 

 

About Oliver Bowman
I am a student at the University of Sydney, studying digital cultures. I have always been intrigued by media, and how it is constantly changing and evolving every day. I am especially interested in how it can influence thoughts and perceptions.