How Algorithms Imperceptibly Limit Our Thinking

In the digital age, algorithms have become increasingly important. They are an integral part of people's daily lives and have an enormous influence on what people think. Recommendation algorithms analyse people's data and quickly match their requests against an information database, which means they determine much of what people see online. Although algorithms bring convenience, they often expose people to a narrow, singular range of information. Because online information is complex, diverse, and fragmented, people tend to search only for the information they are already interested in, and algorithms reinforce this tendency by selectively presenting "what they may want to see". Receiving only singular information means people learn less about the world, and their thinking narrows as a result. In effect, algorithms offer a comfort zone to people who do not seek out different kinds of information; that comfort zone makes them rely even more heavily on recommended content, which impedes individual development. The consequences of this singular information are greater than most people expect, so they should challenge the comfort zone that algorithms have created for them.

“File:Eli Pariser, author of The Filter Bubble – Flickr – Knight Foundation (1).jpg” by Knight Foundation is licensed under CC BY-SA 2.0.

Filter Bubble

The ‘filter bubble’ delivers singular information to people through customised content and online feedback loops. Algorithms can tailor what people see on the internet to their interests: on most digital platforms, the system asks users to choose their preferences the first time they use it, then customises the feed based on those initial preferences and on browsing history, so the resulting information is highly attractive. These bubbles quietly isolate each person's online experience, yet people rarely realise that they are being isolated by their own ‘filter bubble’. In addition, personalisation algorithms amplify the ‘filter bubble’ when users click on fringe or extreme content (Ledwich et al., 2022), so unhealthy information can spread more efficiently inside a bubble than other information can. A ‘filter bubble’ therefore produces an overload of singular information and is not a healthy context for information dissemination; research suggests that algorithms used by social media platforms such as YouTube can strengthen the filter bubble effect (Reed et al., 2021, p. 5). The ‘filter bubble’ also creates a feedback loop. When people view the content that the algorithm recommends, the system treats the view as confirmation that they “like” it and keeps recommending similar material. The algorithm filters information based on people's data, and the set of information it keeps and disseminates is limited, so an online feedback loop forms. This loop restricts the variety of information people receive and harms the development of individual critical thinking, because people keep receiving the same narrow information from recommendation algorithms. And since everyone's ‘filter bubble’ is personalised and unique, the spread of these algorithms also obstructs the exchange of diverse opinions. In short, the ‘filter bubble’ limits public thinking by personalising information and trapping people in feedback loops.
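To make this feedback loop concrete, here is a minimal, hypothetical sketch in Python. It is not any real platform's recommender; the catalogue, the topic tags, and the rule "a click raises the weight of that item's topic" are all assumptions made for illustration. Even so, it shows how a preference signalled once can come to dominate every later recommendation.

```python
from collections import Counter

# Hypothetical catalogue: item id -> topic tag (a real system would use far richer features).
CATALOGUE = {
    "a1": "politics", "a2": "politics", "a3": "sport",
    "a4": "science",  "a5": "sport",    "a6": "science",
}

def recommend(profile: Counter, k: int = 3) -> list[str]:
    """Rank items by how strongly the user's profile favours their topic."""
    ranked = sorted(CATALOGUE, key=lambda item: profile[CATALOGUE[item]], reverse=True)
    return ranked[:k]

def click(profile: Counter, item: str) -> None:
    """The feedback loop: a click is read as 'liking' the topic, boosting it further."""
    profile[CATALOGUE[item]] += 1

profile = Counter()            # cold start: no preference recorded yet
for _ in range(5):
    feed = recommend(profile)
    click(profile, feed[0])    # the user clicks the top recommendation each time
    print(feed, dict(profile))
# After a few rounds the feed is dominated by a single topic: a toy 'filter bubble'.
```

Under these assumptions, the variety of the feed shrinks with every iteration even though the catalogue itself never changes, which is the narrowing effect the paragraph above describes.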

Extreme Thinking Mode

To improve people's online experience, algorithms rarely show them ideas that differ from their own, which is harmful to their thinking and leads to an extreme mode of thought. The way algorithms present information creates an ‘information cocoon’: it limits the information people accept by insulating them from contradictory views (Sunstein, 2006, p. 9). People inside an ‘information cocoon’ have fewer chances to access diverse information, and this pushes them toward the extreme. Because they choose to see only what they like to see, they in effect build the cocoon themselves; and if people cannot regulate their own use of recommendation algorithms, they struggle to think critically. What is more, the ‘information cocoon’ develops into polarised ‘echo chambers’ for people who already hold extreme ideas (Sunstein, 2006, p. 8). In an ‘echo chamber’, people see only like-minded discourse and ignore ideas different from their own. Algorithms reinforce this by ranking information according to its relevance to the user, which strengthens existing stances and makes people more extreme. Users then form online communities made up of people who share the same ideas; because these communities circulate only the same views, opinions harden and group polarisation follows. In this light, narrow thinking is easily given biased affirmation in ‘echo chambers’, which offer people little opportunity for self-correction. As long as people rely on recommendation algorithms in this way, they will overlook other valuable and comprehensive information, which is not conducive to cultivating critical thinking. Hence, algorithms can limit people's thinking by intensifying the polarisation of their ideas and pushing them toward extremes.
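The "relevance ranking" mentioned above can be illustrated with a deliberately simplified sketch. The scoring rule below, which treats a post as more relevant the closer its stance is to the user's own, is an assumption for demonstration purposes rather than a description of any real platform, but it shows how such a rule buries contradictory views at the bottom of the feed.

```python
# Toy posts with a stance score from -1.0 (opposes the user's view) to +1.0 (agrees with it).
posts = [
    {"id": "p1", "stance": 0.9},
    {"id": "p2", "stance": 0.4},
    {"id": "p3", "stance": -0.2},
    {"id": "p4", "stance": -0.8},
]

def relevance(post: dict, user_stance: float) -> float:
    """Assumed scoring rule: posts closer to the user's own stance count as more 'relevant'."""
    return 1.0 - abs(post["stance"] - user_stance)

def rank_feed(posts: list, user_stance: float) -> list:
    """Order the feed from most to least 'relevant' under the assumed rule."""
    return sorted(posts, key=lambda p: relevance(p, user_stance), reverse=True)

user_stance = 0.7                       # the user already leans strongly one way
for post in rank_feed(posts, user_stance):
    print(post["id"], round(relevance(post, user_stance), 2))
# p1 and p2 (agreeing posts) top the feed; p4, the strongest counter-view, sinks to the bottom,
# so the user rarely meets the contradictory information that would allow self-correction.
```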

“Top Ten Trends Debate” by jurvetson is licensed under CC BY 2.0.

Follow-the-Crowd

Because algorithms provide a limited range of information, they can easily produce a ‘bandwagon effect’ in people's thinking. There is no denying that this promotes social cohesion, but it also has negative consequences for individuals. Herd mentality helps spread biased information such as fake news, because algorithms amplify not only accurate information but biased information as well (Amrollahi, 2021, p. 4). The ‘bandwagon effect’ makes people follow whatever trend the algorithms generate; in this way algorithms can entrench social bias, and if people follow a mistaken idea, the effects on society are harmful (Abbate, 2017, p. 9).

People not only see what they are interested in but also encounter online communities that promote mistaken ideas. Such content confuses people and leads them to believe it is true. As more people view a piece of biased information, the algorithms push it to the top of the feed so that even more people see it, and its spread becomes a vicious cycle as the audience that believes it keeps growing. The algorithms thereby mislead people into insisting on a wrong idea, which limits their critical thinking. Moreover, people who follow the crowd are largely unable to judge information for themselves, because they simply believe what most other people appear to say (Molina, 2023). Even though "most people" is itself something the algorithms decide, the result is still a ‘bandwagon effect’ on people's thinking, and bandwagon effects can exert influence over the long term (Henshel & Johnston, 1987, p. 500). Because algorithms can make information go viral, an idea becomes popular and people assume it has been confirmed by the majority (Barnhart, 2021). As a result, by making singular information popular and exploiting the ‘bandwagon effect’, algorithms lead people to stop thinking for themselves and simply follow the crowd.
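As a rough illustration of this "top post" dynamic, the toy simulation below ranks two hypothetical posts purely by view count and assumes most users click whatever is currently promoted. The post names, the 90% click-through figure, and the ranking rule are all invented for the example; the point is only that an arbitrary early lead can snowball into apparent majority endorsement.

```python
import random

# Two hypothetical posts: one accurate, one biased. The biased one happens to get early views.
views = {"accurate_post": 3, "biased_post": 5}

def top_post(views: dict) -> str:
    """The feed promotes whichever post already has the most views."""
    return max(views, key=views.get)

random.seed(0)
for user in range(1000):
    promoted = top_post(views)
    # Assumption: 90% of users click whatever is promoted, 10% browse at random.
    chosen = promoted if random.random() < 0.9 else random.choice(list(views))
    views[chosen] += 1

print(views)
# The post with the early lead ends up with the vast majority of views, so readers see it as
# "what most people believe" even though its head start was essentially arbitrary.
```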

From the above, one might conclude that algorithms narrow people's thinking by offering them singular information and keeping them in a comfort zone of shared ideas. It is true that algorithms also have a positive impact on society, because they bring like-minded people together; but even if they support social cohesion and represent a technological success, people should not rely on them too heavily. The ‘filter bubble’, the ‘information cocoon’, and ‘echo chambers’ are all products of algorithms. All three feed people singular information, yet they are fundamentally different. In a ‘filter bubble’, people do not retrieve information themselves; the algorithms control dissemination and people receive information passively. The ‘information cocoon’, by contrast, depends mostly on people themselves, who take the initiative in choosing what they access. ‘Echo chambers’ centre on the reinforcement of individual perspectives and on group polarisation, which makes it easy for people to become extreme. All of these are bad for developing individual critical thinking, so people should use recommendation algorithms with caution and be active in seeking out different kinds of information. Ultimately, people need the ability to evaluate information and to avoid being misled by biased content, so that they can strengthen their critical thinking.

Reference List

Abbate, J. (2017). What and where is the Internet? (Re)defining Internet histories. Internet Histories, 1(1–2), 8–14. https://doi.org/10.1080/24701475.2017.1305836

Allred, K. (2018). The causes and effects of "filter bubbles" and how to break free. Medium. https://medium.com/@10797952/the-causes-and-effects-of-filter-bubbles-and-how-to-break-free-df6c5cbf919f

Amrollahi, A. (2021). A Conceptual Tool to Eliminate Filter Bubbles in Social Networks. Australasian Journal of Information Systems, 25, 1–16. https://doi.org/10.3127/ajis.v25i0.2867

Barnhart, B. (2021). Everything you need to know about social media algorithms. Sprout Social. https://sproutsocial.com/insights/social-media-algorithms/

GCFGlobal. (2023). How filter bubbles isolate you. https://edu.gcfglobal.org/en/digital-media-literacy/how-filter-bubbles-isolate-you/1/

Henshel, R. L., & Johnston, W. (1987). The Emergence of Bandwagon Effects: A Theory. The Sociological Quarterly, 28(4), 493–511. https://doi.org/10.1111/j.1533-8525.1987.tb00308.x

Ledwich, M., Zaitsev, A., & Laukemper, A. (2022). Radical bubbles on YouTube? Revisiting algorithmic extremism with personalised recommendations. First Monday, 27(12). https://doi.org/10.5210/fm.v27i12.12552

Molina, J. (2023). The bandwagon effect on social media in shaping public opinion. Medium. https://medium.com/@jafr.molina.au/https-www-adcocksolutions-com-post-the-bandwagon-effect-4f18627d2350

Reed, A., Whittaker, J., Votta, F., & Looney, S. (2021). Radical Filter Bubbles: Social Media Personalisation Algorithms and Extremist Content. Royal United Services Institute (RUSI). http://www.jstor.org/stable/resrep37297

Slattery, J. G. (2014). The information cocoon. The Harvard Crimson. https://www.thecrimson.com/article/2014/3/5/harvard-information-cocoon/

Sunstein, C. R. (2006). Dreams and nightmares. In C. R. Sunstein (Ed.), Infotopia: How many minds produce knowledge (pp. 3–19). Oxford University Press.

TED-Ed. (2017, July 7). How to spot a misleading graph – Lea Gaslowitz [Video]. YouTube. https://youtu.be/E91bGT9BjYk?si=9zPfh-Hmj7insUsc