The Unintended Consequences of Online Personalization: A Deep Dive into TikTok’s Algorithm

Introduction

The overall effect of the internet has largely been positive (Leiner et al., 2009). In addition to facilitating instant communication, social media platforms provide opportunities for meaningful connection and interaction. TikTok is among the platforms whose effect on society has been particularly immense: by 2023, it had amassed more than 1.7 billion monthly active users. However, even as one recognizes the positive effects of platforms such as TikTok, it is crucial to acknowledge their damaging effects. Pushing users into information cocoon rooms is among the dangers that these platforms present. The purpose of this paper is to examine these cocoon rooms, using TikTok as a case study. In addition to conceptualizing the cocoon rooms, the paper examines how they manifest on TikTok, the effects that they have on users, and some of the remedies that can be implemented to confront them.

What Is an Information Cocoon Room?

An information cocoon room refers to a phenomenon in which individuals are exposed to only a narrow range of information, views, and perspectives (Yuan & Wang, 2022).

https://www.flickr.com/photos/29096601@N00/2920562020
“data slide” by bionicteaching is licensed under CC BY-NC 2.0.

When confined to information cocoon rooms, individuals lack access to diverse perspectives. Scholars have noted that this phenomenon is especially prevalent on social media platforms (Liu & Zhou, 2022). On these platforms, the formation of cocoon rooms is facilitated by algorithms that recommend content and information aligned with each user’s interests, tastes, and preferences (Castells, 2002; Noble, 2018). Thus, instead of exposing users to fresh information, the cocoon rooms reinforce the ideas and views that users already hold.
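To make this mechanism concrete, the sketch below shows a toy content-based recommender in Python. It is purely illustrative and does not describe TikTok’s actual system: every topic name, weight, and helper function here is invented. A user’s interaction history is summarized as topic weights, and candidate videos are ranked by how closely their topics match that profile.

```python
# A toy content-based recommender, purely illustrative (not TikTok's real system).
# A user's interaction history is summarized as topic weights, and candidate
# videos are ranked by how closely their topics match that profile.

from math import sqrt

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse topic-weight vectors."""
    dot = sum(weight * b.get(topic, 0.0) for topic, weight in a.items())
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(profile: dict, catalog: dict, k: int = 2) -> list:
    """Return the k videos whose topics best match the user's profile."""
    return sorted(catalog, key=lambda vid: cosine(profile, catalog[vid]), reverse=True)[:k]

# A user whose history is dominated by one political viewpoint.
profile = {"politics_viewpoint_a": 0.9, "comedy": 0.1}

catalog = {
    "clip_a1": {"politics_viewpoint_a": 1.0},
    "clip_a2": {"politics_viewpoint_a": 0.8, "comedy": 0.2},
    "clip_b1": {"politics_viewpoint_b": 1.0},  # the opposing perspective
    "clip_c1": {"science": 1.0},
}

print(recommend(profile, catalog))  # ['clip_a1', 'clip_a2'] -- more of the same viewpoint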

TEDx Talks. (2019). Challenge the echo chamber | Adam Greenwood | TEDxRoyalTunbridgeWells [Video]. YouTube. https://www.youtube.com/watch?v=UKyFL389qe8

The cocoon rooms described above can be likened to echo chambers and filter bubbles: in each case, users mainly encounter content that confirms what they already believe, while dissenting views are filtered out.


The Case of TikTok

TikTok is among the platforms that have become increasingly notorious for strengthening information cocoon rooms. According to Zhang (2021), the TikTok algorithm is designed to promote engagement; essentially, the platform is built to ensure that users spend as much time on it as possible. Data indicates a steady increase in the amount of time that users spend on TikTok. For instance, in the U.S., teenagers are estimated to spend as much as 99 minutes daily on TikTok, compared with 61 minutes on YouTube (Perez, 2022). TikTok’s explosive growth in user engagement suggests that its algorithm is highly effective at holding attention.

https://openverse.org/en-gb/image/255badbc-d072-4270-8a8c-866450c8d907?q=tiktok
“TikTok app” by Solen Feyissa is licensed under CC BY-SA 2.0.

TikTok has become a haven for the establishment of cocoon rooms. According to He et al. (2023), what has allowed these rooms to thrive on platforms like TikTok is the format in which content and media are packaged: the short-video format has enabled the platform to boost engagement, but it also restricts access to diverse perspectives and information, thereby reinforcing the information cocoon rooms. Feeding users content that matches their tastes, views, and preferences is the core of the platform’s personalized recommendation system.
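The feedback loop that this creates can be illustrated with a small simulation, again hypothetical rather than a description of TikTok’s real ranking: each round the feed surfaces the topics the user has engaged with most, watching the feed adds engagement to only those topics, and the profile keeps concentrating on the same material. The topic names and update rule below are invented for illustration.

```python
# A minimal simulation of a personalization feedback loop (hypothetical, not
# TikTok's actual algorithm): the feed surfaces the most-engaged topics, the
# user engages with what is shown, and that engagement is fed back into the
# profile, so topics outside the feed never get a chance to resurface.

from collections import Counter

TOPICS = ["news_a", "news_b", "sports", "science", "comedy"]

def simulate(rounds: int = 5, feed_size: int = 3) -> None:
    # Start with a slight preference for one news source.
    profile = Counter({topic: 1 for topic in TOPICS})
    profile["news_a"] += 1

    for r in range(1, rounds + 1):
        # The feed shows only the topics with the most accumulated engagement.
        feed = [topic for topic, _ in profile.most_common(feed_size)]
        # Watching the feed adds engagement to those topics alone.
        for topic in feed:
            profile[topic] += 1
        print(f"round {r}: feed = {feed}")

simulate()
# Topics that never make it into the feed never gain engagement, so the feed
# locks onto the same few topics round after round.
```

Even in this toy setting, whatever falls out of the feed in the first round never returns, which mirrors, at a much smaller scale, the narrowing of exposure that the cocoon-room literature describes.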

Dangers of Information Cocoon Rooms

Information cocoon rooms are harmful for a number of reasons. The sections below highlight some of the key dangers that these rooms present.

Lack of Diversity and Reinforced Biases: Among the ill effects of information cocoon rooms is that they reinforce the biases held by TikTok users. Scholars have already warned that the TikTok algorithm functions in a way that amplifies racial and political biases among users (Murray, 2021).

https://www.flickr.com/photos/34194250@N08/5149833928
“OpenOrd layout algorithm” by gephi_org is licensed under CC BY-NC-SA 2.0.

Citing a study that revealed clear racial biases in the algorithm that underpins TikTok, Heilweil (2020) describes how the algorithm leads users to follow accounts belonging to people of particular racial backgrounds. This is concerning because it suggests that the algorithm is deeply flawed and in need of an overhaul. In countries like the US that are plagued by racial tensions, the biases that the TikTok algorithm appears to promote are particularly dangerous, as they could inflame those tensions further and undermine efforts to bolster harmony.

Promoting Extremism and Prejudice: Another negative impact of the information cocoon rooms that are becoming deeply entrenched in TikTok is the promotion of extremism and prejudice. Concerns have been raised that TikTok is rapidly becoming a hotbed of extremism and hate. For instance, Weimann and Masri (2021) observed that antisemitism is spreading rapidly on the platform and attribute its spread largely to the platform’s algorithm: once a user is exposed to antisemitic content, the algorithm tends to recommend similar content, establishing a cycle that accelerates the broadcasting of hateful messages. Transphobia is another form of extremism that appears to thrive on TikTok. Little and Richards (2021) describe an experiment they conducted to reveal biases in the TikTok algorithm: after they interacted with transphobic content, the algorithm recommended similar content. The rise of extremism on TikTok presents a serious danger to marginalized populations, such as transgender individuals, who require protection.

Undermining Democracy: In addition to the negative outcomes outlined above, the information cocoon rooms on TikTok also appear to be undermining democracy by fanning the spread of misinformation and fake news. According to Kokas (2022), in addition to posing a threat to fundamental liberties, TikTok has become an immensely effective instrument for the dissemination of misinformation and outright falsehoods. Paul (2022) has likewise raised the alarm over the role that TikTok and other social media platforms play in allowing nefarious actors to weaken democratic institutions and processes by broadcasting lies. This erosion of public confidence in democracy can be attributed to the platform’s algorithm: as already noted, it tends to feed users information that reinforces their values and beliefs. Thus, if a user already believes that the electoral process has been compromised, they are more likely to be directed to content that amplifies this belief.

Promising Remedies

Fortunately, there are some promising interventions that various actors can institute to dismantle the information cocoon rooms that are becoming increasingly rampant on TikTok. Some of these solutions are discussed below:

Regulation: Government intervention may be necessary to tackle information cocoon rooms, and TikTok’s failure to take meaningful action highlights the need for it. In the US, there have been growing calls for the government to impose tougher sanctions on technology companies to prevent the spread of misinformation and fake news (Wamsley & Bond, 2023). At present, regulation remains limited, which may help explain why information cocoon rooms are still prevalent on TikTok. If TikTok fails to police itself effectively, governments should adopt a tougher approach toward the platform in order to safeguard users.

https://www.flickr.com/photos/140988606@N08/40506141403
“TikTok-unter-der-Lupe” by Christoph Scholz is licensed under CC BY-SA 2.0.

User Sensitization and Individual Action: Regulation may be slow to materialize. In the meantime, the onus is on individual users to take action to protect themselves against information cocoon rooms. Some of the specific steps that individual users can take include:

  1. Recognizing confirmation bias and the adverse impact that it may have on their online experience.
  2. Actively seeking diverse perspectives, and paying particular attention to neutral and objective information.
  3. Seeking to be part of diverse teams and groups that bring together individuals from various backgrounds who hold a wide range of views and perspectives (IESE Business School, 2021).

While fairly simple, the measures outlined above could help users shield themselves against cocoon rooms and broaden their thinking by exposing them to diverse viewpoints.

Conclusion

In closing, TikTok’s popularity, especially among young users, has been explosive, and it is difficult to deny the platform’s appeal. However, there is a need to recognize that TikTok carries real risks, and confining users to information cocoon rooms is among them. There is ample evidence that these rooms are flourishing on TikTok and that the platform is doing little to dismantle them. In addition to fueling the rise of hate and prejudice, the cocoon rooms are weakening public trust in democracy by enabling the spread of misinformation.


References

Castells, M. (2002). The culture of the internet. In The internet galaxy: Reflections on the internet, business, and society. Oxford University Press.

He, Y., Liu, D., Guo, R., & Guo, S. (2023). Information cocoons on short video platforms and its influence on depression among the elderly: A moderated mediation model. Psychology Research and Behavior Management, 16, 2469-2480. https://doi.org/10.2147/PRBM.S415832

IESE Business School. (2021). Avoiding Echo Chambers: 5 Strategies To Beat Confirmation Bias. Forbes. https://www.forbes.com/sites/iese/2021/06/16/avoiding-echo-chambers-5-strategies-to-beat-confirmation-bias/?sh=3c5c99e61267

Kokas, A. (2022). Why TikTok Is a Threat to Democracy. Journal of Democracy. https://www.journalofdemocracy.org/why-tiktok-is-a-threat-to-democracy/

Leiner, B. M., Cerf, V. G., Clark, D. D., Kahn, R. E., Kleinrock, L., Lynch, D. C., Postel, J., Roberts, L. G., & Wolff, S. (2009). A brief history of the internet. ACM SIGCOMM Computer Communication Review, 39(5), 22-31.

Liu, W., & Zhou, W. (2022). Research on solving path of negative effect of “information cocoon room” in emergency. Discrete Dynamics in Nature and Society. https://doi.org/10.1155/2022/1326579

Murray, C. (2021). TikTok algorithm error sparks allegations of racial bias. NBC News. https://www.nbcnews.com/news/us-news/tiktok-algorithm-prevents-user-declaring-support-black-lives-matter-n1273413

Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.

Paul, K. (2022). ‘We risk another crisis’: TikTok in danger of being major vector of election misinformation. The Guardian. https://www.theguardian.com/technology/2022/oct/24/tiktok-election-misinformation-voting-politics

Perez, S. (2022). Kids and teens now spend more time watching TikTok than YouTube, new data shows. TechCrunch. https://techcrunch.com/2022/07/13/kids-and-teens-watch-more-tiktok-than-youtube-tiktok-91-minutes-in-2021-youtube-56/

Wamsley, L., & Bond, S. (2023). U.S. is barred from combating disinformation on social media. Here’s what it means. NPR. https://www.npr.org/2023/07/05/1186108696/social-media-us-judge-ruling-disinformation

Weimann, G., & Masri, N. (2021). TikTok’s spiral of antisemitism. Journalism and Media, 2(4), 697-708.

Yuan, X., & Wang, C. (2022). Research on the formation mechanism of information cocoon and individual differences among researchers based on information ecology theory. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2022.1055798

Zhang, Y. (2021). Risk response in the new media age – Taking the accurate push of Tik Tok as an example. Advances in Social Science, Education and Humanities Research, 571, 614-620.