Bullying, harassment, violent content, hate, porn and other problematic content circulates on digital platforms. Who should be responsible for stopping the spread of this content and how?

Social networking sites give users new opportunities to communicate and interact with a wider range of people. While the advantages are numerous, so are the dangers: every day, a significant amount of offensive, violent, abusive, unlawful, and discriminatory content is posted on these sites (Gillespie, 2018, p. 254). This essay discusses who should take action when questionable content appears on digital platforms: users, the platforms themselves, governments, or other entities. Unlike traditional media, digital platforms operate in a larger and more open community, and responsibility for them involves many other institutions in the field (Dutton, 2009, p. 2). The essay will also discuss the actions that must be taken to prevent this sort of offensive content from emerging.

Who should take responsibility?

Google+ by Magnet 4 Marketing dot Net, licensed under CC BY 2.0.

First, governments need to be held responsible for offensive online content. The relevant laws and regulations are not yet complete, legal boundaries are not clear enough, and some platforms exploit loopholes to avoid punishment. Several scholars have advocated for governments and other regulators to intervene in the emerging platform society, rather than leaving its development to market dynamics (de Kloet et al., 2019, p. 250). The growing amount of content on the Internet does not only mean more traffic; it also signals a significant content risk lurking within it. Citizens are at risk, and governments cannot simply ignore them.

However, it is worth considering that social media is not regulated uniformly, because countries differ in their legal systems and cultures. For example, in 2010 Google withdrew from the Chinese market: after reporting that it had been the target of highly sophisticated cyber attacks, Google refused to keep censoring its search results as the Chinese authorities required, and so it left the market. In other countries, by contrast, Google faced no charges or repercussions. This is the result of regulatory differentiation. Whether a company is designated as “global” or “US” affects the regulatory regimes, including taxation, with which it must comply, and the ownership status of a website can also shape its economic transactions and the social interactions of its users (van Dijck et al., 2018, p. 10).

Blue Whale by Lakpura LLC, licensed under CC BY 2.0.

Secondly, social platforms should also be responsible for problematic content. In the early days of Internet content management, the broad and conditional safe harbor was extremely favorable to online platforms from a legal standpoint (Gillespie, 2018, pp. 258-259): under safe harbor provisions, a platform is only a technical service provider and is not liable for the dissemination of illegal or irregular information. Yet as technology advances, more people can use social media platforms, and users are becoming younger. Faced with such a large number of users, social media companies should also take responsibility and screen out inappropriate content. Several studies have suggested that harmful social media content may have played a role in the recent significant increase in adolescent suicide rates and depressive symptoms (Khasawneh et al., 2020, p. 2).

The Blue Whale Challenge first appeared on the Internet in 2015. Teenagers in a community complete the challenge by performing tasks that harm them to varying degrees every day, until the final task of suicide. The challenge has been linked to nearly 130 children committing suicide in Russia, and the rapid spread of the Internet carried it to other countries, where it likewise led to teenage self-harm or suicide. It was not until late 2016 that the game’s founder, Philipp Budeikin, was arrested and imprisoned, and although the creator is said to be in jail, the challenge continues in some corners of the Internet. The lack of regulation of content on social platforms allowed this material to be created and to keep spreading.

However, for social platform companies, the sticking point is the vetting process. On YouTube, it is estimated that users upload more than 400 hours of video content every minute of every day (Flew et al., 2019, p. 41). Such a proliferation of content easily overloads the review mechanism, and classifying user-uploaded material into acceptable or rejected categories is a complex process that goes far beyond the capabilities of software or algorithms alone (Roberts, 2019, p. 34).

What should we do?

China Internet Police by isriya, licensed under CC BY-NC 2.0.

Governments should enact laws that address the proliferation of harmful information on platforms, holding both platforms and individuals accountable. For example, in 2009 Russian law made website owners responsible for comments posted by users on their websites (Gillespie, 2018, p. 261). At the same time, governments could follow China’s example and create an additional role within the public security department, the internet police, who specialize in finding harmful content online and imposing penalties. Beyond unilateral restrictions, governments should also cooperate with social networking platforms: introduce regulations that promote civil online communities, require platforms to strengthen their handling of user reports and complaints, and support enforcement through information tagging and tracking, proactive detection of illegal content, technical blocking, and third-party monitoring, so that hate, pornography, and other undesirable information can be eliminated. However, when vetting objectionable content, governments and social media platforms must also consider the psychological harm that can be done to reviewers: such work alternates between mind-numbing repetition and exposure to images and material that can be violent, disturbing, and, in the worst case, psychologically damaging (Roberts, 2019, p. 38).

At the same time, schools should also play a role. The use of social media is becoming increasingly popular among teenagers around the world because of its interactive nature (Mulisa & Getahun, 2018, p. 295). To avoid a recurrence of the Blue Whale Challenge, schools should offer courses that guide students to use the Internet properly and prevent teenagers from blindly sending and receiving undesirable messages. Emotionally, the widespread use of social media can reduce peer interaction, leave users feeling lonely, and distort adolescents’ self-image (Błachnio et al., 2016, p. 27).

In conclusion, the emergence of undesirable content on the Internet is not a problem for any single institution. The rapid development of the Internet allows people to receive information faster and learn about developments around the globe, but at the same time it weakens users’ ability to judge the information they encounter, and disturbing content can cause indelible psychological damage or physical harm. As official authorities and media intermediaries, governments and platforms should take timely measures to block the appearance of such content. It is crucial to remember, however, that the public’s concerns about the flow of material have contributed to some of today’s regulatory challenges: the political concerns surrounding the effects of monitoring, the unrestricted access that third parties have to data, and the erosion of individual privacy all need to be considered (Schlesinger, 2020, p. 1558). Nor are governments and social media platforms the only actors that can take measures. Schools and other public welfare organizations can also join in, guiding people to use the Internet properly and offering psychological support to users who have encountered undesirable content.

Reference List

Adeane, A. (2019, January 13). Blue Whale: What is the truth behind an online “suicide challenge”? BBC News. 1/10/2022. https://www.bbc.com/news/blogs-trending-46505722

Bang, X. (2021, January 30). Google pulled its service from China more than a decade ago—Can Australia learn from that? ABC News. 6/10/2022. https://www.abc.net.au/news/2021-01-30/google-leave-australia-what-to-learn-from-china-legislation-law/13102112

Błachnio, A., Przepiorka, A., Boruch, W., & Bałakier, E. (2016). Self-presentation styles, privacy, and loneliness as predictors of Facebook use in young people. Personality and Individual Differences, 94, 26–31. https://doi.org/10.1016/j.paid.2015.12.051

China Real Time Blog. (2015, January 6). China’s Internet Police Step Out of the Shadows. The Wall Street Journal. 2/10/2022. https://www.wsj.com/articles/BL-CJB-26989

de Kloet, J., Poell, T., Guohua, Z., & Yiu Fai, C. (2019). The platformization of Chinese Society: Infrastructure, governance, and practice. Chinese Journal of Communication, 12(3), 249–256. https://doi.org/10.1080/17544750.2019.1644008

Dutton, W. H. (2009). The Fifth Estate Emerging through the Network of Networks. Prometheus, 27(1), 1–15. https://doi.org/10.1080/08109020802657453

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1

Gillespie, T. (2018). Regulation of and by Platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE handbook of social media (pp. 254–278). SAGE Publications.

Khasawneh, A., Chalil Madathil, K., Dixon, E., Wiśniewski, P., Zinzow, H., & Roth, R. (2020). Examining the Self-Harm and Suicide Contagion Effects of the Blue Whale Challenge on YouTube and Twitter: Qualitative Study. JMIR Mental Health, 7(6), e15973. https://doi.org/10.2196/15973

Mulisa, F., & Getahun, D. A. (2018). Perceived Benefits and Risks of Social Media: Ethiopian Secondary School Students’ Perspectives. Journal of Technology in Behavioral Science, 3(4), 294–300. https://doi.org/10.1007/s41347-018-0062-6

Roberts, S. T. (2019). Understanding Commercial Content Moderation. In Behind the Screen (pp. 33–72). Yale University Press. https://doi.org/10.12987/9780300245318-003

Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7–8), 1545–1563. https://doi.org/10.1177/0163443720948003

van Dijck, J., Poell, T., & de Waal, M. (2018). The Platform Society as a Contested Concept. In The Platform Society: Public Values in a Connective World. Oxford University Press. https://doi.org/10.1093/oso/9780190889760.003.0002