In the information age, the Internet's rapid development has made life far more convenient for many people. At the same time, however, online violence, pornography, bullying, and other undesirable content keep resurfacing as new technologies and modes of communication emerge, polluting online culture and jeopardizing the healthy growth of the Internet. Further consideration should therefore be given to who should be responsible for halting the dissemination of this undesirable content and what steps should be taken to do so. Strictly speaking, digital platforms, governments, and individuals all share responsibility for halting the spread of such harmful content.
Digital platforms are responsible for the distribution of undesirable content
Primarily, digital platforms bear responsibility for undesirable content because the platform itself has become a crucial factor shaping what information is displayed. According to Green and Le (2022), digital platforms are distinct from other types of businesses in that they act as market organizers, arranging the exchange of goods and services between bilateral or multilateral groups. In the context of society's accelerating digital transformation, digital platforms can, to a certain extent, set the rules by which other players conduct their online activities and govern the relationships between those actors (Green & Le, 2022). It is therefore crucial that platforms address the equity issues raised by this business model, namely how to balance the interests of society with those of the individual. TikTok, for instance, is a global hit as a platform for creating and sharing short videos; however, a BBC report found that TikTok had become a gathering place where minors in the UK received sexual innuendo and threats of violence, with large numbers of pornographic and sexually explicit comments appearing under some of the short videos shared by children (Silva, 2019). For the individual poster, pornography attracts traffic and attention; for the platform, although it gains revenue from that traffic, it is branded as vulgar and its reputation suffers.
What should digital platforms do?
However, how can digital platforms balance the interests of the individual with those of society and take responsibility for stopping the spread of this undesirable content? Helberger et al. (2017) emphasize that app operators should improve their real-name systems and audit-management mechanisms and, as appropriate, take disposal measures against publishers of undesirable information, such as warnings, restricting functions, suspending updates, and closing accounts, while keeping records and reporting to the relevant competent authorities. In general, under the Internet profit model where traffic is revenue, it is understandable that online platforms rely on increasing clicks and dissemination to gain revenue, but any profit-making behavior should respect the legal bottom line and public order and morality. If digital platforms fail in their responsibility to review and supervise and allow undesirable information to proliferate, then regulators should deal with the creators and disseminators of undesirable content in accordance with the law and make the platforms pay for their negligence.
“Social Media Icons on an iPhone 7 Screen” by Stacey MacNaught is licensed under CC BY 2.0. Retrieved from https://www.flickr.com/photos/staceycav/36814291390
Government’s regulatory responsibilities
Moreover, the government should assume greater responsibility for stopping the spread of undesirable content on the Internet. Mansell (2011) indicated that, as the volume and influence of undesirable content on the Internet grow in the data era, conflicts of interest are becoming more prominent and complex. Such content is not only misleading but also harmful to the stability and development of society and disrupts the social order. Therefore, the government should strengthen its efforts to regulate unhealthy content on the Internet, analyze and handle public-opinion events in a timely manner, and improve its own governance capacity (Mansell, 2011).
“Attorney Stacey Belisle of the McGraw Morris Lawfirm Discusses the Legal Ramifications of Social Media During the 2012 Michigan Local Government Management Association Winter Institute in East Lansing” by Michigan Municipal League is licensed under CC BY 2.0. Retrieved from https://www.flickr.com/photos/michigancommunities/6812430865
How can governments use their powers to prevent the dissemination of objectionable content?
However, how can the government use its authority to sanitize the Internet? Brown and Peters (2018) emphasise that upholding the rule of law, and improving and fully utilizing it, is the primary measure. A system of discretionary rules for online platform content control should be established as soon as possible, the substantive censorship standards for managing Internet material should be improved, and a mechanism for grading content censorship should be investigated (Brown & Peters, 2018). It is thus essential to further refine the procedural rules for administrative penalties and administrative compulsion. Furthermore, Flew et al. (2019) claim that government departments should be adept at cooperating with Internet platform enterprises and other market players and social organizations, integrating and sharing big-data resources, building synergistic linkage mechanisms, and exploring new modes of market regulation in the age of the Internet economy. For example, in July 2020 the UK Competition and Markets Authority, the Information Commissioner's Office, the Office of Communications, the Financial Conduct Authority, and others formed the Digital Regulation Cooperation Forum to promote information exchange and sharing (Schlesinger, 2022). The best regulation, however, prevents problems before they occur. Adhering to preventive regulation and urging corporate bodies to strengthen content review is the key to governance: the most effective way to tackle Internet chaos is for those publishing online content to strengthen review before publication so as to avoid disinformation, infringement, and security issues.
“5 Negative Effects of TikTok on Teens” by Smart Social. All rights reserved. Retrieved from https://www.youtube.com/watch?v=ksr7g-EVUUE&list=PPSV
Individuals, as publishers of content, are also accountable for the information they publish
Finally, individuals are accountable for blocking objectionable content on the Internet, since it is the actual members of society who disseminate information online and obtain information from it. Allyn (2022) reported that several TikTok content moderators filed a class action lawsuit against the TikTok platform and its parent company ByteDance, stating that the platform's roughly 10,000 content moderators were exposed to child pornography, sexual assault, and other extremely disturbing videos. The underlying reason individuals, as publishers of information, disseminate undesirable content is a disregard for laws and regulations and a lack of awareness that they may be breaking the law. It is also a reflection of psychological distortion and moral confusion.
What should individuals do?
However, how can individuals be held responsible for the content they publish on the Internet? Priority should be given to enhancing one's awareness of the rule of law. Young people are gradually becoming the main body of the vast population of Internet users, and this group is the direct beneficiary of Internet safety; they should therefore strictly abide by the relevant laws and ethics of the Internet, resist all kinds of Internet chaos, and help build a "green screen" from the ground up. Furthermore, individuals need to strengthen their own psychological-health education. The reality is that some people hold confused moral and political concepts and approach the undesirable information that appears in the online world with psychological misconceptions and curiosity, which inevitably has a serious impact on their mental health. Christensen and Griffiths (2003) claim that self-cultivation, self-monitoring, and self-restraint are the strong internal driving forces that enable individuals to consciously resist undesirable information on the Internet, to overcome undesirable ideological tendencies with a healthy desire for knowledge, and to improve their own moral status in order to enhance their "immunity".
In conclusion, preventing undesirable content on the Internet should be the shared responsibility of digital platforms, governments, and individuals. Digital platforms should improve their real-name systems and audit and management mechanisms, and do their part in review and supervision. Governments should improve laws and regulations, further refine the procedural rules for administrative penalties and administrative compulsion, and cooperate with Internet platform companies to build synergistic linkage mechanisms. Individuals, for their part, should enhance their own awareness of the rule of law and their mental health.
Allyn, B. (2022, March 24). Former TikTok moderators sue over emotional toll of “extremely disturbing” videos. NPR. https://www.npr.org/2022/03/24/1088343332/tiktok-lawsuit-content-moderators
Christensen, H., & Griffiths, K. (2003). The internet and mental health practice. Evidence-Based Mental Health, 6(3), 66–69. https://doi.org/10.1136/ebmh.6.3.66
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1
Green, L., & Le, V. T. (2022). Holding the line: Responsibility, digital citizenship and the platforms. In Digital platform regulation: Global perspectives on internet governance. Springer Nature.
Silva, B. M. (2019, April 5). Video app TikTok fails to remove online predators. BBC News. https://www.bbc.com/news/blogs-trending-47813350
Schlesinger, P. (2022). The neo‐regulation of internet platforms in the United Kingdom. Policy & Internet, 14(1), 47–62. https://doi.org/10.1002/poi3.288
Stojkovic, N. (2020). Hand on MacBook. Dark side of the internet. https://www.flickr.com/photos/nenadstojkovic/49810616086
Smart Social. (2020). 5 negative effects of TikTok on teens [Video]. YouTube. https://www.youtube.com/watch?v=ksr7g-EVUUE&list=PPSV
Helberger, N., Pierson, J., & Poell, T. (2017). Governing online platforms: From contested to cooperative responsibility. The Information Society, 34(1), 1–14. https://doi.org/10.1080/01972243.2017.1391913
Mansell, R. (2011). New visions, old practices: Policy and regulation in the Internet era. Continuum, 25(1), 19–32. https://doi.org/10.1080/10304312.2011.538369
MacNaught, S. (2017). Social Media Icons on an iPhone 7 Screen. https://www.flickr.com/photos/staceycav/36814291390
Brown, N. I., & Peters, J. (2018). Say this, not that: Government regulation and control of social media. Syracuse Law Review, 68, 521.