
“Freedom of Speech” by A.MASH is licensed under CC BY-NC-ND 2.0.
Digital platform technology is constantly improving and becoming more widespread in citizens' lives, placing platforms at the center of users' daily routines. With this growth, however, platforms have given rise to a large amount of harmful content, such as pornography, hate speech, harassment, and violent material. The distribution power of digital platforms has allowed this undesirable content to spread unchecked, degrading the user experience, damaging young people's physical and mental health, and negatively affecting the stability of society.
Who is responsible for stopping it?
There are three forms of regulation that can prevent the spread of undesirable content on platforms: self-regulation, media regulation, and government regulation. These regulatory channels block harassment, violence, pornography, and other objectionable content. Stopping its spread is not only the responsibility of society and the government; citizens also have a duty to help monitor and report such content.
Self-regulation:
Self-regulation is one way to prevent the spread of such content. Self-regulation means that a platform has its own internal framework for regulating and managing what happens on the platform and for monitoring or controlling how users interact with its services. In practice, platforms self-regulate by removing unwanted content, monitoring users' behavior, and restricting some users' speech (Flew et al., 2019). Self-regulatory systems remain subject to government oversight but give platforms a limited degree of control of their own. By moderating and managing content, platforms can prevent violence, harassment, pornography, and similar material from appearing. Platform administrators maintain that self-regulation is adequate. However, relying on self-regulation alone is insufficient; Senator Graham, for example, argues that leaving platforms to continue self-regulating is a mistake (Flew et al., 2019). A platform's self-regulation is reflected in how it balances its own interests against external regulation, and moderation itself poses a further challenge for self-regulating platforms (Gillespie, 2018). If platforms rely on self-regulation alone, undesirable content will never be completely eradicated.
Social media platforms are reluctant to self-regulate and censor.
Twitter and Facebook have both been reluctant to self-regulate posts containing harmful content. By allowing false content to spread, Facebook not only fails to control content on its platform (Hyvönen, 2022); it seriously endangers the safety of its users and creates fear in society. Facebook's self-regulation targets speech that is detrimental to the platform itself, while content that benefits the company is retained. This approach takes no account of users, the community, or the government. During the US presidential election, supporters privately altered account credentials to support Donald Trump's candidacy, which affected US politics, and the government had to intervene in the resulting chaos. Platforms' self-regulation should be far more active.

Media Regulation:
Media regulation is another way to curb the dissemination of undesirable content. Under media regulation, specific communities and organizations provide a more structured regulatory framework, managing platforms to make them safer and more stable. At the same time, digital platforms retain their right to self-regulate under media regulation. Although the content users post on social media falls within their right to freedom of speech, it still must not endanger the safety of others. The Australian Competition and Consumer Commission has found that some digital platforms operate with opaque algorithms and even push advertisements containing undesirable content in the name of public and commercial interests, seriously undermining consumers' rights (ACCC, 2019). Media regulation is therefore a more effective and structured form of regulation than self-regulation.
Government regulation:
Government regulation approaches social media content from the state's perspective: state administrative bodies review relevant matters. It is stricter than self-regulation and media regulation, but government control can lack a public-interest perspective and easily leads to conflict (Schlesinger, 2020). From a regulatory standpoint, governments need to focus on the impact on society and on maximizing overall benefit. Governments or independent bodies regulate digital platforms through direct intervention, which means platforms are adjusted under government management. Government regulation can solve problems effectively and make society more stable: government administrators monitor platform content for violations and audit it for safety. Because of the danger to public health, government intervention tends to be more active than that of other organizations (Cusumano et al., 2021). This further ensures citizens' safety and controls how undesirable content can be distributed; when its reach is reduced, the amount of such content falls as well.
German Hate Speech Law
On unregulated platforms, some users employ hate speech to distress others. The German government has introduced new criminal law targeting hate speech on social media platforms (Glaun, 2021). This is the government's way of regulating platforms and dealing with hateful content that disrupts normal social order. German criminal law makes certain hate speech punishable by imprisonment; racist and anti-Semitic comments can carry sentences of up to five years in prison. The German government's control of undesirable content on digital platforms is comprehensive and goes a long way toward safeguarding citizens' safety.

How to prevent it?
The efforts and measures taken under self-regulation, media regulation, and government regulation are still insufficient given the challenges digital platforms face (Cusumano et al., 2021). From the user's perspective, reporting undesirable content when they encounter it can reduce the appearance of similar content. Digital platforms should strengthen the screening and censorship of content to protect users, acting as a barrier between harmful content and users. Gillespie argues that a completely open platform is a utopian fantasy: it is more important for platforms to protect their users while regulating content and maintaining order (Gillespie, 2018). The government seeks to maximize its interests while protecting its citizens' interests. Because the dissemination of such undesirable content can destabilize society, the government intervenes by enacting laws and regulations or otherwise restraining inappropriate content on digital platforms. Platforms, the media, and the government should all strictly control undesirable content. Preventing its spread requires not only the help of these organizations and the media but also dealing with it at its source. Beyond blocking such objectionable content, users should protect their own privacy on digital platforms and promptly report objectionable content to platforms and other organizations when they see it.
Conclusion:
In summary, the spread of undesirable content affects not only adults with independent minds but also minors. The spread of violence, harassment, bullying, pornography, and other objectionable content can be prevented through a system of self-regulation, media regulation, and government regulation. When destructive content appears on digital platforms, the platforms, the media, and the government all have a responsibility and an obligation to address the problem. Such content appears because some users maliciously seek to harm others, or for other reasons; when a platform wants to stop it, the most effective approach is to cut it off at the source, reduce its distribution, and censor it. Therefore, not only individuals but also the media, society, and the government should act to stop it in time.
Reference list:
Australian Competition and Consumer Commission. (2019). Digital platforms inquiry: Final report, executive summary. Canberra: ACCC, pp. 4-38.
Cusumano, M. A., Gawer, A., & Yoffie, D. (2021, January 15). Social media companies should self-regulate. Now. Harvard Business Review. https://hbr.org/2021/01/social-media-companies-should-self-regulate-now
Cusumano, M. A., Gawer, A., & Yoffie, D. (2021). Can self-regulation save digital platforms? SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3900137
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50. https://doi.org/10.1386/jdmp.10.1.33_1
“facebook website screenshot” by Spencer E Holtaway is licensed under CC BY-ND 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by-nd/2.0/?ref=openverse.
“Freedom of Speech” by A.MASH is licensed under CC BY-NC-ND 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by-nd-nc/2.0/jp/?ref=openverse.
Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1-23). Yale University Press.
Glaun, D. (2021, July 1). Germany's laws on hate speech, Nazi propaganda & Holocaust denial: An explainer. FRONTLINE. https://www.pbs.org/wgbh/frontline/article/germanys-laws-antisemitic-hate-speech-nazi-propaganda-holocaust-denial/
Hyvönen, A. (2022, September 27). Platforms of post-truth: Outlines of a structural transformation of the public sphere. E-International Relations. https://www.e-ir.info/2022/09/27/platforms-of-post-truth-outlines-of-a-structural-transformation-of-the-public-sphere
Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7–8), 1545–1563.
“The Law” by smlp.co.uk is licensed under CC BY 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by/2.0/?ref=openverse.