Who should regulate digital platforms and how to block harmful content

Self-regulation of the platform

"michael-nuccitelli-stop-online-child-pornography" by iPredator is marked with CC0 1.0.

Online digital platforms give people opportunities to interact socially, but they also carry risks. Bullying, harassment, violent and pornographic material, and other problematic content spread across digital platforms and harm users. To manage these platforms, specialized regulatory bodies should be established to oversee them, and platform regulators need to develop rules that govern the platforms and mitigate their negative impacts.

“Stop-Bullying-Concept” by freepik is licensed under CC BY-ND 2.0

Government alone cannot control such a complex network. This produces a lag in official responses to undesirable phenomena on digital platforms, which damages the government’s credibility. With the rapid growth of self-media, information now spreads through more and more channels, making regulation even more difficult.



To limit the spread of harmful information on the Internet as much as possible, the relevant supervision departments need to pay closer attention to technical controls and invest in training network technicians, which will allow them to regulate the Internet more effectively. Digital platforms also need to self-regulate and govern the use of platform content.


“avoiding-internet-scams1-caution-laptop.s600x600” by robertsrr08 is licensed under CC BY-NC-SA 2.0.

As understood in the draft bill and other attempts at platform liability, platforms are responsible because they provide the service of hosting users’ content. If posted content is dangerous, platforms are liable because, in providing the service, they disseminate harmful content. Attempts at platform regulation, including the Online Safety Act, build on this understanding by regulating the services platforms provide (Price, 2021). Platforms are treated as distributors of user-generated content. To guarantee that platforms are not automatically liable for the information they host, and to promote freedom of expression, responsibility is instead traced to the users who provide the content, with platforms acting as intermediaries (Medzini, 2022).


Self-regulation of platforms: the foundation and primary principle

The focus of platform self-governance is on user rights and preventing the spread of harmful information. Price (2021) regards the platform as the owner of the online space: it manages the space’s structure and design and, as owner, bears responsibility for online hazards. The Online Safety Act should recognize the extent to which platforms contribute to online hazards and should impose a duty of care to ensure that platforms fulfill their responsibility to provide safe spaces online. Given the huge scale of platforms, the government’s regulatory capacity is insufficient, and self-regulation is considered cheaper and more effective than industry-level rules. Managers can further benefit from corporate self-regulation because it allows them to improve their monitoring and enforcement mechanisms by supporting compliance and legal teams, and to better adhere to public values and address social concerns through reliance on auditors, contractors, and other first-party regulatory intermediaries (Medzini, 2022). Platforms handle many issues and disputes daily, and when hateful or violent content appears, delayed handling can have a huge impact on the platform and on society.

Therefore, only self-regulation by the platform, which takes on the functions of coordination and supervision, can ensure the stable development of digital platforms and help protect users’ rights. Facebook, as a rule-maker, delegates responsibility upwards to regulatory intermediaries; the enhanced self-regulatory framework addresses this duality by focusing on the process of delegating responsibility to regulatory intermediaries and by tracking its impact on content governance (Medzini, 2022). Cooperative enterprises can also support their rules with formal and informal enforcement mechanisms and establish self-regulatory associations or institutions that combine the governmental function of regulation with private legal structures and interests (Black, 1996: 28, as cited in Medzini, 2022). Wagner (2018) claims that dominant digital platforms create their own regulatory and contractual frameworks, effectively insulating themselves from external interference and globalizing a specific set of corporate speech norms (Chin et al., 2022). Targeted laws are still needed to address problematic content on digital platforms, such as pornography and violence.



“flickr and facebook” by ansik is licensed under CC BY-NC 2.0.

Approaches to regulation

What can digital platforms do?

  • Implement real-name authentication, requiring users to hold a valid ID when posting messages on the platform. This will, to some extent, discourage the casual posting of harmful material and effectively reduce its impact.


  • Platform operators such as Facebook host social media pages for news organizations and have established systems for managing acceptable content, operating within the confines of existing laws and their own transnational and commercial interests. Their regulation depends on trust and safety teams that develop codes of conduct and community guidelines (Riedl et al., 2021).

  • Regulatory platforms create databases of blocked words for undesirable information, and administrators identify harmful information, remove it, and warn the posters. Studies show that moderation can be done in various ways: engaging in conversations, performing pre- or post-moderation, closing debates for a specific period of time, limiting discussion topics, modifying or deleting texts, or expelling users who violate the rules (Kalsnes & Ihlebæk, 2021).
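The block-word approach described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the word list, function name, and labels are invented for the example); real platforms combine such lists with machine-learning classifiers and human review.

```python
# Minimal sketch of keyword-based moderation (hypothetical example).
# A post containing any blocked word is flagged for removal; otherwise it is published.

BLOCKED_WORDS = {"examplehate", "exampleslur"}  # placeholder block list

def moderate(post: str) -> str:
    """Return 'removed' if the post contains a blocked word, else 'published'."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    if words & BLOCKED_WORDS:
        return "removed"  # would also trigger a warning to the poster
    return "published"

print(moderate("Hello everyone!"))       # published
print(moderate("This is examplehate."))  # removed
```

Run before publishing, this acts as pre-moderation; run on reported posts, the same check supports post-moderation, matching the two modes Kalsnes and Ihlebæk describe.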


The Facebook Oversight Board was established for this purpose. Its role is to help Facebook answer difficult questions about freedom of expression online, including what content to remove and what to keep. The Oversight Board is composed of experts from around the world who select the content decisions to be reviewed from user complaints and from cases Facebook submits to them, and its decisions are binding. When the Capitol was attacked on January 6, 2021, Trump posted a short video on social media in which, although ostensibly calling for the violence to stop, he praised the mob and repeated his unfounded claim that the 2020 presidential election was fraudulent. Several platforms, including Facebook, removed it, with Facebook’s vice president Guy Rosen explaining that the video “contributes to rather than diminishes the risk of ongoing violence.” Facebook then blocked Trump from posting new content and said the block would last at least until January 20, when his term ended.


“20110428 Trump” by Chris Piascik is licensed under CC BY-NC-ND 2.0

Digital platforms are not only a medium of communication but also an influence on society. By implementing self-regulation and making rules to restrain and manage content, platforms can effectively stop the spread of harmful information and maintain a safe and orderly network. Regulatory platforms can use technical means to remove content that endangers the state or society or violates personal privacy, and legislators can introduce laws and regulations that strictly define the line between freedom of expression and criminal acts such as violence, using clear legal provisions to deter and stop harmful incidents.



“Jump on the social media bandwagon” by Matt Hamm is licensed under CC BY-NC 2.0.






Reference list


Price, L. (2021). Platform responsibility for online harms: towards a duty of care for online hazards. The Journal of Media Law, 13(2), 238–261. https://doi.org/10.1080/17577632.2021.2022331

Medzini, R. (2022). Enhanced self-regulation: The case of Facebook’s content governance. New Media & Society, 24(10), 2227–2251. https://doi.org/10.1177/1461444821989352

Chin, Y. C., Park, A., & Li, K. (2022). A comparative study on false information governance in Chinese and American social media platforms. Policy and Internet, 14(2), 263–283. https://doi.org/10.1002/poi3.301

Riedl, M. J., Naab, T. K., Masullo, G. M., Jost, P., & Ziegele, M. (2021). Who is responsible for interventions against problematic comments? Comparing user attitudes in Germany and the United States. Policy and Internet, 13(3), 433–451. https://doi.org/10.1002/poi3.257

Kalsnes, B., & Ihlebæk, K. A. (2021). Hiding hate speech: political moderation on Facebook. Media, Culture & Society, 43(2), 326–342. https://doi.org/10.1177/0163443720957562