Return me to a Fair and Clean Digital Platform!

Bullying, harassment, violent content, hate speech, pornography and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

“Cross Media Platforms Chart” by Gary Hayes is licensed under CC BY-NC-ND 2.0.


In human society, information is expressed in many forms, and we call these expressions media. The revolutionary innovations brought about by mobile communication and Internet technology are acting like a catalyst on people’s daily lives. Information that is stored, processed and disseminated by computers is digital media, and as society moves into the digital era, platforms are developing rapidly and vast quantities of information circulate across them. Digital media platforms produce enormous amounts of information every second, and the absence of any threshold for access accelerates its spread.

In the era of traditional portals, website editors did not hold editorial rights; the news content that appeared on a site was mainly reproduced from partner media that did. Even so, those sites employed large numbers of editors trained to professional journalistic standards who could control content strictly. With the arrival of digital media platforms, this model changed dramatically. Content distribution platforms began to de-editorialize, content production entered a crowdsourcing stage, and platforms transferred the editorial production of content to new media practitioners, which means that anyone can publish information. These publishers are a mixed bag, ranging from professional traditional media workers to newly minted self-publishers who do as they please. While digital media platforms may seem utopian, allowing more people to communicate directly and become online friends, the potential for immeasurable harm is also present, in the form of bullying, harassment, violent content, hate speech, pornography and more (Gillespie, 2018, p. 5).
Given the real atrocities that regularly occur on social media, one of the biggest challenges facing platforms is to create and implement a content moderation mechanism that navigates between these two extremes. Platforms are not merely a medium for public discourse; they are an integral part of it. They do not simply host public discourse; they shape it. For this reason, more and more people, having been unfairly affected by platform decisions, are calling for regulatory mechanisms. Where regulators believe that “self-regulation” is effective on digital media platforms, they often develop policies to complement it or give it legal force (Gillespie, 2018, p. 6).


Can large institutions be used to regulate digital platforms?

The key to solving this set of problems is regulatory tools. But before we discuss how to regulate digital platforms, we need to understand that they are not “content producers”; rather, they host, store, organize and distribute the content of others. Social media platforms are a bridge not only between users, and between users and the public, but also between citizens and law enforcement, policymakers and regulators. Platforms need not only to create and improve their own rules, but also to develop tougher means of enforcing them (Gillespie, 2018, p. 256). Today, the primary job of regulating the Internet falls to large technology and telecommunications companies, which must comply not only with the laws of their home countries but also with the laws of other countries in which they have substantial commercial interests and assets. Policies adopted in the past for broadcasting, telecommunications and the press are often not applicable to contemporary media and communications. From “fake news” and its political implications to revenge porn, extremist content, cyberbullying, online harassment and hate speech, there is a growing public expectation that digital platforms be held accountable to the public interest, particularly because “these companies are increasingly monitoring, regulating and removing content, and restricting and blocking some users”. For example, Flew, Martin and Suzor (2019) describe the European Commission imposing a fine of €2.4 million on Google for failing to promptly remove content deemed illegal or hate speech; that fine can be read as a punishment for inadequate moderation. It is now commonplace for platforms to remove content deemed offensive or harassing to users, and this is considered one means of platform regulation.
In addition, 70% of Australian adults believe that online hate speech is spreading, and most believe more should be done to stop its growth, whether through the introduction of new laws (71%) or through action by social media companies (78%). They expect platforms to regulate hate speech targeting gender and religion and to remove it, or filter users away from it, as quickly as possible; the vast majority of Australians support action to curb the spread of hate speech online (eSafety Commissioner, 2018, p. 6).

“No violence no hate speech” by Faul is licensed under CC BY 2.0.

Regulating digital platforms from a micro perspective.

Moreover, relying on platforms’ self-regulatory mechanisms alone is far from enough. When faced with regulation, platforms often frame themselves as open and impartial, partly because their founders fundamentally believe they are, and partly to avoid obligations or responsibilities. At the same time, platforms are under pressure from the public and from governments to control speech. In practice, however, platforms do not have the capacity to take on so many responsibilities alone. So we have to draw on the power of the community: community managers (volunteers) are also important, and they need to develop forms of governance that protect the community and embody democratic processes matching their own values and those of their users (Gillespie, 2018, p. 264).


Meta-Regulation and Self-Regulation.

Furthermore, the traditional view of regulation emphasizes two opposing conditions: freedom and control. Although this view places the government in the role of regulator, that role can also be assumed by non-governmental standard-setting bodies, industry trade associations or community managers. Hutter (as cited in Coglianese & Mendelson, 2012, p. 4) defines meta-regulation as “state oversight of self-regulatory arrangements”. Platform operators sometimes neglect moderation when it would be helpful and avoid it when it makes them nervous: when platforms decide how to deal with speech, they decide on the basis of their own circumstances, and a smaller platform will often choose lax regulation in order to retain users. Regulation should therefore include new liability principles tailored to social media platforms, rather than replicating laws designed for Internet service providers and search engines. Self-regulation, for its part, can include any rules imposed by non-governmental actors, or rules created and enforced by the regulated entities themselves. Sinclair (as cited in Coglianese & Mendelson, 2012, p. 4) describes self-regulation as a form of regulation that “relies heavily on the goodwill and cooperation of individual firms to achieve compliance”.



In short, platform moderation is slowly becoming the familiar and accepted way of handling user-generated content, a mundane feature of the digital cultural landscape. In this day and age, genuinely valuable content (text, video, articles and everything else presented) is gradually being pushed out of mainstream public view, and most of the content people now consume is carefully designed branded marketing by businesses. Without moderation, platforms not only cannot survive; they cannot be platforms at all. Moderation has been there from the beginning, and always will be.



Coglianese, C., & Mendelson, E. (2012). Meta-regulation and self-regulation. In R. Baldwin, M. Cave, & M. Lodge (Eds.), The Oxford Handbook of Regulation. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199560219.003.0008


eSafety Commissioner. (2018). Online hate speech: eSafety research.


Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50.


Gillespie, T. (2018). Governance of and by platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254–278). SAGE Publications.


Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press.