The Dilemma of Content Regulation: How Complex and Difficult Content Moderation Is!

“Like & follow US on all social media platforms! Linkedin >> Youtube >> Twitter >> https://goo” by We Are Social is licensed under CC BY 2.0


Nowadays, digital platforms are not only a tool for people to contact friends, but also a venue for distributing and sharing information, where people can publish whatever they like. As of July 2021, Facebook had more than 2.85 billion monthly active users (Statista, 2021), roughly two-fifths of the global population, which means the impact of such platforms is huge and unpredictable. As these platforms grow, more and more problems appear, and inappropriate content such as pornography, harassment, and hate speech is published alongside everything else. Content regulation therefore becomes necessary. However, content moderation is not an easy task, and it raises many problems and controversies.

What problems arise during content moderation?

  • Regulation Sequence

The first problem concerns the sequence of regulation: inappropriate content cannot be deleted or blocked immediately. Because of the huge scale of digital platforms, tens of thousands of posts are published every second.

As Shirky (2008, as cited in Gillespie, 2017) states, “Nearly all platforms must embrace a ‘publish-then-filter’ approach, which means user posts are immediately public, without review, and platforms can remove questionable content only after the fact.”

Image by Cea. is licensed under CC BY 2.0

The problem with this procedure is that disturbing posts still get released. Even if they are public only briefly, the unpredictable propagation speed of the Internet means criminal actions may occur and cause real harm. In addition, users who break the rules and are banned from posting can easily re-register an account and publish again, leading to an endless contest between these users and content moderators.
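The publish-then-filter model described above can be sketched as a minimal queue: posts go live immediately with no pre-screening, and a separate review pass removes flagged items only after the fact. This is a hypothetical illustration; all class and function names here are invented, not any real platform's API.

```python
# Minimal sketch of the "publish-then-filter" model: posts go live
# immediately, and moderation happens only after the fact.
# All names are hypothetical, for illustration only.

class Platform:
    def __init__(self):
        self.live_posts = {}      # post_id -> text, publicly visible
        self.report_queue = []    # post_ids flagged by users for review
        self._next_id = 0

    def publish(self, text):
        """Publish immediately, with no pre-screening."""
        post_id = self._next_id
        self._next_id += 1
        self.live_posts[post_id] = text
        return post_id

    def report(self, post_id):
        """Users flag content; moderators only see it afterwards."""
        if post_id in self.live_posts:
            self.report_queue.append(post_id)

    def moderate(self, violates_policy):
        """Review reported posts after publication and remove violations."""
        removed = []
        for post_id in self.report_queue:
            if post_id in self.live_posts and violates_policy(self.live_posts[post_id]):
                del self.live_posts[post_id]
                removed.append(post_id)
        self.report_queue.clear()
        return removed

platform = Platform()
ok = platform.publish("holiday photos")
bad = platform.publish("hate speech example")
platform.report(bad)
removed = platform.moderate(lambda text: "hate speech" in text)
# Note: the violating post was publicly visible from publish() until
# moderate() ran, which is exactly the window the text describes.
```

The key property of the sketch is the gap between `publish` and `moderate`: during that window the questionable post is already public, which is why even fast after-the-fact review cannot prevent harm entirely.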


  • A Double-edged Sword: Negative Effects on Revenue

Considering the economic aspect, content moderation may drive users away and thereby reduce profits. In the second quarter of 2021, Facebook’s total advertising revenue was about 28.5 billion dollars, while other revenue generated 497 million dollars (Statista, 2021). Advertising thus accounts for the vast majority of digital platforms’ income. Most platforms rely on user-generated content to create traffic that attracts more users and advertisers. However, Gillespie (2018) argues that platforms face a dilemma because content regulation is a double-edged sword.

Pay-per-Click – Author: Seobility – License: CC BY-SA 4.0

On the one hand, if there is too little regulation, users may leave to escape the toxic environment. Gillespie also believes that insufficient supervision lets pornographic, violent, or other disturbing content scare away advertisers, who do not want their products advertised alongside such posts. On the other hand, with excessive moderation, users may still leave, because they feel their freedom of speech is heavily restricted. As a result, it is difficult to judge the right degree of content regulation, and the loss of users and traffic will push advertisers toward other platforms where they can find more target consumers.


  • The Exploitation of Cheap Labor

The last problem is the exploitation of cheap labor. Automated content-filtering algorithms have limitations, and their high error rates create an urgent need for manual intervention. The complex process of classifying uploaded material into acceptable and rejected categories is far beyond the capability of software alone (Roberts, 2019).
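The limits of automated filtering can be illustrated with a deliberately naive keyword filter: it misses violations that are phrased without blocklisted words, and a bare keyword match is too ambiguous to auto-delete, so borderline cases must be escalated to human moderators. This is a hypothetical sketch, not any platform's actual system; the blocklist and example posts are invented.

```python
# Naive keyword-based filter, illustrating why automation alone fails.
# BLOCKLIST and the example posts are hypothetical, for illustration only.

BLOCKLIST = {"attack", "kill"}

def auto_moderate(text):
    """Return 'accept', or 'human_review' for a post."""
    words = set(text.lower().split())
    hits = words & BLOCKLIST
    if not hits:
        # No blocklisted word at all: the filter accepts the post,
        # even if it is actually a violation phrased differently.
        return "accept"
    # A bare keyword match is ambiguous: "kill the stuck process" is
    # harmless, so matches are escalated to a human rather than deleted.
    return "human_review"

# False negative: a threat with no blocklisted words slips through.
print(auto_moderate("I will hurt you"))          # accept (missed violation)
# Ambiguous: innocent technical usage of a blocklisted word.
print(auto_moderate("kill the stuck process"))   # human_review
```

Both failure modes in the sketch (missed violations and innocent text flagged for review) are why, as the text notes, manual intervention remains indispensable despite its human cost.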

Jee (2020) says that “The whole function of content moderation is farmed out to third-party vendors, who employ temporary workers on precarious contracts at over 20 sites worldwide.”

These content moderators do repetitive work every day, reviewing violations such as child abuse, violence, obscenity, and bloody videos or images, then deleting them manually.

Outsourcing.

They receive extremely low salaries while suffering severe psychological trauma. As Roberts (2019) maintains, human intervention is a key but obscure part of the regulation of online platforms that rely on user-uploaded content. Their work is indispensable, yet they are often ignored and their psychological problems have never received attention.




What Are the Controversies?

  • The Terror of War Issue

    Story Behind The Terror of War: Nick Ut’s “Napalm Girl” (1972). Source: YouTube


In addition to the problems mentioned above, there are also controversies in the implementation of content regulation. In September 2016, a reporter included a photo named The Terror of War in his article; a Facebook moderator then deleted the reporter’s post and banned him twice because the picture contained nudity of a minor.

After the photo was reinstated, Justin Osofsky, a Facebook Vice President, explained: “In many cases, there’s no clear line between an image of nudity or violence that carries global and historic significance and one that doesn’t. Some images may be offensive in one part of the world and acceptable in another” (Gillespie, 2017).

There is no doubt that nudity of minors is prohibited across global communities, but many people think that removing this photo was the wrong decision by Facebook. The picture, which shows a child who is completely naked and screaming in pain, may be deeply disturbing and may violate child-safety policy. However, the photo reveals the terror of chemical warfare to people all over the world and appeals for peace. Social media platforms should relax their policies for iconic, historical, and valuable images. Yet digital platforms serve the whole world, and different regions draw different lines in content regulation. For ‘violating’ content such as The Terror of War, the review criteria need to change, and even with a clear standard, the screening process remains difficult.


  • Marginalization & Fairness of Censorship

Another controversy concerns the marginalization of some groups and the fairness of censorship. According to Gillespie (2017), most employees of social media platforms are educated white men, and most of them are liberals or libertarians. A white-male-dominated censorship system may prioritize some content in an opaque way, leading to the neglect of minority groups’ views and cultures. Moreover, some values may be imposed on international user groups who hold different values.

“Reddit Alien Wordmap” by rich8n is licensed under CC BY-NC-SA 2.0

On Reddit, the most popular communities revolve around geek interests, including technology, science, pop culture, and games (Massanari, 2017), which are typically white-male-centered. In the case of Gamergate, moderators failed to remain neutral when regulating content, making the platform a hotbed for extreme cultures such as anti-feminism. Additionally, there was no protection for users who might be harassed. The administrators explained that they needed to maintain a neutral and fair platform to safeguard people’s right to freedom of expression (Massanari, 2017). Justice here looks more like non-interference, allowing extreme voices to spread on the platform, because platforms rely on users’ free content for profit while transferring responsibility to those users. This controversial neutrality can therefore amplify the content that moderators favor while ignoring the voices of other groups.


What Should Governments Do?

Finally, the government should play a greater role in restricting the content of social media. Under the Section 230 safe harbor, digital platforms are treated as providers of Internet or other network services rather than content producers, and are therefore not responsible for users’ posts (Gillespie, 2018). In October 2020, the CEOs of Facebook, Google, and Twitter attended a hearing of the US Senate, and they all opposed the repeal of Section 230. Digital platforms bear some responsibility for content review, but that is not enough, so government support is needed.

As the Reality Check team (2020) states, “social media platforms will face fines if they do not delete extremist content within an hour; or in China, there are hundreds of thousands of cyber-police, who monitor social media platforms and screen messages that are considered to be politically sensitive.”

But some government policies still need to be improved. In 2011, China’s government promulgated a real-name registration policy for Weibo users to reduce inappropriate speech. South Korea also implemented a real-name policy, but it took only four years from promulgation to abolition, because tens of millions of users’ personal details were leaked. Therefore, while strengthening Internet regulation, governments should also establish policies that protect users’ data, to ensure the healthy operation and development of social media.

Information Security. Source: Pixabay (digital-traffic-signs-security-579553). Pixabay License, free for commercial use.


Content moderation is a challenge, and the challenge grows as platforms develop. Balancing freedom of speech against violations is a difficult task that raises many problems and controversies. While digital platforms have their own policies, governments should also intensify supervision to create a better network environment.

Reference List:

Gillespie, T. (2017). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254-278). London: SAGE.

Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). Yale University Press.

Jee, C. (2020). Facebook needs 30,000 of its own content moderators, says a new report. Retrieved from:

Massanari, A. (2017). Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.

Roberts, S. T. (2019). Behind the Screen: Content Moderation in the Shadows of Social Media (pp. 33-72). Yale University Press.

BBC News Reality Check team. (2020). Social media: How do other governments regulate it? Retrieved from: /news/technology-47135058

Statista. (2021). Most popular social networks worldwide as of July 2021, ranked by number of active users (in millions). Retrieved from:

Statista. (2021). Facebook’s global revenue as of 2nd quarter 2021, by segment (in million U.S. dollars). Retrieved from:


The Dilemma of Content Regulation: How Complex and Difficult Content Moderation Is! by Charlotte He is licensed under a Creative Commons Attribution 4.0 International License.