Is it necessary for the government to spend lots of energy on media content moderation?

"Content" by Nick Youngson is licensed under CC BY-SA 3.0, via Alpha Stock Images


By 2023, social media platforms had reached a remarkable level of development and popularity worldwide. They bring great convenience to personal, political and social life, yet the threshold for using them is extremely low. In "A Declaration of the Independence of Cyberspace", Barlow (2019) argued that in the world of the Internet, anyone can speak freely without fear of constraint. But when everyone can create platform content, that content inevitably begins to drift beyond the platform's control. As a result, the Internet fills with all kinds of unhealthy material, such as pornography, obscenity and graphic violence (Gillespie, 2018), and it offers opportunities to people with ulterior motives to commit crimes. It therefore falls to platform moderators to screen or delete undesirable content, including text, videos and images, in order to maintain a healthy online environment. Because the volume of material is so large, online moderation has become an industry in its own right, and large social media companies hire teams of reviewers to check whether creators' posts comply with platform rules.

At the same time, dissenting voices argue that content moderation infringes on the right to free speech. The line between free speech and moderation is genuinely difficult to draw: it is hard to protect users from harmful content while preserving open expression. Moreover, although platforms publish clear review guidelines, cultural and value differences mean that a topic considered sensitive or even offensive in one region may be unremarkable in another. This is one of the biggest challenges platforms face. To date, the necessity of content moderation remains a controversial issue.

This essay analyses the advantages and disadvantages of content moderation in order to argue that it is necessary for the government to devote more energy to regulating social media content. The moderation system is admittedly imperfect: restricting content drives some users away, which hurts media companies. But the consequence of not restricting content is severe oppression of vulnerable groups, such as discrimination against women and the exclusion of ethnic minorities, and these consequences do far greater damage to social harmony.

Side effects of media content moderation

It is true that strict content moderation creates economic and operational problems for social media companies. As noted above, large platforms such as Facebook, Instagram and Twitter set up dedicated departments to manage user content. Automated review by artificial intelligence is not yet mature and often produces misjudgments, so in most cases companies still rely on human reviewers, which adds considerable labour costs. Even then, negative content can never be eliminated from a platform entirely. Moreover, because many people value free speech, censorship can make them feel watched and inhibited. Interactive platforms depend heavily on user-generated content to grow and monetise; once users feel uncomfortable with their experience, they are likely to abandon the platform or move to one without content restrictions, costing the platform its chance to profit (Burgess, Marwick & Poell, 2018). The Chinese government's control of media content is a case in point: Chinese users are "monitored" to varying degrees on every social platform, and even slightly sensitive or non-mainstream comments are deleted (de Kloet, Poell, Guohua & Yiu Fai, 2019). This clearly falls short of the rights that free speech should guarantee citizens, and it is one of the drawbacks of content moderation. But it does not follow that the government should not regulate media content, because the advantages of moderation outweigh the disadvantages.

Racial discrimination

If nobody manages media content, racist comments will make racial segregation more serious. People of colour are sometimes perceived by employers and colleagues as less competent than white men in the workplace (Flew, Martin & Suzor, 2019). Race-based bullying at work is common: although people of colour are no less capable than white men, their efforts and achievements are still often overlooked. On the Internet it is common to see men of colour lashing out at white men or at their employers over unequal treatment at work. What begins as a conflict between two individuals ferments online into a conflict between two races, and over time antagonism between them gradually hardens.


Gender discrimination

If nobody manages media content, women will suffer greatly from misogynist comments. There has always been a bias against women on the Internet, and algorithms reproduce that bias: just as with racism, women are often perceived as less competent than men in every respect. Even Google, the world's largest search engine, has surfaced disrespect for women in its search suggestions: a search for "women" once returned the highly sexist suggestion "women can't drive" (Noble, 2018). Beyond their abilities, society also stereotypes women's roles; search the Internet for reports and news about women and the recurring image is the housewife who raises children and looks after her husband. The reach and impact of such extreme opinions in an engine with Google's daily traffic is immeasurable. Under this oppression, vulnerable groups have no way to voice their inner feelings, let alone change their image in the eyes of the public. Burying the value of women is not only a loss for society but a tragedy for humanity. This is where the importance and necessity of content moderation shows itself: curbing negative messages at the source effectively prevents their further spread.

“Justice Gavel” by toridawnrector is licensed under CC BY-SA 2.0


To sum up, while the Internet brings convenience, it also brings potential problems. No country or company has yet managed to strike the balance between free speech and content moderation, and the governance of social media platforms still needs adjustment through many kinds of effort. In any case, for the sake of vulnerable groups, it is necessary for the government to participate in media content control. If media companies alone supervise their users, they may relax regulation out of self-interest, which is why government intervention is required (Sample, 2019). Media companies, in turn, have a responsibility and obligation to cooperate with government agencies in monitoring the online environment. A healthy, positive Internet helps sustain people's physical and mental wellbeing; without moderation, it would be flooded with racist, sexist and other toxic comments. Protecting vulnerable groups is therefore the central purpose and benefit of content moderation. Ultimately, a harmonious Internet environment cannot be achieved by one person or institution alone; it requires mutual restraint among audiences, platforms and governments.


Barlow, J. P. (2019). A declaration of the independence of cyberspace. Duke Law & Technology Review, 18(1), 5–7.

Burgess, J., Marwick, A. E., & Poell, T. (Eds.). (2018). The SAGE handbook of social media. SAGE Publications.

De Kloet, J., Poell, T., Guohua, Z., & Yiu Fai, C. (2019). The platformization of Chinese society: Infrastructure, governance, and practice. Chinese Journal of Communication, 12(3), 249–256.

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33.

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.

Luckman, S. (1999). (En)gendering the digital body: Feminism and the internet. Hecate, 25(2), 36–47.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Roberts, S. T. (2019). Understanding commercial content moderation. In Behind the screen: Content moderation in the shadows of social media (pp. 33–72). Yale University Press.