Digital Platforms and Content Moderation: An impossible problem?

Group 7, assignment 2

“Social Media Icons Color Splash Montage – Landscape” By Blogtrepreneur is licensed with CC BY 2.0.

What is content moderation? And why do digital platforms need content moderation?

Social media platforms arose out of “the exquisite chaos of the web” (Gillespie, 2018, p. 5). They were initially designed to build free communities where people could communicate with each other directly and share values. The reality, however, is often disappointing: platforms are awash with the violent, the obscene, the pornographic, the illegal, the abusive, and the hateful (Gillespie, 2018, p. 5). Therefore, to protect users and groups, and to present their best face to the public and their customers, platforms must moderate. In this project, content moderation refers to the process of supervising user-uploaded content (especially posts) and applying a pre-set body of rules to decide whether such communication is allowed. Nevertheless, because digital platforms are always more or less entangled with business, politics, culture, and history, it is hard to find a universal rule that suits every platform.

Moderation is not an ancillary aspect of what platforms do. It is essential, constitutional, definitional. (Gillespie, 2018, p. 21)


Why is this an impossible problem?

It is nearly impossible for platforms to draw a boundary between the acceptable and the prohibited that suits the whole public. Most platforms target users worldwide, who bring diverse politics, cultures, and histories. Yet content moderation policies are derived from the use cases of the region where the platform is headquartered, without due regard to the context of other regions. The fact that most platforms come from the US means their global policies tend to be defined, developed, and enforced according to North American cultural, legal, political, and social norms (Panday, 2020).

Posts with a certain historical or cultural significance are especially difficult to judge.

“The Story Behind the ‘Napalm Girl’ Photo Censored by Facebook” by Nick Ut is licensed under CC BY 2.0.

Whether such posts should stay up is often decided only after public feedback, and that feedback differs across the cultures of the world. The “Napalm Girl” photograph caused an uproar after Facebook removed it; the company responded that it “violated platform policy” and that the photo’s disturbing, nude appearance had prompted some users to object. The tags attached to the picture were frequently along the lines of “nauseating,” “obscene,” and “in poor taste.” Some people believe the removal was motivated by the US government’s refusal to recognize the war as one of aggression (Campbell, 2016, p. 135). Historians and critics, on the other hand, find value in the image’s role in ending the war and argue that it should be shown. Some images may be acceptable in one part of the world but not in another. Content moderation is not only determining what is acceptable and what is not, but also balancing importance against offense; mediating when people harm one another, purposely or not; coordinating competing value systems; respecting the outlines of political discourse and cultural taste; extending moral obligations across cultural, linguistic, and national boundaries; trying to address the inequities of sexuality, gender, class, and race; and doing all of that around the hottest issues of the day (Gillespie, 2018, p. 10).

For such questionable content, the moderation system must weigh not just differing ideologies and values but also competing cultural politics and psychological impact. Platform founders routinely claim that the online community reflects every user’s values and that the moderation standard is based on “our values,” but in fact the criteria are set by the owners to protect the company’s profit. Designers and managers of digital platforms tend to assume their users’ values are “the same as ours,” yet gender and class issues persist in management: the majority of full-time employees are overwhelmingly white and male, educated, liberal or libertarian, and technological in both skill and worldview (Roberts, 2019, p. 34).

Online content moderation requires a large investment in labor and technology. Digital platforms rely on content uploaded by users, and to protect the community they must categorize and filter it. No existing technology or algorithm can fully automate this screening, so a great deal of human labor is needed. Platforms hire professional screeners and teams to develop a screening system, and moderators can use that system to delete posts before they are officially published. Professional moderators must be specialists in the tastes of the site’s presumed audience and have cultural knowledge of both the platform’s location and its audience (Roberts, 2019, p. 38). Content screeners also meet the description of knowledge workers: they are a product of the digital network economy and depend on their cultural capital and their ability to engage with ICT (Roberts, 2019, p. 40). Especially when commercial issues arise, content moderators are not just following the rules established by the platform; they are also producing “knowledge products.” The staggering, and growing, amount of user-submitted content on high-traffic sites like Facebook, Twitter, and YouTube creates a need for ever more content moderators.

Why Content Moderation Costs Social Media Companies Billions. Source: YouTube (CNBC, 2021).


To reduce the cost of hiring content moderators, Reddit adopts a “hands-off” policy that allows users to moderate content voluntarily. The cases of #Gamergate and “The Fappening” show how Reddit has become a space for “toxic technocultures,” a result of its politics, culture, and platform design (Massanari, 2017, p. 331). Moderators who suspend subreddits that share forbidden content may face countless trolls and boycotts, even death threats. Neither users nor managers should treat this free labor as labor to be exploited.

Together, we need to democratize access to information. But some of your users see that freedom as a license to hurt others. This can be bad. (Gillespie, 2018, p. 6)

“Reddit Alien Wordmap” by rich8n is licensed under CC BY-NC-SA 2.0

When digital platforms acknowledge that they have moderation mechanisms in place, they typically position themselves as open, impartial, and non-interventionist, partly to avoid obligation or liability, and partly because their founders fundamentally believe that they are (Gillespie, 2017, p. 269). However, implementing a transparent policy or strictly enforcing moderation can cost traffic and revenue; after all, social media platforms are essentially commercial enterprises. Greenberg points out that one reason Reddit administrators may have been unwilling to suspend “The Fappening” earlier may be economic: in six days, users bought enough Reddit gold (the currency that pays for Reddit’s server costs) to keep the whole site running for an entire month (as cited in Massanari, 2017, p. 336).



The fantasy of a truly “open” platform is powerful, resonating with deep, utopian notions of community and democracy—but it is just that, a fantasy. There is no platform that does not impose rules, to some degree. Not doing so would simply be untenable. (Gillespie, 2018, p. 6)



The politics of online content moderation

Government involvement in content moderation on digital platforms can be contradictory. Former US President Donald Trump had his account suspended by Twitter for making inappropriate comments; some argued he was simply exercising his right to free speech. But because the consequences of his comments affected American politics and democracy, Facebook also decided to block his account. The conflict between individual rights and democracy has always been an unsolved problem: government must balance citizens’ rights and protect the free online community while also regulating digital platforms. The boundary between government and media is blurry, and it is difficult for digital platforms to make profits while following government policy. It is reasonable to believe that some digital platforms have become a “gray zone.” To protect the democracy of our society, government should consider more restrictions on digital platforms.


Campbell, W. J. (2016). Picture power? Confronting the myths of the “Napalm Girl” photograph. In Getting It Wrong: Debunking the Greatest Myths in American Journalism (pp. 130-149). University of California Press. DOI: 10.1525/9780520965119


Gillespie, T. (2017). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254-278). London: SAGE.


Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). New Haven: Yale University Press. DOI: 10.12987/9780300235029-001


Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. DOI: 10.1177/1461444815608807


Panday, J. (2020, December 23). Exploring the problems of content moderation on social media. Internet Governance Project. Retrieved from


Roberts, S. T. (2019). Understanding commercial content moderation. In Behind the Screen: Content Moderation in the Shadows of Social Media (pp. 33-72). Yale University Press. DOI: 10.12987/9780300245318


CNBC. (2021, February 18). Why content moderation costs social media companies billions [Video]. YouTube.



Creative Commons Licence
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.