In the Web 2.0 era, platformisation thrives on flourishing user-generated content and the construction of decentralized online communities (Paech, 2021). Creative global users allow digital platforms to profit from a "greater diversity of voices" (Flew, 2021). However, platforms do not host only positive content: immoral, unethical and illegal materials also spread via digital platforms, which makes content moderation a crucial task. This essay argues that the commercial nature of platforms and the prejudice of their moderators are the main factors that hinder platforms from pursuing and achieving better moderation outcomes. Therefore, governments should take the role of guiding and supervising platforms in executing content moderation.
Why Do Platforms Implement Content Moderation?
“platforms generally frame themselves as open, impartial, and noninterventionist…to avoid obligation or liability.” (Gillespie, 2018, p. 7)
Platforms label themselves as intermediaries that provide a public sphere in which users can enjoy freedom of speech (Schlesinger, 2020). However, this claim of absolute openness rests on "utopian notions of community and democracy", an idealized fantasy drawn by platforms (Gillespie, 2018, p. 5). When pornographic, abusive, illegal, and discriminatory content disrupts the order of networked publics, interferes with platforms' operations and impacts their business performance, platforms have to take measures to regulate user-generated content (Gillespie, 2018; boyd, 2010). Content moderation helps to balance a platform's commercial purpose with its social responsibility, optimize service quality and protect users from malignity. Tarleton Gillespie (2018) suggests that content moderation should be regarded as a central service of platforms instead of a peripheral one.
Source: Appen (2021)
Why Do Platforms Downplay Content Moderation?
Source: Absolute Motivation (2018)
How Does Moderator’s Prejudice Influence Content Moderation?
“When rules of propriety are crafted by small teams of people that share a particular worldview, they aren’t always well suited to those with different experiences, cultures, or value systems.” (Gillespie, 2018, p.8)
In-house moderators of giant digital platforms are influenced by the "Californian Ideology" (Barbrook & Cameron, 1996). They are high-tech labourers who work in Silicon Valley and earn the best wages. According to the Center for Employment Equity (2017), most of them are white, male and well-educated. This homogeneity might endow them with a "communal prejudice" that leads them to think from an emic perspective and judge content through the lens of colonization and Western tech-utopianism (Lusoli & Turner, 2021). Materials may carry multiple connotations: users decode them based on their personal comprehension, while moderators tend to judge them from their own perspectives. It is therefore questionable whether moderators can make objective judgements on behalf of users, or whether they merely evaluate whether content is detrimental to people like themselves.
For instance, Facebook deleted the photo The Terror of War shortly after it was posted. The photo shows several children fleeing a napalm attack during the Vietnam War; a naked girl in the middle suffers napalm burns over her body (Gillespie, 2018). Facebook's moderators removed the photo because it includes underage nudity and violence, ignoring its emotional and historic significance. They made the decision based on their communal values without resonating with children who suffered the untold pain and unimaginable crimes caused by war.
Another example is Tumblr, which banned the term #gay "because it is commonly associated with pornographic images, and thereby blocked all other non-pornographic content similarly tagged" (Gillespie, 2018, p. 270). Tumblr's moderators failed to consider the perspective of homosexual users.
Government governance of content moderation is crucial when platforms' commercial nature impairs their social responsibility. The contradictions between the diversity of users and the homogeneity of moderators, and between platforms' commercial attributes and their social liability, are the main obstacles to platforms implementing effective moderation. Therefore, governments should enact regulations and supervise platforms' execution of them. To avoid hegemonic control, governments are supposed to act as supervisors of platforms rather than regulators of users, and platforms should take the role of scrutinizing governments as well. As Gillespie (2018, p. 264) said, "[platforms] must… decide how to translate a new legal obligation [enacted by governments] into an actionable rule, react to the emergence of a category of content they would like to curtail, and respond to surges of complaints from users."
References

Absolute Motivation. (2018). You Will Wish You Watched This Before You Started Using Social Media / The Twisted Truth. Retrieved October 2021, from YouTube: https://www.youtube.com/watch?v=PmEDAzqswh8
Barbrook, R., & Cameron, A. (1996). The Californian Ideology. Science as Culture, 6(1), 44-72. https://doi.org/10.1080/09505439609526455
Boyd, D. (2010). Social Network Sites as Networked Publics: Affordances, Dynamics, and Implications. In Networked Self: Identity, Community and Culture on Social Network Sites (pp. 39-58). https://www.danah.org/papers/2010/SNSasNetworkedPublics.pdf
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164-1180. https://doi.org/10.1177/1461444812440159
Castells, M. (2002). The Culture of the Internet. In The Internet Galaxy: Reflections on the Internet, Business, and Society (pp. 36-63). Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199255771.001.0001
Center for Employment Equity, University of Massachusetts Amherst. (2017). Is Silicon Valley Tech Diversity Possible Now? https://www.umass.edu/employmentequity/silicon-valley-tech-diversity-possible-now-0
de Kloet, J., Poell, T., Guohua, Z., & Yiu Fai, C. (2019). The platformization of Chinese society: Infrastructure, governance, and practice. Chinese Journal of Communication, 12(3), 249-256. https://doi.org/10.1080/17544750.2019.1644008
Dutton, W. H. (2009). The Fifth Estate Emerging through the Network of Networks. Prometheus, 27(1), 1-15. https://doi.org/10.1080/08109020802657453
Flew, T. (2021). Week 6 – Governing the Internet: Content Moderation and Community Management. Retrieved October 2021, from Canvas: https://canvas.sydney.edu.au/courses/34089/pages/week-6-governing-the-internet-content-moderation-and-community-management?module_item_id=1160574
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50. https://doi.org/10.1386/jdmp.10.1.33_1
Gillespie, T. (2014). The Relevance of Algorithms. In Media Technologies: Essays on Communication, Materiality, and Society (pp. 167-193). Cambridge, MA: The MIT Press. https://doi.org/10.7551/mitpress/9780262525374.001.0001
Gillespie, T. (2018). All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). New Haven, CT: Yale University Press. https://doi.org/10.12987/9780300235029
Gillespie, T. (2018). Governance by and through Platforms. In The SAGE Handbook of Social Media (pp. 254-278). London: SAGE.
Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407
Kelty, C. M. (2014). The Fog of Freedom. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies: Essays on Communication, Materiality, and Society (pp. 196-220). Cambridge, MA: The MIT Press. https://doi.org/10.7551/mitpress/9780262525374.001.0001
Lusoli, A., & Turner, F. (2021). "It's an Ongoing Bromance": Counterculture and Cyberculture in Silicon Valley—An Interview with Fred Turner. Journal of Management Inquiry, 30(2), 235-242. https://doi.org/10.1177/1056492620941075
Massanari, A. (2017). #Gamergate and The Fappening: How Reddit's algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329-346. https://doi.org/10.1177/1461444815608807
Paech, V. (2021). Week 6 – Governing the Internet: Content Moderation and Community Management Lecture: Online Community Management. Retrieved October 2021, from Canvas: https://canvas.sydney.edu.au/courses/34089/pages/week-6-governing-the-internet-content-moderation-and-community-management?module_item_id=1160574
Roberts, S. T. (2019). Understanding Commercial Content Moderation. In Behind the Screen: Content Moderation in the Shadows of Social Media (pp. 33-72). New Haven, CT: Yale University Press. https://doi.org/10.12987/9780300245318
Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7-8), 1545-1563. https://doi.org/10.1177/0163443720948003
Xu, B., & Albert, E. (2017). Media Censorship in China. Retrieved October 2021, from Council on Foreign Relations: https://www.cfr.org/backgrounder/media-censorship-china