With the development of modern technology and the rise of new and varied social media platforms, people have come to seek emotional satisfaction and everyday convenience in the virtual environment of the Internet. Social media platforms arose out of the exquisite chaos of the web. Many were designed by people who were inspired by (or at least hoping to profit from) the freedom the web promised, to host and extend all that participation, expression, and social connection (Gillespie, 2018). Yet as platforms strive to give users the most comfortable experience possible (for example, broad freedom of speech), more and more problems proliferate.
Content moderation is the practice by which an online platform screens and monitors user-generated content against platform-specific rules and guidelines to determine whether that content may be published (Besedo, 2021). Although moderation is a necessary measure to ensure a better experience on social media platforms, it faces several serious problems. The first is one of scale: the number of users on social media platforms keeps growing, and so does the volume of material that moderators must handle every day. On many highly trafficked sites, the amount of user-generated content submitted is staggering and growing, and the process of handling it is often catch-as-catch-can. Issues of scale aside, the complex process of sorting user-uploaded material into either the acceptable or the rejected pile is far beyond the capabilities of software or algorithms alone (Roberts, 2019). Secondly, content moderators are held to high standards yet are paid less and less. Professional moderators must be experts in the tastes of the site's presumed audience and have cultural knowledge about both the location where the platform is based and the platform's audience. As social media firms' quest for workers willing to perform digital piecework proceeds on a global scale, the remuneration for their labor is offered at ever lower rates (Roberts, 2019). Thirdly, and perhaps most importantly, moderating online content means routinely viewing extremely violent and pornographic material. Such work inflicts intense psychological discomfort on human moderators, many of whom suffer lasting negative emotions or mental illness as a result.
As Roberts (2019) observes, these platforms engage commercial content moderators to perform tasks that oscillate between the mind-numbingly repetitive and mundane and exposure to images and material that can be violent, disturbing, and, at worst, psychologically damaging; one company spokesperson describes content moderation simply as "a yucky job." The final issue is the inconsistency between countries in enforcing content moderation, compounded by the secrecy of the work. Commercial content moderators labor under several different regimes, employment statuses, and workplace conditions around the world—often by design. Frequently, they are deployed far away from the physical locations where the content they moderate is created, and at great distance from the hosting sites of the platforms for which the material is destined. Their work titles often differ, ranging from "content moderator," to "screener," to "community manager," to a host of other labels that may sometimes be euphemistic, fanciful, or imply little about their expected job activities—let alone their relation to others doing similar work. In fact, even the workers themselves have difficulty recognizing one another by job title alone (Roberts, 2019).
Content regulation exists to give users a better online experience and to shield them, as far as possible, from negativity and personal attacks. However, exposing violent or aggressive information that also carries an educational message is controversial: some advocate deleting such content, while others believe publishing it is meaningful when it exposes evil and upholds justice. Gillespie (2018) poses the question directly: does the fact that something is newsworthy supersede the fact that it is also graphic? These questions plague efforts to moderate questionable content, and they hinge not only on different values and ideologies but also on contested theories of psychological impact and competing politics of culture. For example, titled The Terror of War but more commonly known as "Napalm Girl," the 1972 Pulitzer Prize–winning photo by Associated Press photographer Nick Ut is perhaps the most indelible depiction of the horrors of the Vietnam War. Since the moment it was taken, this photo has been an especially hard case for Western print media—and it continues to be so for social media. It was always both a document of war and a troubling object of concern itself: "Kim's suffering was captured and published movingly in a still photograph—and the still is what became the iconic image" (Gillespie, 2018). The two sides are at odds when privacy and individual rights conflict with larger goals, and it is hard to tell right from wrong in such cases. Content moderators will be divided, and the public will see this as a contentious issue that is difficult to resolve.
Moderation is not an ancillary aspect of what platforms do; it is essential, constitutional, definitional. Not only can platforms not survive without moderation, they are not platforms without it (Gillespie, 2017). Governments should pay more attention to the regulation of online content and enact more vigorous rules in order to provide healthy social platforms for all users. This act serves not only the users but also the platforms themselves. Content moderation has always been largely hidden from users: moderating user-generated content has been a relatively unknown and frequently not-fully-disclosed aspect of participation in social media and in websites and services that rely on user-generated content (Roberts, 2019). In the first episode of the first season of Black Mirror (Bathurst, 2011), the British Prime Minister is forced by an anonymous kidnapper to have sex with a pig live on television in order to rescue a kidnapped British royal princess. When the horrific scene is finally broadcast, the stunned onlookers bow their heads one after another, unwilling to keep watching. The episode illustrates that when confronted with the exposure of such content, people cannot bear it, which in turn underscores the importance of content moderation for users. In reality, there is no truly perfect and free environment in which people can express themselves; that is a figment of fantasy. The reason to study moderation on social media platforms therefore goes beyond preventing harm or improving enforcement: moderation is a prism for understanding what platforms are, and the ways they subtly torque public life.
Our understanding of platforms, both specific ones and as a conceptual category, has largely accepted the terms in which they are sold and celebrated by their own managers: open, impartial, connective, progressive, transformative. This view of platforms has limited our ability to ask questions about their impact, even as their impact has grown and concern about them has expanded (Gillespie, 2018).
In conclusion, in order to create a healthier online environment suitable for all users, content regulation backed by a sound regulatory system is an indispensable measure for the harmonious coexistence of users and the platforms themselves.
Gillespie, T. (2017). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE handbook of social media (pp. 254–278). SAGE.
Gillespie, T. (2018). Regulation of and by platforms. The SAGE Handbook of Social Media, 254–278. https://doi.org/10.4135/9781473984066.n15
Massanari, A. (2016). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807
Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media (pp. 33–72). Yale University Press.
Besedo. (2021, January 11). What is content moderation? Retrieved October 15, 2021, from https://besedo.com/resources/blog/what-is-content-moderation/