
As the information-sharing paradigm of the Internet has changed, with the emergence of the ritual model of communication (Carey, 2009), the boundaries between senders and receivers have gradually blurred. At the same time, as Internet governance has evolved, a new profession has appeared between senders and receivers: the content moderator, or censor. The job of content moderators is to review user submissions and screen the content that is allowed to be published on public web platforms. This article discusses why online information needs to be checked by censors, and who should be responsible for moderating content such as graphic violence and pornography.
Question 1: Why do we have censors?

With the growth of the Internet and the proliferation of users, the online environment is no longer just a place to socialise; it now carries many commercial interests. For a platform with commercial interests, censorship that simply deletes information outright is therefore a blunt and costly instrument. Gillespie (2018, p. 21) describes the online platform as a multisided market that profits by matching sellers with buyers and producers with viewers, yet by allowing anonymous speech it can also become a breeding ground for harassment and crime. In order to maintain a positive and healthy online environment, each platform has its own code of conduct for user speech. Whatever is posted must be judged against the platform's guidelines, and the people who enforce this review and these sanctions are the online censors. As Gillespie (2018, p. 45) observes, such rules "merely multiply the blurry lines that must be anticipated now and adjudicated later. This is an exhausting and unwinnable game for those who moderate these platforms, as every rule immediately appears restrictive to some and lax to others, or either too finicky to follow or too blunt to do justice to the range of human aims to which questionable content is put." Unfiltered user-generated content, in all its diversity, can poison the wider online world: objectionable material such as pornography and violence severely damages the mental health of minors and even adults. There are currently around 50 million unique YouTube creators worldwide, uploading up to 500 hours of video per minute, which puts enormous strain on video reviewers (Liu, 2020). The positive, healthy state of the online world is not pristine but filtered through layers of review. Our online world therefore needs censors, because they filter out inappropriate information before it reaches the general public.
Censors serve not only the general public but also the government. For example, statements or videos deemed detrimental to a country's development are removed by censors who serve the government directly, and this profession therefore exercises a considerable degree of influence over the functioning of society (Belson, 2019). Online censors are indispensable in today's Web 2.0 era. They are the judges of the vague norms of the day, and each ruling affects the development of a healthy online environment as well as the functioning of society.
Question 2: The impact of censoring online information on human censors

Today's online environment is protected by censors who work quietly to screen out objectionable content. Nevertheless, faced with the sheer volume of pornographic and violent material, censors' health and well-being can hardly escape harm. According to interviews in The Washington Post (2019), 14 current and former moderators in Manila recounted recurring nightmares as a common consequence of the job. Some described seeing colleagues have mental breakdowns at their desks; one said he had attempted suicide as a result of the trauma. After prolonged exposure to so much disturbing material, online censors are highly susceptible to post-traumatic stress disorder (PTSD). Although platforms offer psychiatric services to support censors, these have little effect. Many censors hope that improved AI technology will reduce how often they must view disturbing material in person; however, technology companies admit they may never achieve full automation (Dwoskin, 2019). This suggests that machines will not replace the human moderator's position, but also that an occupation which severely compromises individual health and well-being will persist. For some people facing employment difficulties, especially in the Philippines, the salary the profession offers remains an essential source of financial support, and despite the need for free psychotherapy services, the job still attracts many applicants (as cited in Gillespie, 2018). Thus the profession continues to draw a steady stream of censors who suffer massive psychological trauma despite its apparent adverse effects. The best response to the censor's dilemma at present is to provide maximum medical support and as much technical help as possible, while dismantling the basic structure of review and searching for a new model.
Question 3: The government's role in content moderation

It is unreasonable that almost all of the dark side of the world wide web is handled by these outsourced workers, with little oversight and no uniform standards (Gillespie, 2018). In addition to these dispersed outsourced teams, there should be government support and intervention. The government should retain ultimate authority and the right to intervene in the online world, and all content publishing should follow government policy under regulation. The power to moderate can be delegated to other parties and platforms, but it cannot be abandoned entirely. Governments should acknowledge the existence of online censors and gradually bring them under the protection of the state. As Gillespie (2018) notes, out of concern for their safety and to avoid angering online hooligans and harassers, many employees increasingly prefer to hide their identities. The government could therefore establish a confidential body that conceals employees' identities and IP addresses for maximum protection, and that provides the best available psychological treatment. However, as the online world opens up, excessive government control can curtail internet users' freedom of expression. As Belson (2019) argues, global internet freedom has declined for the ninth consecutive year, as governments increasingly use social media as a conduit for surveillance and election manipulation. Governments therefore also need to diversify their modes of control: mechanisms of multi-stakeholder collaborative governance, and similarly flexible management approaches, are better suited to the current Web 2.0 environment. Freedom of expression notwithstanding, there must be boundaries against graphic violence, pornography and other objectionable material. The role of government in online censorship is to set absolute boundaries and exercise ultimate authority.
Internet censorship remains a high-risk profession, and censors are under enormous psychological pressure as they deal daily with violent and pornographic material across the world wide web. Governments need to draw the boundaries of the web, change the basic structure by which harmful information is reviewed, and acknowledge the existence of internet censors so as to give them proper protection. In addition, governments need to balance freedom of expression against the maintenance of authority, and not leave each platform entirely alone to work out the boundaries of free speech, as this would make online speech even harder to manage.
References:
Automotive social media marketing [Image]. (2022). Retrieved 6 October 2022, from https://wordpress.org/openverse/image/1dbc38ef-daab-40c5-9730-18ac9f498fdf.
Belson, D. (2019). Social media crisis drives ongoing decline in global internet freedom. Internet Society. Retrieved 6 October 2022, from https://www.internetsociety.org/blog/2019/11/social-media-crisis-drives-ongoing-decline-in-global-internet-freedom/?gclid=CjwKCAjws–ZBhAXEiwAv-RNL-fbZIh2VtgpRIOjwPWnZV3Ng18DYprLC7H6OtxxxsaF7gmS99E_ChoCSegQAvD_BwE.
Bearman cartoon: Freedom of speech [Image]. (2022). Retrieved 6 October 2022, from https://wordpress.org/openverse/image/a4accd1a-52c0-4c1a-af27-b0bc5bd5c048.
Carey, J. W. (2009). A cultural approach to communication. In Communication as culture: Essays on media and society (2nd ed., pp. 11-28). New York: Routledge.
Dwoskin, E. (2019). Job conditions for content moderators in the Philippines [Video]. The Washington Post. Retrieved 6 October 2022, from https://www.washingtonpost.com/technology/2019/07/25/social-media-companies-are-outsourcing-their-dirty-work-philippines-generation-workers-is-paying-price/.
Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press.
Content moderators at YouTube, Facebook and Twitter see the worst of the web — and suffer silently. (2019). The Washington Post. Retrieved 4 October 2022, from https://www.washingtonpost.com/technology/2019/07/25/social-media-companies-are-outsourcing-their-dirty-work-philippines-generation-workers-is-paying-price/.
Traumatised by reviewing "yellow storm" videos: Can content moderators be rescued by AI? (2022). Huxiu.com. Retrieved 4 October 2022, from https://www.huxiu.com/article/335656.html.