Content Moderation Is Raising Worries

"Automotive Social Media Marketing" by socialautomotive is licensed under CC BY 2.0

Content moderation on digital platforms has become a hot issue among internet users, as growing platforms link more and more people from different regions. Having evolved from gathering places for a limited set of users into worldwide services, platforms like Twitter now host more complex and diverse user bases, which creates new and increased opportunities for abuse and discrimination. The ideal of free speech also enables large amounts of hate speech on digital platforms, and users are bothered by a flood of offensive posts. This urges digital platforms to establish comprehensive community standards, a practice known as content moderation.

“The Story Behind the ‘Napalm Girl’ Photo Censored by Facebook” by Nick Ut is licensed under CC BY 2.0

Since its appearance, content moderation has helped keep the internet landscape clean. Over time, digital platforms have come to conduct moderation according to several core values. For example, they need to maintain user authenticity, requiring users’ information and speech to be real and responsible. Fundamentally, platforms want to ensure users’ safety, so threatening and menacing speech is not allowed either. Users’ privacy and personal information are also protected, in order to support free speech and free will. Moreover, users are asked to show respect to others, in consideration of everyone’s dignity. Content that does not meet these fundamental values is deleted from the platform. However, people hold differing views about content moderation, and there are concerns over which kinds of content count as offensive or sensitive. According to Gillespie, one of the biggest challenges platforms face is establishing and enforcing a content moderation regime that can address both extremes simultaneously (Gillespie, 2018). Sometimes, you just can’t do both. When Norwegian journalist Tom Egeland included the Pulitzer Prize-winning 1972 photograph “Napalm Girl” in one of his articles, the post was soon removed by Facebook moderators, and Egeland was even suspended twice afterwards. The situation sparked wide discussion and criticism of Facebook’s moderation, with some calling Facebook “the world’s most powerful editor” (Gillespie, 2018). Finally, after more than a week, Facebook reinstated the photo. To explain this situation, we must recognize that there is no clear line between offensive and acceptable, given cultural differences. Some content may be offensive in certain countries while acceptable in others; when people from these regions come together, such content becomes sensitive and hard to deal with.

Steam Games : From PC Gamer

Furthermore, incomplete policies can lead to even more serious situations. In 2014, Gamergate shocked the internet with its chaotic arguments. Things started with an anonymous blog post about a terrible breakup with an ex-girlfriend. People soon identified the author, Eron Gjoni, and, through the information he gave in the post, found his ex-girlfriend, Zoe Quinn, an independent designer of the game Depression Quest. In his post, he alleged that Quinn had cheated on him and had used intimate relationships with games journalists to get positive coverage of Depression Quest. Within the gaming community, some argued that it was another instance of questionable ethics in games journalism (Stuart, 2014). Before long, people started to condemn Zoe Quinn and the perceived corruption of the gaming industry. However, those posts and comments were soon removed by platform moderators. Faced with silent media, the curious public grew even more frenzied, even though their discussions were immediately banned from websites at Quinn’s request. They became convinced she must have had intimate relationships with gaming journalists, as more and more digital platforms deleted their posts or videos even when they were only trying to explain the situation. Some also accused Quinn of using feminism as a protective umbrella to hype her game and attack other game designers, including women. Nevertheless, Quinn made a different claim: that she was the one being discriminated against in the games industry because of her female identity. The dispute then escalated into a battle between feminists, represented by Zoe Quinn, and anti-feminists from various forums. From then on, the issue developed from a personal conflict into debates over corruption and sex discrimination in the games industry.

How did this happen even though moderators had tried to control the situation? The case is considered an example of digital moderation failure. Reddit is an online space where toxic technocultures coalesce and propagate (Massanari, 2017). As an environment of little accountability, anonymity, and a growing global user base, Reddit is fertile ground for toxic technocultures to survive and thrive (Bernstein et al., 2011; Pfaffenberger, 1996). From 4chan to other chan-style image boards, users were not asked to take responsibility for what they said, so they felt free to say anything, including harassment. Moreover, those forums are anonymous, which makes users’ speech hard to trace. Eron and Zoe were exceptions, but other users still believed they were invisible online and could say anything about the affair. Forum users did not mind their manners when speaking on the internet, which drove the issue into chaos. Adding to this, globally growing membership made it harder for website moderators to take charge of every comment, because balancing opinions is difficult when people come from different countries. One thing that intensified the situation was how moderators acted to control it. They merely deleted users’ posts, giving little explanation; the stated reasons were often things like copyright, which did not sound important in this context. This only strengthened the audience’s reverse psychology. The battle did not happen in an open and transparent environment where both sides could communicate, and moderators did not give convincing proof or reasons for removing posts, out of which people grew more suspicious and angry.

WeChat has an estimated 2 million mini programs covering a variety of services, which now include reporting on epidemics. © 2020 by Masha Borak is licensed under Attribution-NonCommercial-NoDerivatives 4.0 International

The chaotic state of content moderation raises concerns about whether government involvement on digital platforms is needed. Some people claim that governments should take a greater role in enforcing content moderation restrictions on social media. This is a serious question, as people have worried about internet surveillance for a long time. There have been reports of digital platforms like Twitter and Facebook selling or leaking user information, so it is not an unexpected reaction for people to worry further about being monitored by the government. China is an existing instance to refer to: the collaboration between the Chinese government and Chinese digital platforms goes deeper than many people think in the field of content moderation. The government has been working with platforms to develop Mini Programs within apps. On the one hand, government involvement strengthens the state’s control over these platforms, which to some extent might help prevent them from selling user information illegally. On the other hand, it also raises people’s concern that they are monitored regardless, and that they do not have much freedom to hide or speak (de Kloet, Poell, Guohua, & Yiu Fai, 2019). For years, China has exercised relatively strong content moderation over culture such as music and film. Every film, television work, or piece of music must first pass government review before it can be released on digital platforms; disqualified works are sent back for editing until they pass. This regulation has been denounced by many people, as creators are not allowed to produce content freely under the policy.
Therefore, it is important to control the extent of government involvement in digital platforms, lest free speech be limited.


Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.

Stuart, K. (2014). Zoe Quinn: “All Gamergate has done is ruin people’s lives.” Available at: http://

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.

Bernstein, M. S., Monroy-Hernández, A., Harry, D., et al. (2011). 4chan and /b/: An analysis of anonymity and ephemerality in a large online community. In Proceedings of ICWSM 2011, Barcelona, Spain, 17–21 July 2011 (pp. 50–57). Menlo Park, CA: AAAI Press.

Pfaffenberger, B. (1996). “If I want it, it’s ok”: Usenet and the (outer) limits of free speech. The Information Society, 12, 365–386.

de Kloet, J., Poell, T., Guohua, Z., & Yiu Fai, C. (2019). The platformization of Chinese society: Infrastructure, governance, and practice. Chinese Journal of Communication, 12(3), 249–256.