The spread of problematic online content, and the harm it causes, has been fuelled both by the culture of free speech online and by increased participation in internet activities. According to the Australian eSafety Commissioner's research, 60% of teenagers have seen harmful content online, and more than 40% report having had negative online experiences (The eSafety Commissioner, 2021). The following sections focus on the responsibility that online platforms and governments should bear, and they examine and critique some of the regulations these stakeholders currently apply.
What is problematic content?
Standards for inappropriate content vary between countries. The Australian government defines 'illegal and restricted online content' as material ranging up to the most seriously harmful (eSafety, 2022), for example online pornography, violent or illegal content, nudity, and hate speech. With the shift to Web 2.0, internet use was no longer limited to consuming information from the internet; users became active producers of online content and participants in a participatory culture. This culture of openness, combined with immature regulation, facilitated the current volume of problematic content, and stakeholders continue to update their forms of regulation in response.
However, a controversy has arisen between counter-speech and regulation. First Amendment theory supports counter-speech, holding that "more speech is an effective remedy against the dissemination and consumption of false speech" (Napoli, 2018). Yet counter-speech theory is more an "ideal type" than an "empirical reality" (Napoli, 2018), because it rests on the assumption of "rational audiences," which most citizens are not. Moreover, platforms tend to downplay their moderation because they cultivate a public image as "a space with free speech" in order to attract more users. Human content moderation is also gruelling work, involving unpleasant tasks and the potential for psychological harm (Roberts, 2022). Furthermore, the nature of a piece of content is complicated: it encompasses the content itself, its intent, its unintended consequences, and its meaning (Roberts, 2022). Moderators therefore need further research and support to overcome the difficulties of moderation.
Who should be responsible?

As the original producers of problematic content, individuals need to regulate themselves first. Possible actions include reconsidering content before posting and understanding the boundary between positive and negative content. Moreover, users should be aware that by accepting a platform's terms and conditions they take on a responsibility to obey its rules. Users can also "flag" content that disturbs them, which can trigger a review by moderators (Roberts, 2022).
Platforms act as intermediaries between users: they "host, store, organize and circulate the content of others" (Gillespie, 2022). Platforms should be responsible for content regulation because they are "designed to protect individuals" (Flew et al., 2019). They operate like multi-sided markets in the internet economy, where each user's utility is affected by the participation of other users; an Uber driver, for example, is better off when more passengers use Uber. Platforms therefore have a responsibility to build a clean environment for consumers. In addition, platforms profit from advertising on their sites and from charging users for services; they earn revenue by bringing sellers and buyers together. Uber, for instance, says it charges drivers a 25% commission, yet it is not transparent about its total costs, so we cannot know how much it actually earns from drivers and passengers (Hashmi, 2022). People using platforms expect a positive experience, so platforms should take public interests into account. Some platforms, however, argue that counter-speech helps reduce hate speech; Facebook, for example, has proposed to "support the voices that are engaged in counter-speech" and to "unleash counterspeech initiatives" globally (Facebook, 2022).
Governments should take more responsibility for content regulation, as they hold stronger powers: they are the ones who can put the most pressure on both individuals and platforms to moderate content. Yet where people support counter-speech, government power is minimised by the open speech environment (Napoli, 2018). Individuals from different cultural backgrounds also understand problematic content differently, which highlights the indispensability of national-level moderation.
How do they moderate, and where is there space for improvement?
Platforms
Platforms can moderate content, and improve their moderation, in multiple ways. First, applying more forms of content moderation reduces inefficient or wrongful content removal. Facebook, for example, deleted photos and blocked the accounts of Australian breast cancer survivors because their photos were classified as nudity; the public considered these photos unrelated to sexuality, and Facebook restored them after recognising the error (McPherson, 2022). Platforms may therefore consider combining various forms of regulation, such as human moderators, AI-based moderation, and third-party companies. Because some content is hard to detect automatically, having human moderators confirm controversial cases reduces the chance of mishandling complicated content, so that the rights of minority groups are not harmed by inefficient regulation.
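To make this hybrid approach concrete, the sketch below shows a minimal, entirely hypothetical routing rule in Python: an automated classifier handles clear-cut cases, while ambiguous content is escalated to a human moderator. None of the names, thresholds, or the stand-in classifier come from the sources cited in this essay; they exist only to illustrate the escalation logic.

```python
from dataclasses import dataclass

# Hypothetical confidence thresholds; a real platform would tune these,
# and the classifier itself is a stand-in, not taken from any cited source.
AUTO_REMOVE_THRESHOLD = 0.95   # near-certain policy violations
HUMAN_REVIEW_THRESHOLD = 0.60  # ambiguous cases escalate to a person

@dataclass
class Post:
    post_id: int
    text: str

def classifier_score(post: Post) -> float:
    """Stand-in for an ML model returning an estimated P(violation)."""
    return 0.7 if "example-violation" in post.text else 0.1

def route_post(post: Post) -> str:
    """Route a post to an action based on the model's confidence."""
    score = classifier_score(post)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "remove"        # clear-cut violations handled automatically
    if score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"  # controversial content goes to a human moderator
    return "allow"             # low-risk content is published

if __name__ == "__main__":
    print(route_post(Post(1, "a photo tagged example-violation")))  # human_review
    print(route_post(Post(2, "a holiday photo")))                   # allow
```

The point of the middle band is exactly the lesson of the mastectomy-photo incident: rather than forcing an automated system to decide borderline cases, controversial content is deliberately routed to a person.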
Furthermore, although platforms can serve their private interests by branding themselves as spaces for free speech and thereby increasing platform activity, they still need stricter content moderation to balance the public interest. All platforms must impose rules, or they become untenable (Gillespie, 2018). Cultivating a healthy public image benefits both the platforms themselves and the companies that advertise with them. As Gillespie (2018) claims, "if the right balance is struck, the platform can enjoy the traffic and revenue both generated by users seeking illicit content and by users who want a clean experience of the platform."
However, when platforms rely on counter-speech, the public interest can be damaged. A lack of restriction invites fake news and misinformation, which has left many people misled and anxious about these issues and affects society as a whole. According to Australian government statistics, four in five Australian adults encountered misinformation about COVID-19, which worked to "undermine public health efforts, eroding trust in democratic institutions and causing an economic loss for the Australian mobile industry" (Australian Communications and Media Authority, 2021).
Governments

Governments are also taking online content moderation seriously. During Australia's 45th Parliament, the government passed new legislation on harmful online content, and it has continued to respond to various kinds of problematic content with detailed moderation rules, for instance regulations limiting online gambling (Parliament of Australia, 2022).
Beyond legislation, governments can also use their power in education. Providing online safety education raises citizens' overall digital literacy: research shows that three in four teenagers want online safety information. Education on internet safety helps people recognise a clear standard for inappropriate content and know what actions to take after a negative online experience.
On the other hand, worldwide content moderation still has limitations stemming from differing cultural backgrounds and political aims. Platforms host users from all over the world, which makes regulations harder to implement: the standard for problematic content is not the same everywhere, cultural differences produce different understandings, and regulation across countries remains lacking.
Conclusion
Online platforms play multiple roles in content moderation. Although they sometimes prioritise their private interests, they still apply various forms of moderation while organising content. Individuals are encouraged to be more conscientious in producing content and to reduce inappropriate material at its source. Governments, too, do their best to moderate content by imposing legislation and providing education for individuals. Nevertheless, there is room to refine worldwide legislation. Stopping problematic content requires the cooperation of all internet stakeholders, but platforms and governments should take the most responsibility.
References
Australian Communications and Media Authority. (2021). ACMA misinformation report. Retrieved from https://www.acma.gov.au/sites/default/files/2022-03/ACMA%20misinformation%20report_Fact%20sheet%201%20-%20key%20research%20findings.pdf
eSafety Commissioner. (2022). Australian digital teens: Negative experiences and responses to them – infographic. Retrieved 14 October 2022, from https://www.esafety.gov.au/research/digital-lives-aussie-teens/australian-digital-teens-negative-experiences-and-responses-them/infographic.
Facebook. (2022). Counterspeech. Retrieved 14 October 2022, from https://counterspeech.fb.com/en/.
eSafety. (2022). Online Content Scheme Regulatory Guidance. Retrieved from https://www.esafety.gov.au/sites/default/files/2021-12/eSafety-Online-Content-Scheme.pdf
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50. https://doi.org/10.1386/jdmp.10.1.33_1
Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1-23). Yale University Press. https://doi.org/10.12987/9780300235029
Gillespie, T. (2022). Governance by and through platforms. In The SAGE handbook of social media (pp. 254-278). SAGE Publications.
Hashmi, S. (2022). How much do Uber drivers make? Salaries explored. The Teal Mango. Retrieved 14 October 2022, from https://www.thetealmango.com/lifestyle/how-much-do-uber-drivers-make/.
McPherson, E. (2022). ‘It’s actually very distressing’. 9news.com.au. Retrieved 14 October 2022, from https://www.9news.com.au/national/breast-cancer-survivors-want-facebook-to-stop-censoring-mastectomy-and-reconstruction-photos/0ee1002d-a385-43c5-8ac9-87084e1fd854.
Napoli, P. M. (2018). What if more speech is no longer the solution? First Amendment theory meets fake news and the filter bubble. Federal Communications Law Journal, 70(1), 55-104.
Parliament of Australia. (2022). Regulation of Australian online content: Cybersafety and harm. Retrieved 14 October 2022, from https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/pubs/BriefingBook46p/Cybersafety.
Roberts, S. (2022). Understanding commercial content moderation. In Behind the screen: Content moderation in the shadows of social media (pp. 33-72). Yale University Press.
The eSafety Commissioner. (2021). The digital lives of Aussie teens. Retrieved from https://www.esafety.gov.au/sites/default/files/2021-02/The%20digital%20lives%20of%20Aussie%20teens.pdf