
Introduction & Background
For all the benefits that the growth of the Web 2.0 era has brought, a great deal of negative content has also emerged on digital platforms, such as bullying, harassment, violent content, hate, and pornography. Addressing these issues requires a renewed look at platform responsibility (Gillespie, 2017, p. 273). Internet intermediaries encompass a broad range of actors, including Internet service providers, social media platforms, and search engines, which shows that they do not act simply as neutral go-betweens. Internet intermediaries should therefore be held more accountable for stopping the spread of problematic content. This article examines who is responsible for preventing the spread of problematic content, why they hold that role, and how to prevent it, supporting that claim with three arguments: the self-regulation of internet intermediaries and platforms, the effectiveness of government control and the duty of individuals, and the conflict between internet intermediaries and the government.
The Self-regulation of The Internet Intermediaries and Platform
Internet intermediaries can self-regulate just as platforms do, because they operate under their own platform rules as well as domestic and foreign law, and because they are run by large technology and telecommunications companies (Flew et al., 2019), which means the scope of their regulation is relatively broad. Furthermore, part of an internet intermediary's job is working out how to lawfully regulate the activities and behaviors taking place on the Internet. Internet intermediaries act as service providers and social media platforms, and their algorithms not only handle content moderation and ranking but can also perform functions similar to those of publishers (Council of Europe, 2018). For example, hate speech such as that in the Hindu Yuva Vahini case should be removed through a combination of human moderation and artificial intelligence tools, because the information is stored on the intermediary's servers (Ukey et al., 2022).
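To make this hybrid of automated and human moderation concrete, the minimal Python sketch below shows one way an intermediary might combine an automated check with a human review queue. The keyword list, threshold values, and function names are hypothetical illustrations, not any platform's actual system.

```python
# Minimal sketch of hybrid moderation: an automated check flags likely
# hate speech, and uncertain cases are escalated to human reviewers.
# The blocklist, thresholds, and names here are purely illustrative.

BLOCKLIST = {"hateterm1", "hateterm2"}  # placeholder terms, not real data

def automated_score(text: str) -> float:
    """Crude stand-in for an AI classifier: share of blocklisted words."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in BLOCKLIST)
    return hits / len(words)

def moderate(text: str, human_queue: list) -> str:
    score = automated_score(text)
    if score > 0.5:        # confident match: remove automatically
        return "removed"
    if score > 0.0:        # uncertain: defer to a human moderator
        human_queue.append(text)
        return "pending human review"
    return "published"

queue: list = []
print(moderate("an ordinary post", queue))          # -> published
print(moderate("hateterm1 hateterm2 rant", queue))  # -> removed
```

The point of the split is that automation handles clear-cut volume while ambiguous cases, where over-blocking would threaten legitimate speech, go to humans.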

Additionally, concerns about nudity and violence on social media platforms can be traced back to a historical controversy over The Terror of War, a famous photograph from the Vietnam War (Gillespie, 2018, p. 2). Beyond that, the legacy of hate speech, offensive speech, extremism, fake news, abuse, harassment, and other problems is deeply rooted and affects audiences, culture, society, and digital media intermediaries alike. Inevitably, no platform can filter every piece of content, while an internet intermediary that stores user content can review, filter, and delete illegal material. According to a 2020 Plan International survey of 14,000 girls and young women aged 15-25, 58% of them had been harassed or abused online (Vanguard, 2022). Social media intermediaries can significantly reduce the dangers of online violence against women, which demonstrates their duty to evaluate how their practices and services affect human rights. This responsibility should therefore not be left entirely to the platforms; social media intermediaries also have duties of their own.
The Effectiveness of Government Control & The Duty of Individuals
The government has regulatory and enforcement power over problematic content, and can strictly crack down on bullying, violence, harassment, hate, and pornography in accordance with the law. Under the Intermediary Rules 2021, "significant social media messaging intermediaries should enable the government to identify the originators of information or messages" (Ukey et al., 2022). Because the government is particularly effective at supervising the spread of content, it should have the right to oversee platforms and their content. For instance, in July 2019 the German government levied its first fine under the NetzDG law, €2 million against Facebook, for negligently or deliberately retaining fake news and hate speech posts (Check, 2020). Facebook thus also suffered a substantial financial loss.
“Germany fines Facebook $2.3 million over report” by Newsy. All rights reserved. Retrieved from: https://www.youtube.com/watch?v=N5OxVjoVcDQ
The Chinese government's approach to problematic content offers another instructive example. Beginning in May 2010, it asserted the principle of "Internet sovereignty": not only must all Chinese Internet users abide by Chinese laws and regulations, but Chinese media, platforms, and intermediaries are also tightly controlled through keyword filtering, bandwidth restrictions, and the self-censorship practices of many individuals and media companies (Xu & Albert, 2017). All of this shows that the government can fulfill its obligations indirectly through policies and laws.

On the other hand, as part of the network society, individuals should also be aware of their moral and social responsibilities on the Internet. The Internet affects all aspects of society, including individuals, largely because our behavior has an impact on others. It is important to note that a user's speech comprises more than just what they say on social media platforms; it also includes the consequences of what they say. More specifically, under encoding/decoding theory (Hall, 2005), individuals are free to encode a message, but they should consider whether their encoding will be decoded the same way by other users. The cyberbullying case of 13-year-old Ben McKenzie (Hendry, 2018), who took his own life after receiving threats and abuse through social media and mobile devices, amply highlights the need for internet users to take responsibility.
The Conflict Between Internet Intermediaries and The Government

In fact, the government cannot cooperate perfectly with platforms on regulation, because their interests and resources conflict. Government governance leans toward government interests rather than an ideology of freedom. Internet intermediaries, by contrast, play a significant role in protecting freedom of speech and online access to information, since freedom was the original intention of the internet. Moreover, "the balance between law and self-regulation will, inevitably, be difficult to strike" (Tambini, 2019). Because platform governance also tends to serve platforms' own interests, platforms are sometimes ineffective at stopping hate and misinformation. As Flew and colleagues note, Google acts quickly when videos are found to violate copyright rules, but not as quickly when hateful or illegal content appears (House of Commons 2018: 10, as cited in Flew et al., 2019, p. 39).
Besides, if Internet intermediaries had been timelier and more active in controlling online bullying, violence, and other harmful content, there might have been one less victim. Even if bullying and violence cannot be stopped entirely online any more than they can offline, policing content within their jurisdiction remains the intermediaries' biggest job. In general, the Internet intermediary is like a big umbrella: the greater the ability, the greater the responsibility.
How to Stop It?
First of all, the area of responsibility for harmful content is so broad that Internet intermediaries must consider and reflect on their role carefully. Next, laws and policies need a clear direction and comprehensive restrictions. Identifying accountability is a significant element that can be challenging to resolve, but what is needed is a strong and active assertion of the rights and responsibilities of users, platforms, and governments. Further, appropriate incentives should be adopted so that Internet intermediaries and other platforms carry out their duties more effectively. The strategy of notification, flagging, deletion, and reduction remains in place to stop the spread of problematic content; it could be improved by incorporating direct user feedback, although it cannot satisfy every user at once. Problematic information should then be detected by combining the latest techniques, user reporting, and human assessment. The monitoring, screening, filtering, and algorithmic planning this involves is a good way to curb the distribution of problematic content, and Internet intermediaries that store user content are typically better equipped to manage illegal content than many other online intermediaries. This is mainly because, if illegal online content is to be addressed, the involvement of internet access service providers, social media platforms, and intermediaries that store user content has become practically necessary (Wilman, 2020, p. 378).
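As a rough illustration of how these signals could be combined, the sketch below merges an automated risk score, user reports, and a human moderator's verdict into the flag/reduce/delete actions described above. The class, field names, and thresholds are hypothetical, not any intermediary's real pipeline.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the notify/flag/reduce/delete strategy:
# combines an automated risk score, user reports, and (when available)
# a human moderator's verdict. All thresholds are illustrative only.

@dataclass
class Post:
    text: str
    risk_score: float                      # from an automated detector, 0.0-1.0
    user_reports: int                      # how many users flagged the post
    human_verdict: Optional[bool] = None   # True = confirmed violating

def decide_action(post: Post) -> str:
    if post.human_verdict is True:
        return "delete and notify author"   # confirmed by a human reviewer
    if post.human_verdict is False:
        return "keep"                       # human reviewer cleared the post
    if post.risk_score > 0.9:
        return "delete and notify author"   # high-confidence automation
    if post.risk_score > 0.5 or post.user_reports >= 3:
        return "reduce reach and queue for human review"
    if post.user_reports > 0:
        return "flag for monitoring"
    return "keep"

print(decide_action(Post("example", risk_score=0.6, user_reports=1)))
# -> reduce reach and queue for human review
```

A human verdict deliberately overrides the automated score in this sketch, reflecting the essay's point that human assessment must sit above purely algorithmic filtering.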

Conclusion
In conclusion, because they operate across a broad regulatory area under laws and regulations, Internet intermediaries can self-regulate as social media platforms do, and they therefore bear greater responsibility for problematic content. The government has the power of supervision and law enforcement over problematic content and can prevent its dissemination through policies and laws, but it is often more concerned with its own interests than with the freedom of the Internet. Individuals, as the source of content, should also account for their own behavior; still, it is Internet intermediaries' use of the latest technology to flag, delete, and demote harmful material that constitutes the fundamental action for society. Although building a healthy Internet culture is not easy, Internet intermediaries are in a position to make a difference, whether the problem lies in the content an individual posts or in the spread of negative material.
References
Check, R. (2020, February 12). Social media: How do other governments regulate it? BBC News. https://www.bbc.com/news/technology-47135058
Council of Europe. (2018, August 15). Roles and Responsibilities of Internet Intermediaries: the Council of Europe has developed human rights-based guidelines. https://www.coe.int/en/web/freedom-expression/-/roles-and-responsibilities-of-internet-intermediaries-the-council-of-europe-has-developed-human-rights-based-guidelines
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50.
Gillespie, T. (2017). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254–278). London: SAGE.
Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1–23). Yale University Press.
Hall, S. (2005). Encoding/decoding. In M. G. Durham & D. M. Kellner (Eds.), Media and Cultural Studies: Keyworks (pp. 163–173). John Wiley & Sons.
Tambini, D. (2019, October 28). Rights and Responsibilities of Internet Intermediaries in Europe: The Need for Policy Coordination. Centre for International Governance Innovation. https://www.cigionline.org/articles/rights-and-responsibilities-internet-intermediaries-europe-need-policy-coordination/
Ukey, D., Raut, N. A., & Pal, G. (2022, October 9). Block hate, not religion, on social media. Free Press Journal. https://www.freepressjournal.in/analysis/block-hate-not-religion-on-social-media
Vanguard. (2022, September 16). Responsibilities of social media platforms in addressing online gender violence. https://www.vanguardngr.com/2022/09/responsibilities-of-social-media-platforms-in-addressing-online-gender-violence/
Wilman, F. (2020). Conclusions. In The Responsibility of Online Intermediaries for Illegal User Content in the EU and the US (pp. 378–386). Edward Elgar Publishing. https://www-elgaronline-com.ezproxy.library.sydney.edu.au/view/9781839104824.00026.xml
Xu, B. & Albert, E. (2017, February 17). Media Censorship in China. Council on Foreign Relations. https://www.cfr.org/backgrounder/media-censorship-china