Who should be responsible to stop problematic content on the internet? How?

“Anonymous Hacker” by dustball is licensed under CC BY-NC 2.0.


The Anonymity of the Internet: A Double-Edged Sword

The development of the internet and social media has notably shaped a culture of anonymity. Anonymity plays a significant role online, as it allows users to share and interact with all types of content and information on digital platforms. However, this feature is a powerful double-edged sword: media coverage enables victims to speak out about sensitive and painful experiences that are extremely difficult to discuss offline (Oldfield & McDonald, 2022). At the same time, anonymity provides a breeding ground for criminals to spread and profit from problematic content, often via cryptocurrencies (Du et al., 2020).


The Telegram Nth Room case, a terrifying cybersex crime that ran from 2018 to 2020, shocked South Korea and then the wider world. The perpetrators took full advantage of the anonymity of Telegram and of cryptocurrency payments to traumatise innocent women for profit. A group of men used the Telegram messenger application to blackmail and coerce women into recording sexual exploitation videos for financial gain, while the exploitative content was distributed across various Telegram chat rooms (Chang & Joohee, 2021). Following the case, Korean legislators revised the Sexual Violence Punishment law so that those who watch, store, possess or purchase illegal sexual content, as well as those who use exploitative content for blackmail or coercion, can be sentenced to up to three years in prison. The incident was so shocking that Netflix turned it into a documentary, Cyber Hell: Exposing an Internet Horror.


Who should be responsible for stopping the spread of exploitation content? How?

“Justice” by m.gifford is licensed under CC BY-NC-SA 2.0.

1) The Government

As legislative authorities have the power to tackle inappropriate content that should be censored under internet governance (O’Hara & Hall, 2018), legislative measures are a fundamental strategy for managing and stopping the distribution of exploitation content (Barclay, 2017). The government should therefore use its authority to put an end to the spread of such content.

  • Revise Current Regulations regarding Exploitation Content

As inappropriate content on social media platforms can lead to unintended tragedies, legislative authorities should revise regulations to protect online users from viewing and sharing such content. For instance, “Sulli’s Law” was introduced in South Korea’s National Assembly after Sulli, a 25-year-old Korean actress, took her own life following malicious verbal attacks on social media. Requiring “real identity validation” could counter online anonymity and discourage irresponsible behaviour by social media users.


2) Social Media Companies

Although social media etiquette exists for netizens, there are no strict rules that social media users are compelled to follow. Social media companies should therefore cooperate with legislative authorities and use their coercive power to stop the spread of illicit content on their platforms.

  • Revision of Social Media Policies

Online platforms should revise their age verification policies, as no proof of age is currently required to sign up; children and teenagers can therefore bypass restrictions, create accounts and be exposed to exploitation content. This happens despite the 1998 Children’s Online Privacy Protection Act, which requires parental consent before data is collected from children, who are not yet old enough to distinguish right from wrong. Given how effortlessly children can access inappropriate content online, new online safety rules and strict age verification requirements should be introduced.

  • Review and Change the Algorithms for Recommended Content

Social media platforms could stop recommending harmful and inappropriate content to netizens, since a large number of users tend to view and share whatever the algorithm recommends, creating filter bubbles in social networks (Berman & Katona, 2020). Examining and adjusting social media algorithms could be an effective way to curb the spread of exploitation content.


3) Financial Service Providers

“Money” by Nufkin is licensed under CC BY-SA 2.0.

The rise of the global cybercrime industry is driven by economic gain: professional research estimates it to be a US$1 trillion industry (Kshetri, 2010). This cybercrime economy now has the attributes of legitimate business, including markets, exchanges, operating specialists, integrated supply chains and outsourcing service providers, simply because it generates tremendously high profits at relatively low risk (Gaidosc, 2018). Financial service companies should accordingly work with government institutions to stop the spread of such content.

  • Suspend Credit Card Payments

The suspension of credit card payments by financial service companies would be an exceptionally effective way to prevent the spread of exploitation content. For example, Visa and Mastercard distanced themselves from Pornhub, a major adult platform, over allegations that it distributed illegal content involving sexual exploitation, misogyny, racism and non-consensual material (CE Noticias Financieras, 2020). The financial institutions investigated the allegations and suspended payments to the platform in order to cut off unlawful activity.


4) Online Users

Communication between online users, self-presentation and social comparison are significant social behaviours on social media platforms. Successful self-regulation can provide high-quality content for social media users and accelerate healthy industry growth on the internet (Cusumano et al., 2021). By contrast, the spread of harmful content and cyberbullying are examples of self-regulation failing (Barclay, 2017). Online users should be responsible for every word they write and every piece of content they post and share on the internet.

  • Stop Reading and Sharing Negative Contents

There are countless cases of exploitation content being distributed on the internet, and among the most significant enablers are online users themselves. For instance, Molly Russell, a 14-year-old British teenager suffering from depression, took her own life after exposure to depressive online content. Netizens should therefore avoid viewing aggressive and depressive content that escalates psychological stress, in order to limit both the circulation of inappropriate content and participation in cyberbullying (Camacho et al., 2018).

  • Social Media Detox

As social media becomes an important and inseparable part of modern life, young adults tend to develop disruptive habitual behaviour on social media platforms. A social media detox, that is, taking a break from social media, can strengthen the quality of communication, as the time spent on face-to-face conversation increases. The detox also has positive effects on time management and productivity, as the amount of “free time” increases (Lepik & Murumaa-Mengel, 2019). Recovering from online toxicity in this way can be a successful method of staying away from inappropriate content online.


“Tablet social media” by ijclark is licensed under CC BY 2.0.



The anonymity of the internet is a powerful double-edged sword: media coverage can help victims speak out about challenging past experiences, but it also allows criminals to spread and profit from inappropriate content (Du et al., 2020). Consequently, the government, social media companies, financial service providers and online users should all take responsibility for stopping the spread of exploitation content. The government could revise current online safety regulations; social media companies could revise their policies and algorithms; financial service providers could suspend credit card payments; and online users could try a social media detox and stop reading and sharing exploitation content online. Eliminating inappropriate content completely will always be challenging, but the next decade is likely to witness considerable progress if society is willing to take responsibility and change for the sake of the online community.



Barclay, C. (2017). Cybercrime and legislation: a critical reflection on the Cybercrimes Act, 2015 of Jamaica. Commonwealth Law Bulletin, 43(1), 77–107. https://doi.org/10.1080/03050718.2017.1310626

Berman, R., & Katona, Z. (2020). Curation Algorithms and Filter Bubbles in Social Networks. Marketing Science (Providence, R.I.), 39(2), 296–316. https://doi.org/10.1287/mksc.2019.1208

Camacho, S., Hassanein, K., & Head, M. (2018). Cyberbullying impacts on victims’ satisfaction with information and communication technologies: The role of Perceived Cyberbullying Severity. Information & Management, 55(4), 494–507. https://doi.org/10.1016/j.im.2017.11.004

CE Noticias Financieras. (2020, Dec 11). Mastercard and Visa distance the page ‘Pornhub’ by illegal content. http://ezproxy.library.usyd.edu.au/login?url=https://www.proquest.com/wire-feeds/mastercard-visa-distance-page-pornhub-illegal/docview/2469588452/se-2

Chang, J., & Joohee, K. (2021). Nth Room Incident in the Age of Popular Feminism: A Big Data Analysis. Azalea (Cambridge, Mass.), 14(14), 261–287. https://doi.org/10.1353/aza.2021.0016

Cusumano, M. A., Gawer, A., & Yoffie, D. B. (2021). Can self-regulation save digital platforms? Industrial and Corporate Change, 30(5), 1259–1285. https://doi.org/10.1093/icc/dtab052

Du, K., Fan, Y., Lu, S., Zhou, G., & Zhuge, J. (2020). A Market in Dream: the Rapid Development of Anonymous Cybercrime. Mobile Networks and Applications, 25(1), 259–270. https://doi.org/10.1007/s11036-019-01440-2

Gaidosc, T. (2018). The industrialization of cybercrime. Finance & Development, 55(2), 22–25.

Kshetri, N. (2010). The Global Cybercrime Industry: Economic, Institutional and Strategic Perspectives (1. Aufl.). Berlin, Heidelberg: Springer-Verlag. https://doi.org/10.1007/978-3-642-11522-6

Lepik, K., & Murumaa-Mengel, M. (2019). Students on a Social Media “Detox”: Disrupting the Everyday Practices of Social Media Use. Communications in Computer and Information Science, 989, 60–69. https://doi.org/10.1007/978-3-030-13472-3_6

O’Hara, K., & Hall, W. (2018). Four Internets: The Geopolitics of Digital Governance (No. 206). Centre for International Governance Innovation. https://www.cigionline.org/publications/four-internets-geopolitics-digital-governance

Oldfield, J. C., & McDonald, D. (2022). “I Am That Girl”: Media reportage, anonymous victims and symbolic annihilation in the aftermath of sexual assault. Crime, Media, Culture, 18(2), 223–241. https://doi.org/10.1177/17416590211002246