Introduction: The Internet and its Content
While digital landscapes enable individuals to connect and communicate beyond physical borders, they are also a vehicle for sharing and disseminating problematic and harmful content (Beauchere, 2014). To keep the internet safe and free from harmful material, many groups are involved in moderating online content, all of whom play an important role.

Joseph S. Nye Jr (2016) explains that the lack of a centralised network is fundamental to the internet’s culture, history, and community: ‘everyone and no one’ controls, operates and maintains the internet (Nye, 2016). This lack of centralisation means that the internet and its vast content cannot simply be ‘shut down’ or switched off from one central hub. Instead, various bodies work together to ensure that the internet remains safe and reflects a society’s values and beliefs (Castells, 2011). Politicians, governments, independent institutions and communal bodies share similar ideologies about keeping the internet a safe and regulated space, although their content moderation methods may differ (Bonnici & Mestdagh, 2005; Akdeniz, 2003).
What is Problematic Online Content?
Problematic digital content is defined as any content that may harm a community or an individual, such as harassment, bullying, hate speech, or content that is offensive or pornographic (Spada, 2014; Massanari, 2017). Problematic online content may not always be explicitly illegal; instead, it goes against the cultural and social expectations of digital landscapes while having detrimental effects on the well-being and health of individuals (Kırcaburun et al., 2019).

Internet Moderation and Internet Culture
Freedom of speech is fundamental to the democratic system that has shaped much of the Western world (Carmi, 2008) and was extremely influential in the creation of the internet (Pandey et al., 2022). The principles of free speech inform how internet content is moderated and guide the various bodies involved in content regulation (Pandey et al., 2022).
In digital spaces, a moderation culture exists beyond governmental or private organisations, as communities come together to control content in their own way (Castells, 2011). This style of moderation is fundamental to the culture of online spaces and is reflective of early internet landscapes. Castells (2011) explains that while technological freedom was at the centre of early internet culture, there was also an emphasis on the community and collective nature of digital spaces. Within the early internet, ‘collective construction transcended individual gain’ (Castells, 2011), and the internet was seen as a place where ideas and beliefs could be both shared and respected. Early internet communities were passionate about free speech and believed it had to be protected online, although these views would adapt as the internet transformed. As the internet changed throughout the 21st century, so did online communities’ relationship with freedom of speech, as problematic online content began to threaten the rights of both individuals and collectives.

Is Anyone Up?
In 2010, Hunter Moore created the revenge porn website ‘Is Anyone Up’. The site shared sexually explicit images of mostly women, often alongside their personal information such as their address, their workplace or their family members’ names (Simpson, 2014). The images shared on ‘Is Anyone Up’ were posted with the intent to harm, humiliate and shame individuals and were essentially impossible to have removed from the site (Stokes, 2014). At the time, there were no legal ramifications for posting or sharing ‘revenge porn’, a term describing the non-consensual sharing of sexually explicit photos as a form of revenge (Stokes, 2014). Individuals attempted to contact police about the sharing of these images, but law enforcement and government agencies still had little understanding of the internet and how to control or navigate illegal activity online (Kranc, 2022). ‘Is Anyone Up’ was harming people beyond online spaces, as many women lost their jobs and their mental health was severely affected (Kranc, 2022).
“Bringing Down the Revenge Porn King”, by Vice World News. All Rights Reserved. Retrieved from https://www.youtube.com/watch?v=yHZTyU8bRRw&ab_channel=VICE
The site did not reflect the inherent cultural values of digital spaces: it did not contribute to the advancement of the internet and directly profited from stripping people of their freedom of consent. While no one was legally responsible for removing the site, various digital communities and individuals believed they had a duty to take it down to preserve the rights of individuals and protect the internet from such abuses of technology (Castells, 2011). While some took legal avenues, James McGibney took a uniquely digital route to stop the site from posting content, purchasing ‘Is Anyone Up’ and redirecting its traffic to BullyVille.com. This action directly stopped Moore from using the site to share problematic content when legal bodies could do nothing to assist (Neil, 2014).
Government Bodies and Content Laws
When the internet was first formed, there was an inherent distrust of and disinterest in political involvement (Leiner et al., 2009). However, as the internet has grown and changed, users often rely on laws created by governmental bodies to regulate what can be done in online spaces. While these laws and policies can prohibit some problematic behaviours online, they are not always sufficient to protect people, due to the borderless and irrepressible nature of the internet. Content laws have begun to emerge across the world in recent years, as governments scramble to meet ever-changing demands. While some laws protect people, such as the anti-revenge porn laws passed in Australia in 2015 (Makela, 2018), many laws created by governments to moderate what is posted online are regressive and overly restrictive (Economic and Political Weekly, 2011).
The Australian government created legislation with the intent to limit harmful content posted online without considering how it would strip many of their right to free speech online (Tonkin, 2021). Further, this legislation could potentially endanger those who are attempting to find safety in online spaces (Morris, 2021). Legislative and governmental bodies often struggle to find the middle ground between stopping harmful online content and stripping individuals of their rights in digital spaces; this is exactly why multiple bodies need to be involved in the moderation process.

Private Organisations’ Moderation of Content

Privately operated social media platforms such as Facebook, Instagram and Twitter have their own terms of service, influenced by various bodies. Castells (2011) notes that technological systems are socially produced, meaning that they are informed by the culture in which they are created. Social media sites are equally informed by economic gain, as their style of moderation is often driven by the desires of advertisers. What Facebook considers problematic content may not be problematic in its context, yet Facebook and other social media sites rarely consider context. As Gillespie (2018) explains, the power of an image often lies in its violation: while it violates one set of norms, it empowers another. Gillespie (2018) specifically explores the removal of an image depicting children burnt by napalm during the Vietnam War. Facebook repeatedly removed this image because it violated their guidelines, although Gillespie (2018) notes that this image needs to be seen and shared to remind us of the horrors of war. If we continue to remove content online without consideration of its context, we directly defy the early internet pioneers’ idea of what the internet is: a place where ideas and information are shared (Castells, 2011).
So, Who is Responsible for Stopping Harmful Online Content?
As Joseph S. Nye Jr (2016) put it, ‘everyone and no one’ is responsible for moderating problematic content online. Individuals, communities, organisations and governments must work together to ensure the internet is a safe environment that still upholds individuals’ right to free speech.
References
Akdeniz, Y. (2003). Controlling illegal and harmful content on the Internet. In Crime and the Internet (p. 125). Routledge.
Leiner, B. M., Cerf, V. G., Clark, D. D., Kahn, R. E., Kleinrock, L., Lynch, D. C., Postel, J., Roberts, L. G., & Wolff, S. (2009). A brief history of the Internet. SIGCOMM Computer Communication Review, 39(5), 22–31. https://doi-org.ezproxy.library.sydney.edu.au/10.1145/1629607.1629613
Beauchere, J. F. (2014). Preventing online bullying: What companies and others can do. International Journal of Technoethics (IJT), 5(1), 69–77.
Caplan, S. E. (2005). A social skill account of problematic Internet use. Journal of Communication, 55(4), 721–736.
Carmi, G. E. (2008). Dignity versus liberty: The two Western cultures of free speech. Boston University International Law Journal, 26, 277.
Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media (p. 134). Yale University Press.
Kırcaburun, K., Kokkinos, C. M., Demetrovics, Z., et al. (2019). Problematic online behaviors among adolescents and emerging adults: Associations between cyberbullying perpetration, problematic social media use, and psychosocial factors. International Journal of Mental Health and Addiction, 17, 891–908. https://doi.org/10.1007/s11469-018-9894-8
Kranc, L. (2022, July 28). Where is Hunter Moore now? The revenge porn criminal of ‘The Most Hated Man in America’ today. Esquire.
Lewis, T. (1998). Who owns the Internet? IEEE Internet Computing, 2(1), 82–84. doi: 10.1109/4236.656087
Neil, M. (2014). ‘Most hated man on the Internet’ is charged with email hacking to get photos for revenge porn site. ABA Journal. Chicago, Illinois: American Bar Association.
Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.
Morris, S. (2021, November 25). Ending online anonymity won’t make social media less toxic. The Conversation. https://theconversation.com/ending-online-anonymity-wont-make-social-media-less-toxic-172228
Mifsud Bonnici, J. P., & De Vey Mestdagh, C. N. J. (2005). Right vision, wrong expectations: The European Union and self-regulation of harmful internet content. Information & Communications Technology Law, 14(2), 133–149.
Napoli, P. M. (2018). What if more speech is no longer the solution? First Amendment theory meets fake news and the filter bubble. Federal Communications Law Journal, 70(1), 55+. p. 74. https://link.gale.com/apps/doc/A539774158/AONE?u=usyd&sid=bookmark-AONE&xid=1f0a3a5c
Nye, J. (2016, August 10). Internet or splinternet? Project Syndicate. https://www.project-syndicate.org/commentary/internet-governance-new-approach-by-joseph-s–nye-2016-08
Pandey, N., Dé, R., & Ravishankar, M. (2022). Improving the governance of information technology: Insights from the history of Internet governance. Journal of Information Technology, 37(3), 266–287. https://doi.org/10.1177/02683962211054513
Song, H., & Lee, S. S. (2020). Motivations, propensities, and their interplays on online bullying perpetration: A partial test of situational action theory. Crime & Delinquency, 66(12), 1787–1808.
Spada, M. M. (2014). An overview of problematic Internet use. Addictive Behaviors, 39(1), 3–6.
“facebook ban (0 of N)” by zipckr is licensed under CC BY 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by/2.0/?ref=openverse.
Regulating Internet content. (2011). Economic and Political Weekly, 46(53), 8. http://www.jstor.org/stable/23065623
Simpson, C. (2014, January 23). Revenge porn king Hunter Moore arrested for hacking email accounts. The Wire.
Stokes, J. K. (2014). The Indecent Internet: Resisting Unwarranted Internet Exceptionalism in Combating Revenge Porn. Berkeley Technology Law Journal, 29, 929–952. http://www.jstor.org/stable/24119960
Tonkin, C. (2021, November 29). Govt wants to end online anonymity. Information Age. https://ia.acs.org.au/article/2021/govt-wants-to-end-online-anonymity.html