
"Ninja Hacker" by dustball is licensed under CC BY-NC 2.0
What happens online?
Problematic and illicit content circulates constantly and in vast quantities across digital platforms. It is a serious issue for the health and safety of internet users, especially children and youth, and it raises the question of who is responsible for monitoring this content.
Cyberbullying, harassment and hate comments are only one part of the problem; the internet has a much more sinister underbelly. For example, challenges circulate that target children and youth and aim to push them into dangerous activities, such as the Momo challenge, which encourages them to watch horror movies or even engage in self-harm or suicide (Dickson, 2019).
Children and youth often do not deliberately seek out inappropriate digital content; rather, they find it inadvertently, by clicking on unknown links in emails or websites, clicking on pop-ups in online games, or mistyping web addresses (Child Safety Hub, 2021).
Abhorrent content on the internet has gone so far that terror attacks and suicides have been live-streamed and shared around, seen by thousands of people (Johnston, 2019). Even the immediate removal of footage can turn the recording into an object of morbid fascination.
The porn industry is also a huge part of the problem. With thousands of videos shared each day on platforms that are hard to monitor, videos of torture and child abuse can slip through unnoticed. Porn companies can even profit from child abuse content shared on their platforms (Belanger, 2022).
Illicit digital content, from cyberbullying all the way to sex trafficking, has gotten out of hand on the internet, and citizens are calling for someone to take responsibility for putting a stop to it.
Who should be responsible?
Individuals and third-party reporting software

In a perfect world, citizen responsibility would be the ideal solution; current internet use, however, has shown that it is not enough on its own. Part of any individual-responsibility solution should therefore include signing up to third-party software that can monitor and filter inappropriate and illicit content.
Many agencies currently advise people on how to remove damaging content from the internet as individuals, such as Australia’s eSafety Commissioner or Minc, a legal resource centre. These sources suggest ways internet users can handle problematic and illicit content on their own, such as blocking trolls and reporting illegal behaviour (eSafety Commissioner, 2022; Minc, 2022). This sounds ideal, yet it has not been effective, and internet users are still seeking help. This is where third-party software comes in.
Various third-party content monitoring tools allow either individual users or entire websites to add a domain-based network element filter (Kupfer, 2015). Such a filter lets users monitor elements from a specific domain or subdomain, either broadly or by censoring specific elements.
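To make the idea concrete, here is a minimal sketch of a domain-based filter. The blocklist, domain names and function names are invented for illustration; real monitoring tools are considerably more sophisticated.

```python
from urllib.parse import urlparse

# Hypothetical blocklist; a real tool would load and update this dynamically.
BLOCKED_DOMAINS = {"badads.example", "tracker.example"}

def is_blocked(url: str) -> bool:
    """True if the URL's host is a blocked domain or one of its subdomains."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

def filter_page_elements(resource_urls: list[str]) -> list[str]:
    """Keep only elements whose source domain is not on the blocklist."""
    return [u for u in resource_urls if not is_blocked(u)]

page = [
    "https://cdn.example.org/style.css",
    "https://ads.badads.example/banner.js",  # subdomain of a blocked domain
    "https://tracker.example/pixel.gif",
]
print(filter_page_elements(page))  # only the first URL survives
```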
Unfortunately, specifying which content such a filter should block can be difficult, as internet users often substitute characters, such as the @ symbol for ‘a’, to spell out blocked words while evading the filter; third-party software therefore cannot be the principal content monitoring system. And while this technology is extremely useful as part of a solution, third parties cannot be held responsible for the actions of users on other websites.
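The sketch below, using an invented blocklist and substitution table, shows why naive keyword matching fails against character substitution, and how a simple normalisation step catches the most common tricks; determined users can, of course, invent new substitutions faster than any table can grow.

```python
# Illustrative only: a naive word filter plus a normalisation step
# that undoes common character substitutions before matching.
SUBSTITUTIONS = str.maketrans({"@": "a", "3": "e", "0": "o", "$": "s", "1": "i"})
BLOCKED_WORDS = {"hate"}  # placeholder blocklist

def contains_blocked(text: str, normalise: bool = True) -> bool:
    """Check whether any word in the text is on the blocklist."""
    if normalise:
        text = text.translate(SUBSTITUTIONS)  # "h@te" -> "hate"
    return any(word in BLOCKED_WORDS for word in text.lower().split())

print(contains_blocked("h@te speech", normalise=False))  # False: naive filter misses it
print(contains_blocked("h@te speech"))                   # True: normalisation catches it
```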
Website owners

Most websites where you create an account prompt the user to agree to the company’s terms and conditions. Within these terms and conditions lie rules stating that users will be barred from the website if any of their content is deemed inappropriate. While this creates a legally binding contract between the user and the web company, the terms and conditions are often too long for users to read, and users frequently violate terms they agreed to (Bateman, 2022). The terms and conditions also provide a convenient scapegoat for companies to dodge responsibility for inappropriate content their users may post.
Social media companies declare that they exist for “free speech and assume no responsibility for what their users communicate” (Esade, 2020). It is part of the ethos of companies such as Facebook to allow their users the ‘freedom’ to express themselves in whatever way they want. It is not in Facebook’s interest to monitor its users, as doing so could drive them to a new platform, an outcome Facebook clearly does not want.
Many social media platforms already have features that allow users to block specific content; however, these settings must be configured by the user and are not turned on automatically at sign-up. On websites such as Pornhub, such blocking tools are all but useless, since sharing explicit content is the whole point of the site, and this content is much harder to control. Many people are calling for porn websites to be shut down entirely, but if even one is taken down, more will pop up in its place (McCoy, 2020). While this content is therefore unlikely to be removed, a possible way to minimise its reach would be to prohibit users from sharing videos from the site.
Government
Having governments control what is appropriate for the users in their country can lead down a dangerous road to censorship. A government dictating what people use the internet for limits the free speech that many people associate with the internet, and indeed with a free society (Debating Europe, 2019). A system could therefore be implemented that is established as part of a government solution but acts independently of it. This would allow some form of internet monitoring without giving sole power to the government.

The Australian government has already written legislation to combat cyberbullying and online harm, and individuals who post this sort of content online can be taken to court (Biddington, 2022). This is a huge step toward disincentivising individuals from posting abhorrent content online; however, it is harder to hold people accountable in the darker parts of the web.
International companies also pose a major issue for governments, as they hold a significant amount of power in the eyes of the public. For example, Facebook held Australia to ransom in 2021 by turning off parts of its platform for several days after Australia introduced new legislation to make the platform pay for linking to news stories (Stokel-Walker, 2021). One of the major strengths of the global internet is the universality it is built on, and when countries introduce differing rules, that unity is seen to erode. The relationship between governments and web companies is therefore clearly a complex one.
Working together

No one party is responsible for the content that circulates online. While no party discussed above holds sole responsibility, all of them are part of the solution. Individuals can do their part by reporting problematic content they see online, whether to the platform itself or to a third-party reporting system.
Online platforms themselves can step up their monitoring and add more features that let individuals block hate comments and report content to the platform.
Governments can hold content makers accountable for what they publish in the digital sphere, and can build relationships with large web companies so that together they can make the internet a safer space.
As the internet grows, so too does the problematic and illicit content posted to it daily. While there are measures that can restrict access to this content, applying them is like fighting many small fires on different fronts, and wide-ranging, drastic action will be needed to remove or even defuse problematic and illicit content. Solutions may seem too difficult to implement at present, but if they develop in parallel with the continued growth of the internet, our ability to understand and oversee it grows too. It is imperative, however, that any solution addresses who is responsible for stopping illicit content, and how, because no comprehensive global solution yet exists.
References
Are Social Media Companies Responsible for the Misuse of Their Platforms If It Results in Slander, Abuse or Injury? (2021). Retrieved 13 October 2022, from https://www.johnnyphillipslaw.com/blog/social-media-companies-responsible-for-slander-abuse-or-injury-on-platform
Bateman, R. (2022). Legal Issues with User Generated Content – TermsFeed. Retrieved 13 October 2022, from https://www.termsfeed.com/blog/legal-issues-user-generated-content/
Belanger, A. (2022). Accused of profiting on child sexual abuse, Visa halts Pornhub ad payments. Retrieved 13 October 2022, from https://arstechnica.com/tech-policy/2022/08/visa-pauses-pornhub-ad-payments-ahead-of-child-sexual-abuse-trial/
Biddington, M. (2022). Regulation of Australian online content: cybersafety and harm. Retrieved 11 October 2022, from https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/pubs/BriefingBook46p/Cybersafety
Dickson, E. (2019). Momo Challenge: Why Parents Are Freaking Over This New ‘Game’ – Rolling Stone. Retrieved 11 October 2022, from https://www.rollingstone.com/culture/culture-news/what-is-momo-challenge-800470/
eSafety Commissioner – Responsibility. (2022). Retrieved 13 October 2022, from https://www.esafety.gov.au/educators/classroom-resources/young-and-esafe/responsibility
INTERNET SAFETY: PARENTS: Offensive or Illegal Content – Child Safety Hub. (2021). Retrieved 13 October 2022, from https://www.childsafetyhub.com.au/internet-safety-parents-offensive-or-illegal-content/
Johnston, N. (2019). Christchurch attack: the dark web of terrorism as entertainment | Lowy Institute. Retrieved 13 October 2022, from https://www.lowyinstitute.org/the-interpreter/christchurch-attack-dark-web-terrorism-entertainment
Kupfer, M. (2015). Third-Party Content Monitoring – Dotcom-Monitor Web Performance Blog. Retrieved 13 October 2022, from https://www.dotcom-monitor.com/blog/2015/06/29/third-party-content-monitoring/
McCoy, M. (2020). Pornhub Removes Millions of Videos, But Many Are Calling for an Entire Shutdown – NC Family Policy Council. Retrieved 13 October 2022, from https://www.ncfamily.org/pornhub-removes-millions-of-videos-but-many-are-calling-for-an-entire-shutdown/
Minc, A. (2022). 12 Steps For Removing Content From the Internet – Minc Law. Retrieved 13 October 2022, from https://www.minclaw.com/removing-content-online/
Esade. (2020). Should Social Media Platforms Be Regulated? Forbes. Retrieved 13 October 2022, from https://www.forbes.com/sites/esade/2020/02/10/should-social-media-platforms-be-regulated/?sh=43f1aa333370
Stokel-Walker, C. (2021). Facebook vs Australia and the new battle to cut big tech down to size. Retrieved 11 October 2022, from https://www.newscientist.com/article/mg25033320-800-facebook-vs-australia-and-the-new-battle-to-cut-big-tech-down-to-size/
Who’s responsible for keeping illegal content off the internet? – Debating Europe. (2019). Retrieved 13 October 2022, from https://www.debatingeurope.eu/2019/05/14/whos-responsible-for-keeping-illegal-content-off-the-internet/#.Y0ihF-xBzb2