Who should be responsible for stopping the spread of online bullying, harassment, violent content, revenge porn and other problematic content that circulates on digital platforms?

Should it be the Government, the Intermediaries, the Users or all of the above?

Anonymous Mask and Interface by TheDigitalArtist, licensed under Pixabay via Canva (n.d.).

Content moderation within the ever-growing parameters of the internet is one of the many difficulties that social media has brought to the digital era. The argument over who should take responsibility for these growing matters of contention raises further questions and barriers, because this is a global issue on which each country holds differing views; a universal understanding will be difficult, but not impossible, to reach. So, should it be the Government, the intermediaries, the users, or all of the above? This piece argues that only a harmonious and compliant relationship between all three actors will deliver justice for victims and put in place the preventative measures every digital platform needs to moderate issues such as cyberbullying, violence and revenge porn.


Case study || The repercussions of failing to stop the spread  

TED Talk Video ||

This 2018 TED Talk by Darieth Chisolm describes her personal experience of injustice caused by the significant lack of Government policy around revenge porn, and how this legislative failure allowed the repercussions of her ex-boyfriend's actions to fall on her, the victim. She lost her job and her self-esteem, and suffered from chronic anxiety and depression. Once posted, the images were shared by many users; the platform did nothing to help her, and neither did the Government, which had no policy governing or preventing revenge porn on the internet. All three of the parties that could have stopped the spread of this content failed her. She does note that, after many months of publicly reaching out online, users, the intermediary where the images were posted and the Government eventually came to her aid, but the response was badly delayed. This is a prime example of where the responsibility lies with user, intermediary and Government alike to stop illicit and criminal behaviour such as this from occurring in the first place.

Her story describes the US Government offering only a flimsy civil misdemeanour as punishment for what the perpetrator had done: a $500 fine. After an invasive investigation by authorities, he was eventually arrested and convicted under the malicious communications provisions of Jamaica's Cybercrimes Act 2015; nothing further, however, came from the US authorities.

Digital communication and virtual technology global network by khunkorn via Canva (n.d.), unlicensed.

 

The main issues || Who should take responsibility?

The internet has grown at an exponential rate and has quickly outgrown nearly all current governance functions around the globe, allowing situations like Darieth Chisolm's to become a real systemic issue (DeNardis 2014, Ch. 10). Put simply, legislative change is too slow for social media, and this has allowed cyberbullying, violence and revenge porn to run rampant with little to no authority to monitor or prevent them. The three actors must work side by side, because they are linked in a kind of domino effect: if there is no adaptive legislation, the intermediary has nothing with which to stop the user from saying or doing whatever they please (Kohl 2012).

Government || 

One of the issues Mueller (2017) raises about Government legislation of the online space is the risk of the 'splinternet', and the economic repercussions of imposing strict content moderation on users: limiting their usage and activity would see businesses lose significant revenue. What is needed is a firm policy stating that illicit behaviour will not be tolerated, regardless of the scrutiny it may attract in public discourse, yet one not so restrictive that it brings about the 'splinternet' (Gillespie 2018) or heightens the power of surveillance capitalism, as noted by Zuboff (2015).

Intermediaries ||

The main reason intermediaries do not take responsibility is the difficulty of surveilling content and pinpointing where these problems originate, along with complying with each country's differing rules and regulations (Gillespie 2017). It is hard to trace where the issues stem from and to put preventative measures in place without causing chaos among users and policies, especially as 'techlash' becomes more prevalent in the online community (Hemphill 2019).

 

User ||
There are two extremes among users. They can be completely remorseless when it comes to online bullying, violence or revenge porn because they can hide behind a screen (Blackwell et al 2019), or they are the victims of these actions with no backing from intermediaries or Governments to support them. Users are the ones initiating these behaviours, so the responsibility falls on them to abide by Government policy as well as the terms and conditions of the platforms they use. Yet if the other two actors do not hold up their end, users are free to act as they please.


The Result || How can we stop the spread?

Section 230 ||

 

To stop the spread of these issues there are a few preventative measures that can be put in place. One has been Section 230 in the US, which shields intermediaries from liability for user-generated content while allowing them, in good faith, to restrict access for users who engage in defamatory or otherwise objectionable behaviour towards other patrons; yet on its own it is not quite enough as a form of prevention (DeNardis 2014, Ch. 7).

This piece proposes that, although each country has differing policies on these issues, there should be a universal agreement on educating users, so that perpetrators face comparable punishment and victims receive comparable justice. Australia currently has an eSafety Commissioner overseeing online safety, and perpetrators can serve three to ten years in prison if found guilty, yet other countries differ on the severity of the crime, if they recognise it at all. Intermediaries do attempt to inform users about what to do if they are a victim or a bystander, as seen on Microsoft's cyberbullying and harassment support page, which is a step forward but still not enough to count as taking responsibility for their content. And as Blackwell et al (2019) demonstrated in their study of harassment in social virtual reality, users spoke far more freely, with little regard for the violent harassment they were directing at others; the theory is that being behind a screen gave them the confidence to commit these injustices, a worrying finding that requires immediate attention from intermediaries and Governments.

To stop the spread, all of the actors need to take equal responsibility and work together to educate users and reform what little governance exists in the online world. If every Government exercised its power in a way that can be morally justified (Suzor et al 2018), intermediaries could abide by those rules and set their own to ensure their users are informed, educated and held equally accountable for their actions on a global scale; only then would these issues be met with a universally fair scheme of justice.

 

Conclusions ||

Each country has differing laws, and differing beliefs about its democracy, regarding these issues, as demonstrated by Darieth Chisolm's story. The problems are common everywhere, but some countries will aid the victim and others will not, depending on their current legislation; in a society as interconnected as ours, a universal understanding of these problems should be beyond dispute. The responsibility lies equally with Governments, to keep their policies up to date with current online societal requirements; with intermediaries, to ensure their users are not violating laws or committing social injustices; and with users, to make informed decisions as active participants on the internet. All have an equal part to play in addressing these issues and preventing them from continuing, and action must be faster and more effective across the board for any real change to occur.

 

 

 


Citations ||

Blackwell, L., Ellison, N., Elliott-Deflo, N., & Schwartz, R. (2019). Harassment in Social Virtual Reality: Challenges for Platform Governance. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–25. https://doi.org/10.1145/3359202

Chisolm, D. (2018). Why we need revenge porn safeguards [Video]. TEDxPittsburgh, YouTube. Retrieved from https://www.youtube.com/watch?v=C2a1AWdsUPI

Youth Law Australia. (n.d.). Cyberbullying. Retrieved from https://yla.org.au/nsw/topics/internet-phones-and-technology/cyber-bullying/

DeNardis, L. (2014). The global war for internet governance. Yale University Press.

 

Microsoft. (n.d.). Frequently asked questions – cyberbullying and harassment. Retrieved from https://support.microsoft.com/en-us/topic/frequently-asked-questions-cyberbullying-and-harassment-9c810c30-ba27-a00a-5e2c-b900d6c4fc0d

Gillespie, T. (2017). Regulation of and by platforms. In The SAGE handbook of social media (pp. 254–278). SAGE.

Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. https://doi.org/10.12987/9780300235029

Office of the Director of Public Prosecutions, Jamaica. (2015). Guidelines for prosecuting cases involving malicious communications: Section 9 of the Cybercrimes Act of Jamaica, 2015. Retrieved from https://dpp.gov.jm/sites/default/files/news/Guidelines%20Prosecuting%20section%209%20Offences.pdf

Hemphill, T. A. (2019). “Techlash”, responsible innovation, and the self-regulatory organization. Journal of Responsible Innovation, 6(2), 240–247. https://doi.org/10.1080/23299460.2019.1602817

Kamal, M., & Newman, W. (2016). Revenge pornography: Mental health implications and related legislation. Journal of the American Academy of Psychiatry and the Law, 44(3). Retrieved from https://jaapl.org/content/44/3/359

Mueller, M. (2017). Will the internet fragment? Sovereignty, globalization and cyberspace. Polity Press.

Learn Liberty. (2020). Section 230 – what does it mean to you? [Video]. YouTube. Retrieved from https://www.youtube.com/watch?v=VltcZRiSM7M

VPRO Documentary. (2019). Shoshana Zuboff on surveillance capitalism [Video]. YouTube. Retrieved from https://www.youtube.com/watch?v=hIXhnWUmMvw

Suzor, N., Van Geelen, T., & Myers West, S. (2018). Evaluating the legitimacy of platform governance: A review of research and a shared research agenda. International Communication Gazette, 80(4), 385–400. https://doi.org/10.1177/1748048518757142

Kohl, U. (2012). The rise and rise of online intermediaries in the governance of the Internet and beyond – connectivity intermediaries. International Review of Law, Computers & Technology, 26(2–3), 185–210. https://doi.org/10.1080/13600869.2012.698455

Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89.