Modern communication systems have undoubtedly changed the way we interact with others and with ourselves. Digital platform services such as social media and internet search engines enable individuals to easily obtain information on just about anything, whether a product or a user. Platforms today maintain an environment where participation is at the center of their service (Tombleson & Wolf, 2017), following the model of a sharing economy in which the internet was built on cultivating innovation, collaboration, and cooperation (John, 2018). When Section 230 of the Communications Decency Act was passed in 1996 (Gillespie, 2018), platforms were envisioned as an open and fair medium for marginalized and suppressed voices and as a way to protect users’ freedom of speech (Brannon, 2019). However, with increasing misuse of this freedom, platforms have also opened the door to bullying, harassment, violent content, hate, pornography, and other illicit content.
“What goes on the internet, stays on the internet”
With the rise of social media platforms like Instagram, Twitter, and Facebook, it has become easier than ever for users to connect online, even if they do not know one another personally. Users from across the globe can find your profile and obtain personal information such as your age, your likes and dislikes, and where you live with just a few clicks. The idea of digital permanence, that “what goes on the internet, stays on the internet,” has created opportunities for individuals to misuse digital storage for illicit content. Digital permanence can also affect employment: many individuals have lost job opportunities simply because of questionable remarks they made online about certain groups in society. The problem, then, lies in who should be in charge of regulating these online platforms to ensure that a negative digital footprint is never left on the internet. Is it the responsibility of networked users? Of the platform itself? Or of the government?
Death by cyberbullying
Following the cyberbullying-linked deaths of two Korean pop stars in 2019, professional Korean volleyball player Kim In-hyeok was found dead in his home on February 4th, 2022, leaving a note that reflected pessimistically on his life. Kim had been the target of cyberbullying: countless users left negative comments on his Instagram account questioning his feminine looks and speculating about his sexual identity. Though Kim had previously addressed the speculation in a post on his account, the public did not think it was enough and continued to leave hateful comments, driving him to his death.
It is important to note that cyberbullying is not only prevalent in South Korea. It is prevalent across the globe, with 41% of American adults having experienced online harassment at least once in their life (Pew Research Center, 2021). Hana Kimura, a pro wrestler and star of the Japanese Netflix reality series “Terrace House,” took her own life in 2020 after receiving hateful messages online about her appearance on the show. She was criticized for her darker complexion as well as her “manly” persona, and revealed that she was receiving up to 100 hate messages a day, some even telling her to “go die” (Marks, 2020). In response, the Japanese government reviewed its laws regarding cyberbullying and increased the legal punishment from 30 days’ detention and a fine of $75 USD to one year’s imprisonment and a fine of $2,250 USD. The statute of limitations for prosecution was also extended from one year to three years. The issue of cyberbullying is only becoming more prevalent in contemporary society: a study by Maurya et al. (2022) found that young adults who had been victims of cyberbullying were twice as likely to experience suicidal thoughts as those who had never been victims.
Enabling access to violent content
With the constant innovation of social media and its expanding affordances, the ability to broadcast a live video stream is now built into numerous platforms, including Instagram, Facebook, and TikTok. On March 15th, 2019, Brenton Tarrant, an Australian far-right extremist, fatally shot 51 people in the span of 36 minutes at two mosques in Christchurch, the deadliest terrorist attack in New Zealand’s history (Macklin, 2019). Tarrant live-streamed the attack on Facebook, where around 200 people watched it as it unfolded. None of these individuals reported the video to Facebook, and the first user report came only 29 minutes after the stream started (Macklin, 2019). Moreover, the live stream was viewed over 4,000 times before Facebook was able to remove it completely from its site (Macklin, 2019). Additionally, because the stream was reported “for reasons other than suicide,” the reports were not immediately reviewed and followed a different procedure that did not prioritize the removal of the video (Macklin, 2019). Tarrant’s live-streamed terrorist attack calls attention to how flawed the moderation systems of social media platforms are.
So who’s responsible for these deaths?
The above real-life cases of cyberbullying and violent content on social media only sharpen the question of who should be held responsible. Though the South Korean and Japanese governments responded to the cyberbullying cases by reviewing and changing their laws, these incidents might have been prevented if such negative comments were not so easy to post in the first place. At the same time, implementing stricter laws on online behavior may be a good start for other nations to follow. Making governments responsible may help with blocking content deemed illegal or offensive. However, with the internet built on freedom of speech, blocking each and every piece of illegal or offensive content may result in what Palfrey (2010) calls ‘overfiltering’, where instead of blocking a single piece of content, the government blocks the platform as a whole. This is evident in China’s Great Firewall, through which the country has banned Facebook, Instagram, Google, YouTube, and most Western sites in order to fully control what news and information is put out and circulated in China, replacing these sites with domestic platforms such as Weibo in place of Twitter and Douyin in place of TikTok. This approach has proven too restrictive, with the country placing last on the press freedom index for two consecutive years as of 2016 (Xu & Albert, 2017).
Responsibility for curbing illicit online content does not fall on governments alone; the platforms themselves should also be held responsible. The need to revise Section 230 of the Communications Decency Act is only becoming more urgent with the constant misuse of social media platforms. The law was written without social media in mind (Gillespie, 2018): its limits were drawn for a web largely populated by ISPs and amateur web publishers (Gillespie, 2018). The problem is that the safe harbor shields platforms from liability for what their users post or distribute, as long as the platforms have no ‘actual knowledge’ of the illicit material and did not produce or initiate it (Gillespie, 2018). In other words, under the law, social media platforms are not responsible for what goes up on their platforms. However, there is still work that can be done to minimize the impact and spread of illicit content online. Platforms can use algorithms to moderate content before it is uploaded to the internet. The technology already exists; platforms just need to refine and employ it. For example, technologies like PhotoDNA and Content ID can remove content automatically without human intervention, which is highly useful in cases of child pornography (Paul & Reininger, 2021).
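The core idea behind tools like PhotoDNA is hash matching: an uploaded file is fingerprinted and checked against a database of fingerprints of known illicit material before it is ever published. The sketch below is a simplified, hypothetical illustration of that pre-upload check; it uses a plain SHA-256 hash, which only catches byte-identical copies, whereas production systems use robust perceptual hashes that survive resizing and re-encoding. The blocklist contents and function names here are invented for illustration.

```python
import hashlib

# Hypothetical blocklist: fingerprints of files already identified as
# illicit. Real systems (e.g. PhotoDNA) populate this from shared
# industry databases and use perceptual rather than cryptographic hashes.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-illicit-file-bytes").hexdigest(),
}

def should_block(upload: bytes) -> bool:
    """Return True if the upload's fingerprint matches a known-bad hash."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_BAD_HASHES

# An upload pipeline would run this check before publishing,
# so a match never reaches other users or a human moderator.
print(should_block(b"known-illicit-file-bytes"))  # True
print(should_block(b"harmless-cat-photo"))        # False
```

The point of the design is timing: because the check runs before publication and requires no human review, a match is stopped automatically rather than removed after thousands of views, as happened with the Christchurch live stream.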
Above all, networked users should also be held accountable for their online activities. Because platforms allow users to stay anonymous, it has become easy to abuse this privilege to attack others. Though putting algorithmic systems in place to regulate the online space may be beneficial, they will simply be overwhelmed if users continue flooding the internet with negative content. Hence, perhaps the only way to prevent such situations is for users to avoid posting anything illicit in the first place, all the more so given the threat of digital permanence. Users should practice an ethical online presence as the first step of prevention when spreading information online, especially in a society where content can spread to the masses in a few seconds.
Social media has undoubtedly brought convenience to the way we live and provided countless opportunities to improve our quality of life. However, concerns about online safety should be addressed promptly in order to minimize the impacts of cyberbullying, harassment, violent content, hate, pornography, and other problematic content. The responsibility of regulation does not belong solely to one party, nor does it weigh more heavily on any one of them; rather, it is distributed across three parties: the government, the platforms, and the networked users. All three should work together to build an ethical online environment where each and every user can feel safe.
Anonymity and identity shielding. (n.d.). eSafety Commissioner.
Brenton Tarrant. (n.d.). Counter Extremism Project.
Brzeski, P. (2022, June 15). Japan Tightens Cyberbullying Law After Hana Kimura Death. The Hollywood Reporter.
Cho, J. (2019, December 1). Deaths of Goo Hara and Sulli highlight tremendous pressures of K-pop stardom. ABC News. https://abcnews.go.com/International/deaths-goo-hara-sulli-highlight-tremendous-pressures-pop/story?id=67303374
Digital Platform. (n.d.). Medium.
Ghosh, D. (2021, January 14). Are We Entering a New Era of Social Media Regulation? Harvard Business Review.
Gillespie, T. (2018). Governance by and through platforms. In J. Burgess, A. Marwick & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254–278). SAGE.
Hana Kimura: Netflix star and Japanese wrestler dies at 22. (2020, May 23). BBC. https://www.bbc.com/news/world-asia-52782235
Iyer, R. (2022). Professional volleyball player Kim In-hyeok found dead after battling hate comments about his looks. Sportskeeda.
John, N. A. (2018). Sharing economies. In The Age of Sharing (pp. 69–97). Polity.
Krahé, B., Möller, I., Huesmann, L. R., Kirwil, L., Felber, J., & Berger, A. (2011). Desensitization to Media Violence: Links With Habitual Media Violence Exposure, Aggressive Cognitions, and Aggressive Behavior. Journal of Personality and Social Psychology, 100(4), 630–646. https://doi.org/10.1037/a0021711
Macklin, G. (2019, July). The Christchurch Attacks: Livestream Terror in the Viral Video Age – Combating Terrorism Center at West Point. Combating Terrorism Center. https://ctc.westpoint.edu/christchurch-attacks-livestream-terror-viral-video-age/
Maurya, C., Muhammad, T., Dhillon, P., et al. (2022). The effects of cyberbullying victimization on depression and suicidal ideation among adolescents and young adults: A three year cohort study from India. BMC Psychiatry, 22, 599. https://doi.org/10.1186/s12888-022-04238-x
McCurry, J. (2022, February 9). South Korea under pressure to crack down on cyberbullying after high-profile deaths. The Guardian.
Moore, D. (2021, March 18). Once on the Internet, always on the Internet | News | normantranscript.com. The Norman Transcript.
Morell, C. (2021, January 26). Social Media Platforms Must be Held Accountable for Illicit Content. Newsweek.
Paul, C., & Reininger, H. (2021, July 20). Platforms Should Use Algorithms to Help Users Help Themselves. Carnegie Endowment for International Peace. https://carnegieendowment.org/2021/07/20/platforms-should-use-algorithms-to-help-users-help-themselves-pub-84994
Section 230 of the Communications Decency Act. (n.d.). Electronic Frontier Foundation. https://www.eff.org/issues/cda230
Sharing economy | ACCC. (n.d.). Australian Competition and Consumer Commission. https://www.accc.gov.au/consumers/online-shopping/sharing-economy
South Korea Set To Introduce Cyberbullying Laws In The Wake Of K-pop Suicides. (n.d.). Cybersmile. https://www.cybersmile.org/news/south-korea-set-to-introduce-cyberbullying-laws-in-the-wake-of-k-pop-suicides
Tombleson, B., & Wolf, K. (2017). Rethinking the circuit of culture: How participatory culture has transformed cross-cultural communication. Public Relations Review, 43(1), 14–25. https://doi.org/10.1016/j.pubrev.2016.10.017
Vogels, E. A. (2021, January 13). The State of Online Harassment. Pew Research Center. https://www.pewresearch.org/internet/2021/01/13/the-state-of-online-harassment/
Zutshi, A., Nodehi, T., Grilo, A., & Rizvanović, B. (2019). The evolution of digital platforms. https://doi.org/10.1201/9780429280818-3