Regulation of harmful content dissemination: whose responsibility and how to regulate it?

Bullying, harassment, violent content, hate speech, pornography, and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

The development of new media has brought clear benefits, such as making our lives more convenient and giving us access to vast amounts of information. However, the dysfunctions of new media also negatively affect many aspects of our lives. New media such as YouTube and Instagram do not apply strict review standards the way broadcasters do, so sensational and harmful content pours out every day, and the volume of content is so vast that adults and children alike become addicted to their smartphones and lose interest in the world around them.

Gillespie (2018) called platforms the "custodians of the Internet" and observed that platforms must moderate content even while disavowing that they do so. In the case of minors, whose values have not yet been formed, it is urgent to prepare countermeasures against such media addiction, as it can harm emotional development (Dyson, 2002). So who is responsible for regulating the spread of this negative content? And how should we regulate it?

Can the problem of the spread of harmful content be solved through self-regulation by online platform operators, including social media services? Unfortunately, self-regulation alone has inherent limits. The government should therefore prepare legal and institutional measures to prevent the spread of illegal pornographic and violent content through social media and online platform services (Cusumano et al., 2021). In particular, an institutional mechanism is needed that can hold social media and online platform operators responsible for distributing intentionally manipulated and fabricated information that misleads news consumers. Such manipulated and fabricated fake news is used as a tool for defamation, insult, slander, and incitement; it infringes on individual honour and rights, undermines human dignity and values, and causes great harm to our society.

In addition, biased, distorted, false, and manipulated information that circulates unchecked by social media and online platform operators hinders the formation of sound public opinion and thereby misleads and distorts the objective process of public opinion formation in our society (Karlsson, 2017). The government therefore needs to encourage online platform operators to voluntarily strengthen deliberation by holding them responsible for distributing fake news and illegal pornographic and violent material.

[Image: Self on social media.]

In addition to these measures, users themselves must make an effort to block harmful content. In other words, individual cooperation and effort are needed. It is important not only to be familiar with digital media guidelines but also to understand what role each of us plays in fulfilling our shared responsibility for sound media use (Yee, 2022). Media literacy education needs to be expanded, and users themselves should report harmful content. Therefore, not only legislatures, businesses, and civil society, but everyone who uses online platforms should be involved in the process of arriving at a regulatory solution.

The self-regulation models and major organizations that have developed Internet self-regulation around the world are facing new challenges to their nature and role as the Internet industry matures and the Internet ecosystem changes, and various reforms are being attempted. In what direction, then, has Internet self-regulation developed, and how should it continue to develop?

First, the importance of hotlines among the roles of Internet self-regulatory organizations is increasing. The main functions these organizations took on when they were first established included education, hotlines, policy solidarity, and dispute settlement (Henderson, 1997). Recently, however, hotline operation, which notifies operators of illegal content so it can be deleted, and media education have been strengthened, while the policy solidarity and dispute settlement functions appear to be weakening. In other words, as Internet-related policies mature and the roles of self-regulatory organizations become specialized and subdivided, the policy solidarity function has become hard to find in most countries, and the volume of disputes has grown so explosively that formal legal resolution is increasingly required.

Hotlines have become the representative function of Internet self-regulation because neither government crackdowns nor the monitoring efforts of private organizations alone can control the enormous amount of content on the Internet, and report-driven hotlines have proved highly effective. As the importance and role of hotlines grew, cooperation with government, simplified reporting procedures, and international cooperation were required; as a result, cooperation with government became important for Internet self-regulatory organizations.

Second, among the roles of Internet self-regulatory organizations, demand for media literacy education is growing. In the early days of self-regulation, education did not occupy a large share of these organizations' work. However, as the distribution of illegal and harmful information on the Internet has exploded and its types have diversified, the importance of education is being emphasized again, since governments and Internet operators cannot handle the problem alone (Keck, 2022). The importance of media literacy education at home and school has been emphasized since the Internet first spread, and many countries have made policy efforts toward it. However, as the use and distribution of illegal and harmful content become technically advanced and complicated, education at home or school alone is no longer sufficient, and demand for education by organizations with professional expertise is expanding accordingly.

Notably, one of the major issues in recent Internet self-regulation is how to handle sexual content that is voluntarily created and exchanged by children and adolescents. An analysis by the Internet Watch Foundation (IWF) of 3,803 images and videos of child sexual abuse content from September to November 2014 found that 17.5% of the children depicted were under the age of 15, 85.9% of the content was made using webcams, and 93.1% depicted girls (IWF, 2015).

[Figures from IWF (2015): preliminary breakdown of content by age, and amount of assessed content depicting each gender by age range.]

Unlike "aggravated" content, which can be understood at the level of sexual abuse and crime, how to handle sexual content voluntarily created and exchanged by teenagers is a difficult question, and it is driving growing demand for education by Internet self-regulatory organizations, not just at home and school.

Third, cooperation between governments and related business operators such as Internet Service Providers (ISPs) is becoming important in Internet self-regulation. As mentioned earlier, as sexual content produced voluntarily by teenagers increases and distribution channels such as social media and messengers diversify, how to find and delete such content is emerging as a new concern for governments and Internet self-regulatory organizations. Previously, results were achieved through deletion, education, monitoring, and blacklisting, but preventing the harm caused by the production and distribution of these new types of content now requires more active intervention by ISPs, not merely cooperation from companies (Ferraresi, 2020). As the role of ISPs increases, so does the need for new perspectives and discussions on ISPs' immunity in the changed Internet ecosystem.

The question now is not whether it is reasonable for platforms to regulate content, but in what cases and in what ways they should intervene. The UN Special Rapporteur on Freedom of Expression warned that the consequences of neglecting online hatred could be tragic, and that platforms should recognize that their corporate activities have a profound impact on human rights and take appropriate measures (Kaye, 2019). Specifically, platform companies are asked to explain the standards of their hate speech policies clearly, implement those policies transparently and consistently, provide appeal procedures, and employ measures that limit the rapid spread of posts, mark sources, and restrict profit-seeking while still encouraging expression. A well-established framework for regulating harmful content will ultimately allow governments, businesses, and civil society to share responsibility and cooperate, while also contributing to the continued success of the Internet.


Insil Park Shirley 520189740


References

Cusumano, M. A., Gawer, A., & Yoffie, D. B. (2021, January 15). Social Media Companies Should Self-Regulate. Now. Harvard Business Review. Retrieved from https://hbr.org/2021/01/social-media-companies-should-self-regulate-now

Dyson, R. A. (2002). Missing Discourse on Global Media and Terrorism.

Ferraresi, M. (2020, April 24). ISP and harmful content. Can technology redeem itself? MediaLaws. Retrieved from https://www.medialaws.eu/isp-and-harmful-content-can-technology-redeem-itself/

Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press.

Henderson, J. (1997). Internet content: Can self-regulation work? ABA. Retrieved from http://www.austlii.edu.au/au/journals/AUBAUpdateNlr/1998/43.pdf

IWF. (2015). Emerging Patterns and Trends Report #1: Online-Produced Sexual Content. Retrieved from https://www.iwf.org.uk/media/2saninlk/online-produced_sexual_content_report_100315.pdf

Karlsson, E. (2017, January 11). How to defeat technological filter bubbles that skew your world. Debunking Denialism. Retrieved from https://debunkingdenialism.com/2017/01/11/how-to-defeat-technological-filter-bubbles-that-skew-your-world/

Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

Keck, B. (2022, March 21). Op-ed: Our kids need media literacy like fish need water. CT News Junkie. Retrieved from https://ctnewsjunkie.com/2022/03/21/op-ed-our-kids-need-media-literacy-like-fish-need-water/

Paul, R., & Elder, L. (2004). The thinker's guide for conscientious citizens on how to detect media bias & propaganda. Retrieved from http://kurtlancaster.com/socfilm/week11/MediaBias2006-DC.pdf

Yee, A. (2022, January 31). The country inoculating against disinformation. BBC Future. Retrieved from https://www.bbc.com/future/article/20220128-the-country-inoculating-against-disinformation