Bullying, harassment, violent content, hate, pornography, and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

Introduction:

The Internet has become ubiquitous. People shop, share their lives, fill their leisure time, and earn money online. Alongside this efficiency and convenience, however, serious problems have emerged: bullying, harassment, violent content, pornography, and other problematic material spread across digital platforms, degrading both the online environment and the user experience. Although online platforms emphasize freedom of expression, I do not believe that spreading such problematic content is an exercise of that freedom. Expressing opinions requires a sense of boundaries; it cannot infringe on the legal rights of others, let alone violate laws and regulations. Problematic content such as online violence therefore needs to be blocked from spreading. This article discusses how Internet regulators and major platforms should take responsibility for stopping the spread of online violence and related problems, and how such problems can be prevented in the first place.

“Public Media” by Free Press Pics is licensed under CC BY-NC-SA 2.0.

Whose responsibility?

Online platforms give people opportunities to communicate and share their lives. Users can make friends, document their daily routines, or earn money through videos and live streaming, and the Internet has created new professions such as bloggers and live-stream sellers. But as more people come online, some use these platforms to spread pornography or commit cyber violence against others.

Take the spread of pornographic material as an example. Because the Internet is open, unchecked proliferation of pornographic and obscene material means anyone can obtain it with little effort. Minors are especially vulnerable: they are more easily induced by such material into unrealistic fantasies about sex, and indulging in it can disrupt their normal studies and lives, with serious consequences if it is not handled properly. To gain attention, some people record or photograph real-world crimes such as prostitution, rape, forced indecent assault, and public disorder. The child pornography industry in particular involves crimes including child rape, child molestation, child trafficking, the purchase of trafficked children, and forced prostitution. As these videos and this material attract more viewers, distributors can profit from them, by selling the videos, forcing others to make them, or even trafficking in people. If such material is not dealt with promptly, teenagers, who are still immature and lack sound judgment, can easily be misled by it or drawn into crime themselves.

One example is the ‘Nth Room’ incident in South Korea.

In the ‘Nth Room’ case, multiple secret chat rooms were set up through social media platforms, in which coerced women (including minors) were targeted and their videos and photos were shared illegally. The victims were all women, including minors and infants. Access to these chat rooms required a membership fee, and only those who had paid could watch the videos.

Internet platforms are created by companies, and Internet regulators are agencies established to oversee these new Internet products. A company that creates an Internet platform holds a dual identity: provider of Internet software and manager of Internet information content (Flew et al., 2019). If a regulator or a platform’s developer knows that users are producing and spreading pornography but takes no responsibility for managing it, I think this can be interpreted as complicity. Regulators therefore need to strengthen their oversight of major platforms. Responsibility for management falls on both regulators and the platforms themselves: some platforms may even be set up specifically to promote problematic material or videos, in which case regulators must investigate and stop its dissemination. For major platforms, managing their own space is an obligation they should assume, and the more users a platform has, the more attention it should pay to these issues. If users are constantly attacked on a platform, or constantly exposed to intimidation and pornographic videos, their online experience suffers and the platform is likely to lose many of them; that loss is the platform’s own, and so is the responsibility (Gillespie, 2018).


How to do it?

People often commit online violence or spread harmful material because they can remain anonymous (Langvardt, 2018). Anonymous users may feel they bear no responsibility for whatever they do online, since nothing they present there is their real self; many therefore do things on the Internet that they would never dare to do in real life, believing no one knows who they are. One response is for platforms to require real-name registration of new users, which would let a platform know a person’s true identity when managing its community. If someone spreads abusive comments under another person’s videos and drives that person to suicide, the platform could then help the police investigate and hold the abuser accountable based on their identity. However, this raises the problem of user privacy (Edwards, 2009): if a platform holds all information about all its users, the users themselves are not safe, and some people may register accounts using others’ identities. Platforms must also provide a youth mode. Minors should be required to use it, so that pornographic or violent videos are filtered out and more positive material is recommended to them. Screening harmful content before it spreads is the responsibility of both Internet regulators and major platforms, and doing this well would greatly reduce the spread of harmful material. Finally, I think penalties for those who spread harmful material should also be increased, for example, suspending or banning the accounts of those who distribute pornography or commit cyber violence.

 

Conclusion:

The Internet has become an essential part of people’s lives, and those lives are increasingly lived online. Internet regulators and major online platforms share the responsibility for stopping the spread of online violence and material such as pornography and intimidation. Platforms can protect their communities by screening new users, while regulators need to strengthen oversight and impose appropriate penalties on both platforms and individual accounts that spread harmful material. Although people have the right to free speech on the Internet, inappropriate speech can become cyber violence; and although people have the right to privacy, some who violate that right treat others’ privacy as a commodity to be traded. Maintaining the online environment is therefore an important responsibility of platforms and regulators alike. Users, too, must be constrained to follow the rules of the network; without such restraint, anyone may become a victim.

Reference list:

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50.

 

Gillespie, T. (2018). Regulation of and by platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254–278). London: SAGE.

 

Edwards, L. (2009). Pornography, censorship and the Internet. In L. Edwards & C. Waelde (Eds.), Law and the Internet (3rd ed., pp. 623–670). Oxford: Hart Publishing.

 

Langvardt, K. (2018). Regulating online content moderation. The Georgetown Law Journal, 106, 1353–1388.

 

The Korea Times. (2020). ‘I couldn’t believe what I saw’: What happened in the Nth Room? [Video]. YouTube. Available at: <https://www.youtube.com/watch?v=sJ0KdFgxk94> [Accessed 13 October 2022].