The author: Kailin Peng
ARIN2610
——
Introduction:
In the era of digitalization, the Internet has become an indispensable part of people's lives, whether for talking and communicating, sharing and making friends, or even following strangers thousands of kilometres away whom we would otherwise never meet. It has shortened our physical distance from the outside world, yet the act of striking a keyboard has opened a distance between hearts and minds and has even become a channel for verbal abuse. In an age when typing carries no sense of responsibility, the Internet has become a place where abusers can get away with it. In this paper, I will examine three key actors who can stop the spread of harmful content on digital platforms: the individual, the platform, and the government. I will also look at the main measures that digital platforms and governments can take to prevent such problems.
Image by Moondance from Pixabay, licensed under CC BY-SA 2.0
What can INDIVIDUALS do to stop the spread of malicious content?
When individuals experience cyberbullying, they need the courage to share it with their families, parents, and other trusted people, and to keep evidence of it. According to UNICEF, anyone can be a victim of cyberbullying. When verbal attacks wound a person mentally and emotionally, the bullying can feel like a trap with no way out, and it takes perseverance for the individual to fight for his or her legitimate rights. On the other hand, this kind of vicious harassment, violence, or circulation of pornographic material often stems from the perpetrators themselves, whose different life experiences and family backgrounds have shaped their personalities and speech habits. An individual can interrupt the path of transmission by blocking or blacklisting the people or accounts that pose a threat. In the same vein, Dagmar Schumacher states, “We must work with each individual to change their behavior, change social norms, and ensure that it is no longer accepted” (Ukteam, 2021). Individual awareness and norms are therefore the first line of defence against malicious content: the spread of malice can be checked by calling on bullies to stop their behaviour, and even more so by the bullied refusing to be afraid.
What responsibilities do DIGITAL PLATFORMS need to take?
— A bloodless but endless slaughter
If online violence is a beast that drowns out reason and swallows public order, then online platforms should be a strong dam guarding our spiritual home. In the European Union, it is reported that 1 in 10 women aged 15 and above has experienced online harassment. Online violence is characterised by its low cost, its high impact, and the difficulty victims face in defending their rights. Since 2006, a number of Korean stars have died by suicide under online pressure, including Choi Jin, who died by suicide at home, and later Sulli in 2019, tearing the mask off cyber violence in front of the public. In the era of electronic media, the significance of the platform is exposed at such moments: when victims' physical and mental health is at risk, users on the other side of the network can report the offending accounts in bulk, and if the platform's review process confirms violations, it can record or block them, helping victims defend their rights in the meantime. Platforms can also implement internal review and audit standards under which negative, offensive, and insulting text is deleted. The root cause lies behind the virtual nature of the Internet: anonymity without disclosed identity has become a mask for people with malicious intent. Platforms can apply a degree of automated review first, followed by manual review, to screen out harmful user-generated content (Grimes-Viort, 2010).
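To make the idea of combining automated and manual review more concrete, the short Python sketch below is a hypothetical illustration only: the blocklist, data structures, and function names are invented, and a real platform would rely on far more sophisticated classifiers. Each post passes through an automated filter first, and only flagged posts are escalated to a human moderation queue.

```python
# Hypothetical sketch of a two-stage (automated + manual) moderation pipeline.
# The keyword list, thresholds, and data structures are illustrative only.

from dataclasses import dataclass, field
from typing import List

ABUSIVE_TERMS = {"idiot", "kill yourself", "worthless"}  # placeholder blocklist

@dataclass
class Post:
    author: str
    text: str
    flagged_reasons: List[str] = field(default_factory=list)

def automated_review(post: Post) -> bool:
    """Return True if the post looks harmful and needs human review."""
    lowered = post.text.lower()
    for term in ABUSIVE_TERMS:
        if term in lowered:
            post.flagged_reasons.append(f"matched blocklisted term: {term!r}")
    return bool(post.flagged_reasons)

def publish(post: Post) -> None:
    print(f"Published post by {post.author}")

def moderate(posts: List[Post]) -> None:
    manual_queue: List[Post] = []
    for post in posts:
        if automated_review(post):
            manual_queue.append(post)   # escalate to human moderators
        else:
            publish(post)               # nothing found, publish normally
    for post in manual_queue:
        print(f"Needs manual review ({post.author}): {post.flagged_reasons}")

if __name__ == "__main__":
    moderate([
        Post("alice", "Great video, thanks for sharing!"),
        Post("troll42", "You are an idiot and worthless."),
    ])
```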
In another case, in Cleveland, Ohio, a man named Steve Stephens used a social media network to share a video of the fatal shooting of a 74-year-old man who was out walking on Easter Sunday, and it took Facebook nearly three hours to delete the content. Cybersecurity expert Zohar Pinhasi stated that the platform had the technical means to prevent such videos from spreading and could have blocked the images or footage accordingly, but it did not do so (Figueredo, 2017). The slow response to this violent material and its widespread dissemination reflect an oversight in platform governance that effectively helped users spread undesirable content. Algorithms, too, belong to the level of platform governance: platforms can regulate and anticipate risks well in advance. Harmful speech or video content can be intercepted before it is first published, with a risk warning shown to the poster; if the user insists on publishing, the platform can apply restrictions such as credit deductions or making the post visible only to its author, keeping the online environment cleaner.
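As a complement to the review pipeline above, the hypothetical sketch below illustrates such a pre-publication gate: a toy risk score (a stand-in for whatever model a real platform would actually use) decides whether a post is published, blocked outright, or, if the poster insists after a risk reminder, restricted to being visible only to the author. All names, word lists, and thresholds are invented for illustration.

```python
# Hypothetical sketch of a pre-publication gate that warns the poster and,
# if they insist on publishing, restricts visibility to the author only.
# The risk scoring here is a toy stand-in for a real platform's model.

from enum import Enum

class Visibility(Enum):
    PUBLIC = "public"
    AUTHOR_ONLY = "author_only"
    BLOCKED = "blocked"

def risk_score(text: str) -> float:
    """Toy risk estimate: fraction of words drawn from an illustrative list."""
    risky = {"die", "hate", "ugly"}
    words = text.lower().split()
    return sum(w in risky for w in words) / max(len(words), 1)

def pre_publication_gate(text: str, user_insists: bool = False) -> Visibility:
    score = risk_score(text)
    if score >= 0.5:
        return Visibility.BLOCKED          # intercept before first release
    if score > 0.0:
        print("Risk reminder: this post may violate community guidelines.")
        # Held back unless the user insists; insisting limits it to author-only view.
        return Visibility.AUTHOR_ONLY if user_insists else Visibility.BLOCKED
    return Visibility.PUBLIC

if __name__ == "__main__":
    print(pre_publication_gate("Have a nice day everyone"))
    print(pre_publication_gate("I hate you, you are ugly", user_insists=True))
```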
“I couldn’t believe what I saw”: What Happened in the Nth Room? From THE KOREA TIMES
Why is it the GOVERNMENT’s responsibility to regulate negative content online?
It is appropriate and necessary for government representatives to regulate various industries and how they use data. The “Nth Room” refers to a large number of private chat rooms (n-rooms) created on the social networking platform Telegram, in which victims, including women, minors, and even infants, were threatened and illegally obtained videos and photos of sexual exploitation were uploaded. The suspects posted the coerced sexual exploitation videos for members to watch for a fee while chatting, and membership reached as many as 260,000.
After the Nth Room incident, the South Korean parliament passed a series of laws to prevent the recurrence of such sexual exploitation crimes. These included increasing prison terms and fines for various sexual crimes and revising the rules so that possessing, purchasing, storing, or viewing illegally filmed sexual content carries prison sentences of up to three years or fines of up to 30 million won (Kim, 2022). The root cause of the problem was that Korea did not, at the time, have sufficient regulatory power to police and block such activity. Nor is the situation limited to Korea: there are major problems with the management and operation of the digital society in the crypto space, in digital currency, and in abusive and even genocidal videos, and there is a lack of transparency in the standards platforms use when reviewing harmful content to stop its dissemination (Brown, 2021). Government intervention would therefore make platform content more transparent, so that when a situation like the Nth Room occurs, it can be uncovered in time and wider proliferation avoided.
Image by Chetraruc from Pixabay, licensed under CC BY-SA 2.0
What are the main measures to prevent such problems?
First, there is the herd effect. As the saying goes, “When an avalanche occurs, no snowflake is innocent.” When a whole group of people targets an issue, they tend to echo one another, waving the banner of justice, lashing out at others, and diluting the blame among themselves. In the case of Sulli's suicide, everyone condemned in the same breath the keyboard warriors who had insulted and bullied her, yet how many of those same people had helped push her into the abyss?
Platform governance needs to establish social conventions and clear rules before incidents occur, and, through increased education and media outreach, make everyone aware that people deserve respect whether online or in reality. Bilibili (a Chinese video website), for example, requires new users to pass an answer-based verification quiz so that they understand the basic rules before fully entering the platform, as sketched below.
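As a rough illustration of such an entry quiz, the hypothetical sketch below (the questions, pass mark, and function names are all invented and are not Bilibili's actual system) grants posting rights only once enough community-rules questions are answered correctly.

```python
# Hypothetical sketch of an entry quiz gate: a user must answer enough
# community-rules questions correctly before posting rights are granted.
# Questions, pass mark, and names are illustrative only.

QUIZ = [
    ("Is it acceptable to post someone's home address without consent?", False),
    ("Should you report content that threatens another user?", True),
    ("Is harassment allowed if the target is a public figure?", False),
]
PASS_MARK = 2  # minimum number of correct answers

def grade_quiz(answers: list[bool]) -> bool:
    """Return True if the user answered enough questions correctly."""
    correct = sum(
        given == expected
        for (_, expected), given in zip(QUIZ, answers)
    )
    return correct >= PASS_MARK

def grant_posting_rights(user: str, answers: list[bool]) -> str:
    if grade_quiz(answers):
        return f"{user}: posting rights granted"
    return f"{user}: please review the community rules and retake the quiz"

if __name__ == "__main__":
    print(grant_posting_rights("new_user", [False, True, False]))   # passes
    print(grant_posting_rights("hasty_user", [True, False, False])) # fails
```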
Secondly, governments can introduce various legal measures for regulation. Barbé commented on the systemic nature of the issue: “We are waiting for legislation on hate speech around the platform, and Europe is at the forefront of it.” Likewise, the United Nations Economic and Social Commission for Western Asia (ESCWA) has called for the establishment of an investigation team and reporting mechanism dedicated to gender-based cybercrime (Figueredo, 2017). As mentioned at the beginning, the “age of typing without responsibility” exists precisely because online discipline has not yet been implemented deeply or widely enough, leaving people subconsciously unaware that their speech can not only harm others but even cross the boundaries of the law.
Conclusion
A single measure is bound to be insufficient; platform controls and legal regulation need to be combined, with IP addresses investigated through proper channels to identify the real individuals behind the online mask, and with punishments effectively carried out against those responsible, so that virtuality no longer serves as an umbrella for evil but as a tool of the judicial process. As for individuals, we should look at events with discernment and avoid becoming part of the mob. There is no absolute freedom in the world, and when an act is crowned with the hat of “freedom of speech”, the rationality and limits behind it deserve our deep reflection. All incidents of online violence, bullying, harassment, hatred, and pornography leave a record; what matters is that we do not forgo the protection of the law, and that we do not become the executioners of violence.
Reference list:
A comprehensive guide to bilibili account setup and verification – Get on the YouTube of China. (2020, June 21). Nanjing Marketing Group. https://nanjingmarketinggroup.com/blog/bilibili-setup-verification-guide
Chetraruc. (2020). Cyber bullying [Image]. Pixabay. https://pixabay.com/illustrations/online-troll-cyber-bullying-hater-5268149/
Cyberbullying: What is it and how to stop it. (n.d.). UNICEF Europe and Central Asia. https://www.unicef.org/eca/cyberbullying-what-it-and-how-stop-it
Figueredo, L. (2017, April 17). Do social platforms bear responsibility for violence posted on their platforms? WPEC. https://cbs12.com/news/local/do-social-platforms-bear-responsibility-for-violence-posted-on-their-platforms
Grimes-Viort, B. (2010). 6 types of content moderation you need to know about. Social Media Today. https://www.socialmediatoday.com/content/6-types-content-moderation-you-need-know-about
Kim, R. (2022, May 27). Everything to know about the nth room case in ‘cyber hell.’ Netflix Tudum. https://www.netflix.com/tudum/articles/everything-to-know-about-the-nth-room-case-in-cyber-hell
Moondance. (2021). Cyberbullying [Image]. Pixabay. https://pixabay.com/illustrations/cyberbullying-internet-computer-6168626/
The Korea Times 코리아타임스. (2020). “I couldn’t believe what I saw”: What happened in the Nth Room? [Video]. YouTube. https://www.youtube.com/watch?v=sJ0KdFgxk94
Ukteam. (2021, December 2). Cyber-Violence: A gendered threat. United Nations Western Europe. https://unric.org/en/cyber-violence-a-gendered-threat/