The relationship between digital platforms and online bullying: a neutral medium still needs intervention from users and government

Bullying, harassment, violent content, hate speech, pornography and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

"cyberbullying 1" by paul.klintworth is licensed under CC BY-NC 2.0.

Introduction: 

“50/52 : La censure sur Internet – Censure on Internet” by Eric Constantineau – www.ericconstantineau.com is licensed under CC BY-NC 2.0.

Social media can spread awareness, support communication and advertising, foster social inclusion, and magnify social support and relationships (Jones et al., 2021). Despite these positive attributes, however, platforms have proved to be common venues for online bullying (Jones et al., 2021). Compared with traditional bullying, picture/video and phone-based bullying have a greater impact on victims (Dredge et al., 2014). The power imbalance between victim and perpetrator has led to a serious underestimation of the harm cyberbullying does to people’s physical and mental health. Most perpetrators treat it as an extension of traditional bullying into the virtual world without clearly recognising that it causes the same harm (Balaji et al., 2021). The viral spread of bullying and harassment content on digital platforms means that platforms, users and governments must all pay attention and clarify their respective roles in order to maintain good order in cyberspace.

 

"facebook" by pshab is licensed under CC BY-NC 2.0.

System governance and effective supervision of digital platforms

Most cyberbullying occurs on social media platforms, so effective governance of, and supervision under, relevant regulations on digital platforms is of great significance. In the regulatory era of the late 1990s, people rejected public ordering and government control and supervision and favoured market-based self-regulation (Hofmann et al., 2016). Taking Facebook as a representative digital platform giant, this Californian ideology has infiltrated its community standards: its core concept is a liberal individualism enabled by technology in a context of free markets and minimal state participation, while its stance on information ownership and the open platform reflects market liberalism (Siapera & Viejo-Otero, 2021).

The open platform is at the same time hard to control, and the suppression of false and misleading information and speech is still searching for better methods. For example, the Australian Code of Practice on Disinformation and Misinformation aims to give the public, platforms and government diverse ways to strengthen the technologies used to combat false information. According to Meta’s transparency report, in 2021 more than 190 million pieces of content on Facebook were assessed by third-party fact-checking partners as false, partially false, altered, or missing context, and more than 3,000 accounts and groups were deleted for repeatedly violating the rules against spreading COVID-19 and vaccine misinformation (Barrett, 2022).

Although platforms’ review systems have gradually shifted toward algorithmic auditing, human moderators remain a crucial factor in content supervision. In 2016, the European Commission signed a voluntary code of conduct with the major digital platforms, and implementing existing legislation through digital platforms directly increased the number of human moderators reviewing content. About 100,000 people around the world are employed as content moderators, most of them outsourced workers without stable contracts (Siapera & Viejo-Otero, 2021). This human layer makes the review hierarchy more humane and more reasonable. For content such as malicious speech and misinformation, moderators and automated systems delete or downrank the offending material; correspondingly, the system further socialises users by increasing the visibility of other content (Siapera & Viejo-Otero, 2021).
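The delete-or-downrank versus boost-visibility logic described above can be pictured with a small, purely hypothetical Python sketch. It is not Facebook’s or any platform’s actual system; the labels, keyword rules and function names (automated_screen, human_review, moderate) are assumptions made only to illustrate how automated flagging, human confirmation and visibility adjustment might fit together.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical labels loosely modelled on the categories mentioned above
# (false, partially false, altered, missing context) plus harassment.
# Illustrative sketch only, not any platform's real moderation pipeline.
VIOLATING_LABELS = {"false", "partially_false", "altered", "missing_context", "harassment"}


@dataclass
class Post:
    post_id: str
    text: str
    visibility: float = 1.0   # relative ranking weight in a feed
    removed: bool = False


def automated_screen(post: Post) -> Optional[str]:
    """Crude stand-in for an algorithmic classifier: returns a suspected
    label, or None if nothing is flagged (keyword rules for illustration)."""
    lowered = post.text.lower()
    if "miracle cure" in lowered:
        return "false"
    if "you are worthless" in lowered:
        return "harassment"
    return None


def human_review(post: Post, suspected_label: str) -> Optional[str]:
    """Placeholder for the manual moderator / third-party fact-checker step;
    in this sketch the automated flag is simply confirmed."""
    return suspected_label


def moderate(feed: List[Post]) -> None:
    """Delete or downrank violating content and boost the rest, mirroring
    the delete/downrank vs. increased-visibility split described above."""
    for post in feed:
        suspected = automated_screen(post)
        verdict = human_review(post, suspected) if suspected else None
        if verdict in VIOLATING_LABELS:
            if verdict == "harassment":
                post.removed = True        # remove clear abuse outright
            else:
                post.visibility *= 0.1     # downrank misinformation
        else:
            post.visibility *= 1.05        # raise visibility of other content


if __name__ == "__main__":
    feed = [
        Post("1", "This miracle cure beats any vaccine!"),
        Post("2", "You are worthless and everyone knows it."),
        Post("3", "Community fundraiser this weekend."),
    ]
    moderate(feed)
    for p in feed:
        status = "removed" if p.removed else f"visibility={p.visibility:.2f}"
        print(p.post_id, status)
```

Even this toy version shows why the human layer matters: the automated step only guesses a label, and it is the review step that decides whether content is deleted, downranked or left to circulate more widely.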

 

“Online harassment lit review” by Wikimedia Commons is licensed under CC BY 2.0.

Public users should take conscious responsibility for their online speech

Examples of online bullying include low-quality, emotionally charged offensive texts, abusive social media posts, and pornographic or threatening direct messages. Even when speaking anonymously on the Internet, the public should respect other users of cyberspace and the rules of the platform (Balaji et al., 2021). Social media platforms enable users to communicate with their extended social networks in new ways and provide opportunities to meet strangers with similar interests or in close proximity, but they are also used as tools to harass and insult other users (Dredge et al., 2014). Many cyberbullies show a dismissive attitude towards their victims, including labelling them “stupid, sad and boring” (Dredge et al., 2014). Only when users realise that the people they slander, insult and degrade are not data points but living human beings can they show basic respect for one another. Nevertheless, this awareness is easily lost in toxic Internet communication. A survey of trends on digital platforms shows that victims online are increasingly hurt, while bullies are almost unaffected (Jaidka et al., 2021).

Cyberbullying can also refer to the mental instability caused, directly or indirectly, by the experience of online harassment, which reflects the terrible consequences a victim may face and exposes the cycle of victimisation. Common targets of bullying are usually those considered different in appearance or from social norms (Jones et al., 2021). Emily Atack, who became famous early in her career under a sex-appeal label, has developed a resigned acceptance of online sexual harassment. She said she was used to receiving hundreds of sexually harassing private messages within a day, including but not limited to language attacking her character, explicit photos of genitals, and provocative obscenities. She admitted that if the same thing happened in person, she would find such direct audio-visual assault very hard to accept. Whether this kind of harassment is diluted online, however, still depends on users themselves, because such sexual harassment is disrespectful to other users and violates community standards.

 

"government" by Mike Lawrence is licensed under CC BY 2.0

Governments should issue laws and regulations to protect the rights and interests of victims

In the Internet era, the laws of most countries have not been fully updated and amended to keep pace with the times, but witnessing the tragedies that cyberbullying inflicts on real people in the physical world has led most governments to consciously begin creating legal protections for online space. Victims suffer depression, low self-esteem, anxiety and other negative emotions as a result of cyberbullying, and sustained cyberbullying may also drive victims to harm themselves or others. Nana, a Hong Kong girl, jumped from a building after cross-border cyberbullying (Anonym, 2022b). She had been abused through anonymous platforms by a group of online users from mainland China who called themselves “toilet girls”. Jiang Yuhuan, a member of the Hong Kong Legislative Council, said that because Hong Kong has no law to deal with or regulate cyberbullying, the Hong Kong police could not request mutual legal assistance, such as investigation and evidence collection, from the mainland. She suggested that the Hong Kong government should first legislate against online bullying, so as to protect users like Nana who do not have the mental strength to bear malicious remarks on the Internet. She also advocated that the platforms and companies providing telecommunications services should monitor chat records in accordance with the law, to prevent the Internet from becoming a medium for bullying minors and the mentally incapacitated.

Australia offers another example of amending laws on online safety. Under the Enhancing Online Safety Act 2015, eSafety launched an online complaints service for Australians who experience cyberbullying; under the Act, legally binding notices can be issued and civil penalties imposed on social media platforms for non-compliance. Nevertheless, this earlier law still has limitations for acts that cause specific harms. In 2022, the Australian government drafted the Social Media (Anti-Trolling) Bill to combat abuse and bullying on social media, which triggered debate. Under the bill, a social media platform would need to establish a complaints scheme that can, with consent, disclose a poster’s contact details in order to avoid liability for defamation. Its disadvantages, however, lie in the prohibitive cost of bringing a defamation lawsuit and the narrowness of the rights the bill grants. There have been instances of migrant women reporting online abuse material to Facebook, only for the platform to refuse to delete it because it was not in English (Kwan, 2022). In fact, more countries are trying to formulate and improve laws and regulations, suited to their national conditions, to address the spread of harmful speech on social media. Taking a comparative perspective, research on platform governance can further explore the extraterritorial consequences of regulating such speech (Ahn et al., 2022).

 

Conclusion:

Figure 1.
“Future of the Internet” by Free Press/ Free Press Action Fund is licensed under CC BY-NC 2.0.

The uncontrollability of digital platforms as social media cannot be ignored. In content governance and review, platforms need to understand the supervisory responsibilities they must proactively undertake and curb the spread of false and misleading information in a timely, effective way. Public users need to raise the quality of their online interactions and reduce malicious behaviour at its root, so that digital platforms can play a positive role as media. Governments need to improve online safety regulations so that Internet users are subject to the same judicial control as in the physical world and digital platforms do not become lawless spaces.

References

Ahn, S., Baik, J. (Sophia), & Krause, C. S. (2022). Splintering and centralizing platform governance: How Facebook adapted its content moderation practices to the political and legal contexts in the United States, Germany, and South Korea. Information, Communication & Society, 1–20. https://doi.org/10.1080/1369118x.2022.2113817 

Anonym. (2022b, August 11). 18-year-old woman commits suicide in Tin Shui Wai. The Limited Times. https://newsrnd.com/news/2022-08-11-18-year-old-woman-commits-suicide-in-tin-shui-wai-%7C-suspected-by-cross-border-internet-abuse-jiang-yuhuan–the-government-should-legislate-against-cyberbullying.SkZqOlMfR5.html 

Balaji, N., Karthik Pai, B. H., Manjunath, K., Venkatesh, B., Bhavatarini, N., & Sreenidhi, B. K. (2021). Cyberbullying in Online/E-Learning Platforms Based on Social Networks. In Intelligent Sustainable Systems (pp. 227–240). Springer Nature Singapore. http://dx.doi.org/10.1007/978-981-16-6369-7_20 

Barrett, A. (2022, June 1). What are digital platforms doing to tackle misinformation and disinformation? Croakey Health Media. https://www.croakey.org/what-are-digital-platforms-doing-to-tackle-misinformation-and-disinformation/ 

Cyberbullying & hate speech laws in Australia. (2021, April 12). The Lawyers & Jurists. https://www.lawyersnjurists.com/article/cyberbullying-hate-speech-laws-in-australia/

Dredge, R., Gleeson, J., & de la Piedad Garcia, X. (2014). Cyberbullying in social networking sites: An adolescent victim’s perspective. Computers in Human Behavior, 36, 13–20. https://doi.org/10.1016/j.chb.2014.03.026 

Facebook community standards. (n.d.). Transparency Center. Retrieved October 13, 2022, from https://transparency.fb.com/en-gb/policies/community-standards/ 

Hofmann, J., Katzenbach, C., & Gollatz, K. (2016). Between coordination and regulation: Finding the governance in Internet governance. New Media & Society, 19(9), 1406–1423. https://doi.org/10.1177/1461444816639975 

Jones, A., Plumb, A. M., & Sandage, M. J. (2021). Social media as a platform for cyberbullying of individuals with craniofacial anomalies: A preliminary survey. Language, Speech, and Hearing Services in Schools, 52(3), 840–855. https://doi.org/10.1044/2021_lshss-20-00159 

Kwan, C. (2022, March 10). Australia’s anti-trolling Bill blasted by senators, online abuse victims and organisations alike. ZDNET. https://www.zdnet.com/article/australias-anti-trolling-bill-blasted-by-senators-online-abuse-victims-and-organisations-alike/ 

Siapera, E., & Viejo-Otero, P. (2021). Governing hate: Facebook and digital racism. Television & New Media, 22(2), 112–130. https://doi.org/10.1177/1527476420982232

Social media (anti-trolling) bill. (n.d.). Attorney-General’s Department. Retrieved October 14, 2022, from https://www.ag.gov.au/legal-system/social-media-anti-trolling-bill 

Transparency. (n.d.). DIGI. Retrieved October 13, 2022, from https://digi.org.au/disinformation-code/transparency/

UNILAD. (2021). Emily Atack shares her experiences of sexual harassment online | Minutes With | UNILAD [Video]. YouTube. https://www.youtube.com/watch?v=MvQKoYuXuV0
