Bullying, harassment, violent content, hate speech, pornography and other problematic content circulate on digital platforms. Why does this happen? Who should be responsible for stopping the spread of this content, and how?

“Social Media Landscape (redux)” by fredcavazza is licensed under CC BY-NC-SA 2.0.

Introduction

Private message bombing, human flesh search, slander, and uncontrollable cyberbullying: such incidents are illegal, and some are criminal. Digital technologies offer new affordances that enable public interaction and information sharing in the virtual realm. Online platforms were initially celebrated as vehicles of the “participatory society” (de Kloet et al., 2019, p. 249) and the “sharing economy” (John, 2018, p. 69), promising unprecedented levels of citizens’ political participation. At their height, social media are characterised by sociality and connectivity: users create and share their own content, and sharing is the most common way to participate in social media and online society (John, 2018, p. 77). Some information on the Internet spreads rapidly, in a “viral” manner, and bullying, harassment, violent content, hate speech, pornography and other problematic content fill the virtual digital space. On the one hand, digital platforms expand the potential for judicious exchange between individuals; on the other, they amplify prejudice, indifference, outrage and contempt, and false information can even cause social unrest. This blog uses several cases to show that online platforms are carriers of online violence and therefore play a key role in disputes over personality rights.

Why does problematic content circulate on digital platforms? Algorithmic bias and discrimination

“Algorithm = death” by johntrainor is licensed under CC BY 2.0.

In 1996, Batya Friedman and Helen Nissenbaum pointed out that human biases can be incorporated into machine algorithms in various ways. Technology is not neutral; it projects the inventor’s value biases, and the user’s position also plays an essential role (Friedman & Nissenbaum, 1996, p. 339). Digital platforms embed specific values into their architectures and algorithms for commercial or political purposes. When intelligent algorithms are abused in the production and dissemination of online content, they can manufacture trending events, shape online public opinion, and pose a challenge to national ideological security (de Kloet et al., 2019, p. 252). If the response is inadequate, the negative effects of online information and public opinion will impact public management and public policy, endanger social stability, and undermine the government’s credibility.
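The point that a seemingly neutral design choice can embed a value bias can be illustrated with a minimal, entirely hypothetical sketch: a feed that ranks posts purely by predicted engagement. The post data, scores, and the assumption that outrage content earns more engagement are all invented for illustration, not drawn from any real platform.

```python
# Hypothetical illustration: a feed ranked purely by predicted engagement.
# If outrage-provoking posts reliably earn more clicks, this "neutral"
# ranking rule systematically promotes them -- a value choice hidden
# in the design, even though the rule never mentions tone at all.

posts = [
    {"id": 1, "tone": "informative", "predicted_engagement": 0.31},
    {"id": 2, "tone": "outrage",     "predicted_engagement": 0.78},
    {"id": 3, "tone": "informative", "predicted_engagement": 0.25},
    {"id": 4, "tone": "outrage",     "predicted_engagement": 0.64},
]

def rank_feed(posts):
    """Sort posts by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

feed = rank_feed(posts)
print([p["tone"] for p in feed])
# -> ['outrage', 'outrage', 'informative', 'informative']
```

The top of the hypothetical feed is dominated by outrage content: the bias lives in the objective (engagement), not in any explicit rule about content.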

The need for digital platform regulation: personal data security and privacy protection

Digital platforms’ economic, governmental, and infrastructural extensions now penetrate the web and app ecosystems (Nieborg & Poell, 2018, p. 4276), and a rapidly proliferating body of critical academic and popular discourse points to the dangers, abuses, and risks this brings. Personal data security and privacy protection are a case in point. Uber collects large amounts of personal data and treats user information as a commodity; its “God View” system tracked customers and drivers in real time. In March 2018, it was reported that Cambridge Analytica, a British political consulting firm, had improperly accessed the data of 87 million Facebook users and used the data to help politicians run campaigns. Many digital platform companies have treated privacy with contempt, so data leaks occur from time to time. Large-scale data mining, global digital surveillance, and algorithmic analysis have left the general public effectively exposed in the digital world, while only a few powerful groups can obtain real privacy protection. Technology empowers actors in the network to express themselves freely, but it also amplifies differences and contradictions.

“Uber in Beijing” by bfishadow is licensed under CC BY 2.0.

Blocked communication and group polarization: harmful and anti-intellectual speech (with cases)

The truth is not a piece of plasticine that can be kneaded at will; the truth is not a blank sheet of paper that can be cut freely. (People’s Daily)

Liu Xuezhou, a Chinese boy who went searching for his birth parents, was hounded by netizens who did not know the truth after he found them. He finally chose to commit suicide by the seaside in Sanya, an incident that caused a sensation across China. Taiwan’s “most beautiful veterinarian” sounds like a beautiful, positive title, but it became a death sentence. Before she received the title, she was attacked by tens of thousands of self-styled dog lovers, and she eventually killed herself by taking drugs. Her name was Jian Zhicheng. After graduating from university she worked at an animal shelter, where the rules required that animals not adopted within ten days of intake be euthanized. Jian Zhicheng, who had originally wanted to save animals, now had to euthanize many of them; over two years she also found homes for more than 1,000 stray cats and dogs. When this was exposed online, countless so-called dog lovers united to attack and abuse her on the Internet. The mounting pressure finally overwhelmed the last line of defence in this young woman’s heart. After euthanizing more than 700 stray dogs, the newly married Jian Zhicheng administered the last dose to herself and bade farewell to this cruel world. There was nothing else she could do; in the end, she could only prove her innocence by dying.

American society has harboured anti-intellectual and anti-rational currents in both the traditional media era and the new media era, and research suggests the situation has worsened since the popularization of the Internet (Jacoby, 2008). Opportunities for deep dialogue are lost as everyone “speaks with their fingers”, weakening the capacity for reflective communication. Cultivating, maintaining, and sustaining civic virtues becomes more complicated, and this poses a severe obstacle to rational communication, the courage to reveal the truth, and civil, rational dialogue.

Systematic governance of digital platforms

“Fake News – Computer Screen Reading Fake News” by mikemacmarketing is licensed under CC BY 2.0.

According to Kate Wright, accuracy, sincerity, and care are ethical norms of media practice and virtues that need to be cultivated (Wright, 2014).

Governments around the world attach great importance to the governance of online information content. Germany has passed legislation against fake news and hate speech on social media, requiring media companies to remove such content or face heavy fines; the United States has launched a net-cleaning programme to prevent citizens from being influenced by a polluted information environment. Article 1197 of the Chinese Civil Code states that “If an online platform knows or should know that a user has infringed upon the civil rights and interests of others by using its network services, and fails to take necessary measures, it shall be jointly and severally liable with the infringing user.”

As for platforms’ administrative responsibility in disputes over online personality rights, an online platform must not disseminate illegal information that insults or attacks others or encroaches upon their reputation, privacy, and other legitimate rights and interests; it is expected to issue warnings, order rectification, and restrict account functions. If these responsibilities are not fulfilled, the network information authority will interview and warn the platform; if it refuses to make corrections, or the circumstances are serious, it will be ordered to suspend business and information updates for rectification, up to having its business licence revoked and its website closed.

The Liu Xuezhou incident reminds us that online platforms should not deal with cyber violence passively and sluggishly, but should take preventive measures and intervene while incidents unfold. Digital platforms should improve community conventions and remind users of behavioural norms. For example, new users of the Bilibili platform must pass a 100-question test to register as regular members, and the questions mainly cover community norms and bullet-comment etiquette.
Digital platforms also need to strengthen machine learning systems that identify and filter illegal and infringing information, such as Douyin’s risk reminders for comments and private messages and its filtering of undesirable content. If the system recognises that a comment is unfriendly, a pop-up window reminds the user to re-edit it.
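The pre-posting reminder described above can be sketched at its very simplest as a pattern check before a comment is published. This is a toy illustration only: the word list, the function name `needs_reedit_prompt`, and the keyword-matching approach are all assumptions for the sake of example; real systems like Douyin’s rely on trained classifiers, not fixed lists.

```python
import re

# Hypothetical flag list for illustration; production systems use
# machine-learned toxicity classifiers rather than fixed keywords.
FLAGGED_PATTERNS = [r"\bidiot\b", r"\bkill yourself\b", r"\bloser\b"]

def needs_reedit_prompt(comment: str) -> bool:
    """Return True if the comment should trigger a 're-edit' pop-up reminder."""
    lowered = comment.lower()
    return any(re.search(pattern, lowered) for pattern in FLAGGED_PATTERNS)

print(needs_reedit_prompt("You are such an idiot"))   # True -> show the pop-up
print(needs_reedit_prompt("Thanks for sharing this")) # False -> post normally
```

Note that this sketch only gates the moment of posting; the platform responsibilities discussed above (takedown, account restrictions) operate after publication and require human review as well.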

Conclusion

Digital technology has strengthened social connections, opening more channels for communication between people; the fusion of virtual platforms with the real world has formed a networked society. With great power comes great responsibility. While online platforms profit from providing information content services, they also bear primary responsibility for managing that content, and they should help create a positive, law-abiding cyberspace.

References

de Kloet, J., Poell, T., Guohua, Z., & Yiu Fai, C. (2019). The platformization of Chinese society: Infrastructure, governance, and practice. Chinese Journal of Communication, 12(3), 249–256.

John, N. A. (2018). Sharing economies. In The age of sharing (pp. 69–97). Cambridge: Polity.

Nieborg, D. B., & Poell, T. (2018). The platformization of cultural production: Theorizing the contingent cultural commodity. New Media & Society, 20(11), 4275–4292. https://doi.org/10.1177/1461444818769694

Wired. (2018). Facebook exposed 87 million users to Cambridge Analytica. Retrieved 13 October 2022, from https://www.wired.com/story/facebook-exposed-87-million-users-to-cambridge-analytica/

Kimmons, R. (2022). Uber collects a lot of information: What you should know. Your Online Choices. Retrieved 13 October 2022, from https://youronlinechoices.com.au/uber-collects-a-lot-of-information-what-you-should-know/

Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems, 14(3), 330–347. https://doi.org/10.1145/230538.230561

Wright, K. (2014). Should journalists be ‘virtuous’? Mainstream news, complex media organisations and the work of Nick Couldry. Journalism, 15(3), 364–381. https://doi.org/10.1177/1464884913483078

Jacoby, S. (2008). The age of American unreason in a culture of lies. Penguin Random House. Retrieved from https://www.penguinrandomhouse.com/books/86167/the-age-of-american-unreason-in-a-culture-of-lies-by-susan-jacoby/

People’s Daily, quoted on Zhihu. (2022). Retrieved from https://www.zhihu.com/question/476070864/answer/2625344649

The souls of the dead under “Internet violence”: Netizens from all walks of life show mercy. (n.d.). Baijiahao. Retrieved from https://baijiahao.baidu.com/s?id=1732169192559762845&wfr=spider&for=pc