
“Stop the Violence” by Rosaura Ochoa is licensed under CC BY 2.0
Introduction
With the continuous innovation of media technology, information dissemination has become rapid, extensive, interactive, and cross-border. The Internet connects content from every field around the world, and the general public can use the convenience of social platforms to communicate with one another or engage in political discussion. However, the virtuality, anonymity, and freedom of the Internet have also given rise to social problems such as pornography, violent content, hate speech, and bullying. While the benefits of the Internet far outweigh its negative aspects, those aspects cannot be ignored (Commission of the European Communities 1996). They also show that the governance system for illegal content on digital platforms is not yet comprehensive: Internet companies and governments should prevent the spread of such content, and users themselves should engage in self-reflection.
Increased scrutiny of illegal content on platforms by internet companies
Internet companies should strengthen the scrutiny of platform content and reflect on their own practices. Some have argued for much greater policing of content online, and companies like Facebook, YouTube, and Twitter have talked about hiring thousands of staff for their moderation teams (Glaser 2018). Simonite (2018) notes that these companies have also invested in moderation systems, using artificial intelligence to detect controversial content at an early stage. A few decades ago the Internet was limited: people saw only simple text and pictures on online platforms, and few commercial actors or extremists exploited social platforms to profit from ordinary users or to harm others. In today's era of self-media, everyone is a news publisher. However, some information is now disseminated purely to gain attention, regardless of the quality of the content or the feelings of the audience, spreading negative and even harmful material across social platforms. With the support of platform big data and lax auditing by Internet companies, such negative and illegal content spreads quickly, and some users are forced to consume it. The accounts that produce and disseminate such content violate relevant laws and regulations and disrupt public order, endangering the ecological health of the platform and the interests of the majority of its users. Therefore, Internet companies must strengthen the censorship and management of platform content and accounts. In addition, they need to rethink the functions and algorithms of their platforms.
Sylvain describes how Facebook until recently allowed advertisers to target users based on algorithmically identified “interests” that included phrases like “how to burn Jews” and “Jew hater.” Although Facebook discontinued these interest categories after being exposed, the episode illustrates the anti-social outcomes that algorithmic decision-making can produce. While Internet companies prevent and supervise the publication of illegal information on their platforms, they also need to improve their self-management, clarify standards of behavior, and comply with established norms of Internet communication.
The government’s legal framework and sanctions for illegal speech on platforms
Alongside the strict scrutiny of platform information, governments should also strengthen the legal framework of the networked society. Underdeveloped network security technology and an imperfect legal system make it easy for the opinions and information published by self-media and ordinary users to cross into illegality or even crime. The question for government is to what extent it should be involved, and what degree of toleration should apply when disparate views of the common good and human dignity clash in the digital public square (Thacker 2021). As misleading content on social media shows, it is simply not enough to take down the content; there must be a way to stop its continued spread (Kim 2020). Information publishers who violate laws and regulations on a platform, as well as online platforms that condone or assist such publishers in illegal activities, should be punished according to law; where a crime has been committed, the government should pursue criminal responsibility. Publishers who refuse to change and deliberately cause harm can be placed on a list of seriously untrustworthy actors under the law. For example, the German government issued its first fine under its new law to Facebook in July 2019: the company had to pay €2m (£1.7m) for under-reporting illegal activity on its platforms in Germany (BBC 2020). The government should also strengthen the normative guidance and supervision of Internet platforms’ behavior, encourage and support the release of platform information in compliance with laws and regulations, and promote the standardized development of the Internet industry.
“Dislike Button Social Media” by sergio santos is licensed under CC BY 2.0
User’s personal self-reflection
For the content supervision of Internet platforms, the most important factor is users themselves initiating supervision in the process of posting. Users should examine their own posts, publishing and disseminating content only where it does not violate relevant laws and platform regulations. In addition, pictures, texts, and videos published on a platform also face the supervision of other users: when encountering violent or pornographic content, users can use the platform’s reporting mechanism to flag posts with illegal content. Beyond such extreme content, remarks or comments made by ordinary users can themselves harm other users. Users may not realize that their language amounts to bullying or racism, but such content can ferment on a platform and seriously damage another user’s reputation. For example, in April, when people in China were locked at home because of the epidemic, a woman in Shanghai asked a deliveryman to bring food to her elderly parents’ house 21 km away. She wanted to pay him, but he refused. After the incident was posted online, netizens praised the deliveryman’s behavior, but they also began to scold the woman for not paying corresponding compensation and attacked her personally. She suffered a psychological breakdown and jumped to her death from a building the next day. Users who express supposedly objective opinions on a platform without a clear understanding of the facts can cause cyberbullying; users should reflect on themselves before expressing their thoughts or remarks.
Conclusion
The information governance of Internet platforms needs standardized, lawful management, and the platforms themselves need to be rectified and managed. Users who exploit Internet platforms to spread illegal information should face appropriate punishment and severe sanctions from Internet companies and governments. In addition, users themselves need to strengthen their own core values and practice self-censorship on the platforms they use.
References:
BBC. (2020, February 12). Social Media: How Do Other Governments Regulate It? BBC News. https://www.bbc.com/news/technology-47135058
Commission of the European Communities. (1996, October 16). Illegal and Harmful Content on the Internet. EUR-Lex. https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:1996:0487:FIN:en:PDF
Glaser, A. (2018, January 18). Want a Terrible Job? Facebook and Google May Be Hiring. Slate. https://slate.com/technology/2018/01/facebook-and-google-are-building-an-army-of-content-moderators-for-2018.html
Kim, J. (2020). The Need for Stricter Control of Social Media by the US Government During the COVID-19 Epidemic. Voices in Bioethics, 6. https://doi.org/10.7916/vib.v6i.5895
Keller, D. (2018, June 4). Toward a Clearer Conversation About Platform Liability. Knight First Amendment Institute. https://knightcolumbia.org/content/toward-clearer-conversation-about-platform-liability
Simonite, T. (2018, December 18). AI Has Started Cleaning Up Facebook, but Can It Finish? Wired. https://www.wired.com/story/ai-has-started-cleaning-facebook-can-it-finish/
Thacker, J. (2021, June 30). Should the Government Regulate Social Media? Jason Thacker. https://jasonthacker.com/2021/06/30/should-the-government-regulate-social-media/