Introduction: Digital platforms are flooded with problematic content
The Internet has enriched the world and made daily life more convenient. It now permeates every corner of social life, allowing people to share information, entertainment, shopping and more through the web. However, because of its low barrier to entry and rapid transmission speed, the network's negative effects have gradually emerged alongside its benefits. In the diversified Internet world, users on some digital platforms release pornographic, violent, hateful and vulgar material to attract public attention and create an "eyeball economy", whether to make a profit or to disturb the social order. This pollutes the environment of digital platforms and poses a serious threat to the harmony of the platform space. At the same time, the anonymity, virtuality and freedom of speech that online platforms afford provide, to a certain extent, a foundation for the high-speed spread of problematic content, making it difficult to control in the short term. This essay will argue that governments, network regulatory authorities, platforms and citizens (i.e. users) all need to assume responsibility for preventing the dissemination of problematic content, and will analyse the different ways and means each party can adopt to do so.
“Social Media Keyboard” by Shahid Abdullah is marked with CC0 1.0. Retrieved from: https://www.flickr.com/photos/71195909@N03/32891617344
Why and how governments and Internet regulators prevent the dissemination of problematic content
Internet platforms are multidimensional spaces in which thousands of messages and items of content are released every moment. The spread of negative content cannot be completely eliminated in a short time, so the question is how to minimise it. Governments and the relevant regulatory departments should assume responsibility for managing online platforms, with an obligation to clean up the spread of harmful content on the web and to strengthen their supervisory functions. A prolonged flood of problematic content on digital platforms, especially hate speech, undermines social stability and national peace and security. It is generally accepted that speech that is intentionally offensive and discriminatory towards an individual or group on the basis of inherent characteristics of identity, such as race, gender and religion, may constitute hate speech (United Nations, n.d.). Many countries have adopted different restrictions on online hate speech. Whereas in most countries removal of the speech and closure of accounts are the most common responses, Germany's penalties are notably severe.
In Germany, citizens who post hateful incitement about immigrants on Facebook can face real consequences. The government has taken steps against online insults, threats and harassment: authors can be criminally prosecuted, fined 1,000 dollars for minor infractions, and even imprisoned for serious offences (Satariano & Schuetze, 2022). The government supervises Internet disorder by subjecting offenders to legal and economic penalties (Gorwa, 2019), and these penalties apply to all online postings of negative content, including violence, pornography and harassment. Because bad content can easily lead the public's ideology and values astray, governments and regulators should set up dedicated online enforcement departments to monitor content in real time and prevent it from affecting social stability and ethnic unity. More importantly, strict measures are needed to resist the rampant spread of problematic content across digital platforms. Governments should also strengthen and improve network legislation to underpin a secure, harmonious and stable online environment. It is worth noting, however, that network supervision cannot rely on government legislation and enforcement alone: platforms also need to assist in formulating corresponding rules and to carry out multi-faceted, multi-level supervision and control of network content.
“No violence no hate speech” by faul is licensed under CC BY 2.0. Retrieved from: https://www.flickr.com/photos/98706376@N00/6329770277
Why and how platforms block the distribution of problematic content
Huffman, the co-founder of Reddit, introduced a content policy for the platform forbidding the publication of any content that incites violence, bullying or harassment against individuals or groups (Massanari, 2016). Only when platforms and governments complement each other and regulate together will the crackdown on undesirable content be effective. Digital platforms are the medium through which content is disseminated, providing publishers with a wide and free space. In virtual cyberspace, the screen weakens the public's sense of authenticity, and people under less psychological pressure are more likely to post violent content online. Early incidents of cyber-violence share typical characteristics: most are triggered by social events; the main form of violence is the "human flesh search" (crowdsourced doxxing), involving large numbers of Internet users; and they are accompanied by harassment, intimidation and other extreme actions, often with tragic consequences. In 2012, the Chinese director Chen Kaige made a feature film, "Caught in the Web", about a white-collar woman with cancer who failed to give up her seat to an elderly man on a bus and was exposed on the Internet (Toro, 2013). Her identity was leaked, she was lambasted and threatened with death, and she was eventually driven to suicide. Through topics, discussions, groups and other forms, more and more people gather and communicate online, where opinions are both assimilated and contested, expanding the connection between network participation, expression and society (Gillespie, 2018b). As a result, individual acts of cyber-violence are increasingly incited into group sieges, leading to negative communication behaviour that is frequent and hard to contain.
To address the anonymity of posting on platforms, a platform can adopt a real-name system, authenticating the real identity of registered users, which makes it easier to trace publishers and hold them accountable. A platform can also establish a back-end monitoring system and a database of blocked words to identify and filter sensitive or banned terms and intercept illegal content, thereby reducing the spread of bad content online (Gillespie, 2018a). Inevitably, automated systems miss some illegal content, so platform staff must also review posts personally: they identify bad content, warn publishers or force them to remove it, and block the accounts of those who do not comply (Gillespie, 2018a). In addition, a platform can set up a reporting system so that ordinary users can report and complain about such information for the platform to process. Together, these measures help platforms prevent the spread of problematic content and improve the accuracy and coverage of supervision.
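The blocked-word filtering and interception described above can be sketched in a few lines of code. This is a minimal illustration only, with a hypothetical word list and function names; a real platform would maintain a much larger database, handle obfuscated spellings, and combine this filter with human review and user reports:

```python
import re

# Hypothetical blocklist; a real platform would load thousands of terms
# from a maintained database (Gillespie's "database of blocked words").
BLOCKED_WORDS = {"slur1", "slur2", "threat"}

def find_blocked_words(post: str) -> set:
    """Return the blocked words that appear in a post (case-insensitive)."""
    tokens = re.findall(r"[a-z0-9']+", post.lower())
    return BLOCKED_WORDS.intersection(tokens)

def moderate(post: str) -> str:
    """Intercept posts containing banned terms before publication."""
    if find_blocked_words(post):
        return "rejected"    # intercepted; may trigger a warning to the author
    return "published"       # may still be flagged later via user reports
```

Simple token matching like this is cheap and runs before a post goes live, which is why it is usually the first layer of moderation; its blind spots (misspellings, coded language, context) are exactly what the human review and reporting systems mentioned above are meant to catch.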
“Internet Open” by balleyne is licensed under CC BY 2.0. Retrieved from: https://www.flickr.com/photos/12494132@N00/2668834386
People need to cultivate self-discipline and discernment from an early age
The key to preventing the spread of problematic content is to resist it at its source: citizens should voluntarily refrain from posting objectionable content and hold themselves to strict standards. Such content affects not only the social, cultural and online environment but also individual minds, especially those of teenagers, who are not yet capable of sound discernment. According to Internet pornography statistics, the average age of first exposure to online pornography is 11, and the majority of users of pornographic sites are male (Ecofunomics, 2020). Pornography fosters distorted conceptions of gender among young people and can also lead to physical violence and abuse. This makes it harder to guide and educate teenagers, gradually corrupts their ideas and values, and contributes to sexual bullying and rape on campus at an ever younger age. Adults and children alike should cultivate a conscious awareness not to publish or spread bad information. At the same time, people should improve their ability to identify and analyse such content, and learn to report and complain when they encounter it.
“pornography” by CGP Grey is licensed under CC BY 2.0. Retrieved from: https://www.flickr.com/photos/52890443@N02/4889181009
Conclusion
In short, the Internet is not a lawless place, and netizens need to be careful about their words and actions. The Internet has become an important platform for cultural creation and communication; if digital platforms are left unmanaged, the result will be the decay of social culture, the destruction of social harmony and stability, and damage to mainstream social ideology. There is still a long way to go in regulating the content of digital platforms, and all parties in society must continue their efforts to balance freedom of speech against the content posted online.
Reference list
Dignam, A. (2015, November 6). The societal impact of cyberbullying. A Lust For Life – Irish Mental Health Charity in Ireland. https://www.alustforlife.com/the-bigger-picture/the-societal-impact-of-cyberbullying?gclid=EAIaIQobChMI4aTU94PS-gIVRZpmAh1WYgyxEAAYAiAAEgL9rPD_BwE
Ecofunomics. (2020, September 26). Pornography and Economics: The rare talked Topic. Eco-Fun-Omics. https://ecofunomics.com/economics/673/
Gillespie, T. (2018a). All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press.
Gillespie, T. (2018b). Governance by and through platforms. In The SAGE handbook of social media (1st ed., pp. 254–278). SAGE.
Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407
Massanari, A. (2016). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807
United Nations. (n.d.). What is hate speech? Retrieved October 8, 2022, from https://www.un.org/en/hate-speech/understanding-hate-speech/what-is-hate-speech
Satariano, A., & Schuetze, C. F. (2022, September 23). Where online hate speech can bring the police to your door. The New York Times. https://www.nytimes.com/2022/09/23/technology/germany-internet-speech-arrest.html
Toro, G. (2013, December 7). Review: Chen Kaige's internet drama "Caught in the Web". IndieWire. https://www.indiewire.com/2013/12/review-chen-kaiges-internet-drama-caught-in-the-web-90910/