“Video Game Violence (55 / 365)” by somegeekintn is licensed under CC BY 2.0.
Since the Internet began to take over the world, audiences that were once passive have been eager to participate in it, seeing a rare opportunity. In the competition for attention, a great deal of unhealthy content has appeared on the Internet and on social media platforms. Bullying, harassment, violent content, hate speech, pornography and other problematic material circulate on digital platforms. Governments, platforms and users are all harmed by this, which is why each of them should take responsibility and do their part.
For governments, the spread of harmful information points to gaps in regulation and law. The public feels uncomfortable and uneasy after seeing such videos or pictures, and these emotions eventually turn into distrust of, and doubts about, those in power. In the past, both newspapers and television were reviewed several times before printing or broadcast; even if some truth was lost in the process, people did not see harmful information. In the modern era of rapid technological development, however, the Internet can carry a picture or video around the world in a flash, and policing it is difficult. If supervision is difficult, then governments should strengthen punitive measures and call on the public to boycott such harmful information. For example, legal provisions can be established that emphasize the severity of spreading harmful information online and warn those who attempt to do so. Disseminators who have been punished can also be required to post videos apologizing and warning others. In addition, governments can require real-name registration for Internet access, so that everyone knows the Internet is not beyond the law and must be responsible for their own words and deeds, weighing every word they type online. Finally, governments should punish social platforms that spread harmful content recklessly or fail to supervise it, so that every platform pays attention to the content its users post and reduces the spread of such material. Under these conditions, disseminating harmful information becomes both illegal and widely despised, and governments will no longer face suspicion and distrust from the public.
For platforms, harmful information is also damaging. "Platforms vary, in ways that matter both for the influence they can assert over users and for how they should be governed" (Gillespie, 2018). If a platform's moderation is not in place and disseminators of harmful information realize that management is loose, they will spread it aggressively and attract more people to do the same. The inevitable result is that ordinary users are pushed off the platform and move elsewhere; this outcome is unbearable for the platform, and an influx of bad-faith users is a harbinger of its eventual collapse. Meanwhile, moderators inside the platform are in a similarly dire position: "As a central and mission-critical activity in the work flow of online digital media production, commercial content moderation is little known, frequently low-wage/low-status, and generally outsourced" (Roberts, 2019). Reviewers are themselves exposed to harmful information, much of which causes sleepless nights and even nightmares. To reduce the appearance of harmful information, platforms should properly staff their review teams and provide them with psychological counseling on a regular basis. Platforms should also warn users who spread harmful information and ban them if necessary. At the same time, platforms should reward users who report harmful information, for example with medals or achievements on the platform, and call on every user to join in and supervise one another. In addition, platforms can lengthen the review window, so that pictures or videos published by users take a certain amount of time to reach other users, which helps block the spread of harmful material. Big data can also be a good assistant: a platform can analyze the accounts that have repeatedly viewed or shared harmful information, identify the users who spread it and ban them. "The fantasy of a truly 'open' platform is powerful, resonating with deep, utopian notions of community and democracy—but it is just that, a fantasy" (Gillespie, 2018). To protect social stability and people's mental health, online platforms need to give up some freedom in exchange for stability.
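To make the big-data idea above concrete, here is a minimal sketch in Python, assuming a hypothetical interaction log and a set of content items already labelled harmful by human moderators. The field names, the threshold value and the `accounts_to_review` helper are illustrative assumptions, not any platform's real pipeline or API.

```python
from collections import Counter

# Hypothetical interaction log: (account_id, content_id, action) tuples.
# In practice this would come from the platform's own analytics pipeline.
interaction_log = [
    ("user_a", "vid_101", "share"),
    ("user_a", "vid_102", "share"),
    ("user_b", "vid_101", "view"),
    ("user_a", "vid_103", "share"),
    ("user_c", "vid_104", "view"),
]

# Content already labelled harmful by human moderators (assumed input).
flagged_content = {"vid_101", "vid_102", "vid_103"}

# Accounts that share flagged content at least this many times are
# escalated for human review; the threshold is an arbitrary example value.
SHARE_THRESHOLD = 2

def accounts_to_review(log, flagged, threshold):
    """Count shares of flagged content per account and return repeat offenders."""
    share_counts = Counter(
        account for account, content, action in log
        if action == "share" and content in flagged
    )
    return [account for account, count in share_counts.items() if count >= threshold]

print(accounts_to_review(interaction_log, flagged_content, SHARE_THRESHOLD))
# -> ['user_a']
```

In this sketch the final decision is left to human reviewers; automated counting only surfaces candidate accounts, which is one way to use data without handing bans entirely to an algorithm.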
“IF 12: Cultivate” by n.W.s is licensed under CC BY 2.0.
As for users, they are both victims and, in some cases, beneficiaries. "Digital platforms are treated as neutral gateways between consumers and multiple applications, with consumers gaining benefits from information access" (Mansell & Steinmueller, 2020). Supply exists because there is demand, and that is why so much harmful information spreads. There is no doubt that users should unite to resist such information: because of the convenience of the Internet, even children can easily search for all kinds of content, and no one wants their children to be affected by harmful material. For the Internet and its platforms to remain good tools, users should neither spread harmful information nor pay for the relevant pictures or videos. If there is no profit behind it, most harmful information will be driven off the platforms, and the online environment will improve. At the same time, those who spread harmful information should be publicly condemned, so that others with the same intention understand that the consequences are serious and choose instead to help maintain a healthy online environment. Users should also pay more attention to positive information and actively spread it; that is what web platforms are built for. Everyone can participate in discussion, exchange ideas and agree to disagree. I believe that with everyone's efforts, harmful information on the Internet can be rooted out, and the Internet will come ever closer to the space for free discussion that everyone longs for.
In short, everyone, whether governments, platforms or users, should act as a custodian of the safety of online platforms. Everyone should take responsibility rather than play the blame game. Only by working together can we truly eliminate harmful information.
Reference list
Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press. https://doi.org/10.12987/9780300235029
Gillespie, T. (2018). Governance by and through platforms. In J. Burgess et al. (Eds.), The SAGE handbook of social media (pp. 254–278). SAGE Publications. https://ebookcentral-proquest-com.ezproxy.library.sydney.edu.au/lib/usyd/detail.action?docID=5151795
Leah Legal Criminal Defense. (2022). Posting harmful information on the internet: Laws and penalties. Retrieved October 8, 2022, from https://www.leahlegal.com/practice-areas/domestic-violence/posting-harmful-information-on-the-internet
Mansell, R., & Steinmueller, W. E. (2020). Economic analysis of platforms. In Advanced introduction to platform economics (pp. 35–54). Retrieved from ProQuest Ebook Central (sydney.edu.au)
Roberts, S. T. (2019). Understanding commercial content moderation. In Behind the screen: Content moderation in the shadows of social media (pp. 33–72). Yale University Press. https://doi.org/10.12987/9780300245318