Who should be responsible for stopping the spread of problematic content on digital platforms?

With the rapid globalization of technology, a comment or video posted by one online user can quickly be retweeted and commented on by users around the world. Establishing a regulatory authority can deliver certain results; however, the effectiveness of any regulatory regime has yet to be fully demonstrated (Flew et al., 2019). Some individuals publish harmful images or comments on various internet platforms, such as bloody, violent videos and racist remarks. This article discusses the responsibility of netizens, internet companies and the government for discouraging the publication of such undesirable content.

Platforms and Online Users

“2nd Anoniversary 11” by Anonymous9000 is licensed under CC BY 2.0.

Online users and media companies may both bear some responsibility for the publication of vulgar content. Some internet users who crave attention constantly share their personal experiences and opinions about their lives online, and this oversharing can bore their audiences. Such behaviour may irritate viewers and provoke even worse effects; for example, other users may respond with insulting comments and conduct. These actions can have negative social implications. Muldoon (2022) argues that platform companies provide complimentary services to individuals and then leverage the customer information they collect to target advertising in exchange for profit. In addition, the platforms' censorship mechanisms have not effectively restricted the posting of vulgar and bloody videos and images. Western media platforms tend not to restrict the content their posters publish, preferring to ensure freedom of expression. They offer wider and freer opportunities for communication and interaction, but there are no clear restrictions on pornographic, illegal and dangerous expression. The community platform instead expects publishers to regulate themselves or to supervise one another.


The platform could incorporate a manual review process to block the accounts of users who repeatedly post bloody content. Such a procedure would also reduce the unpleasant practice of plagiarism and script-washing among bloggers. The platform should probably introduce restrictions on certain vile content to reduce the harmful influence it has on minors. At the same time, it could promote more positive content to foster a harmonious and amicable online society. Online users are likewise expected to strengthen their moral discipline and prevent their inappropriate comments from indirectly hurting others. Platforms and users should each assume part of the responsibility for inappropriate comments: members of the public can reduce such incidents by exercising moral restraint, and digital platforms by strengthening their content auditing.


The Responsibility of the Government

“Government” by Nick Youngson is licensed under CC BY-SA 3.0. Retrieved from: https://www.picpedia.org/highway-signs/g/government.html

Policies instituted by governments can also help prevent the spread of objectionable content. The first point discussed in the Social Media Council (SMC) model concerns the rules for content review. International human rights standards provide an appropriate universal legal framework, but there may be different ways of applying that set of rules (Docquir & Centre for International Governance Innovation, 2019). Internationally established standards cannot simply be applied to any individual jurisdiction; they only provide a guideline. Individual governments should follow their own laws and develop censorship rules that operate under stricter guidelines than a broad reference to international standards. This would also create space for businesses to exercise their own judgement about what may be expressed on their platforms. According to Magnusson et al. (2012, p. 2), nearly half of the Swedish population uses Facebook. Government agencies, such as municipalities, have therefore started to use Facebook to distribute information. Municipalities use their Facebook walls mainly for marketing activities, while citizens have demonstrated various ways of using them, including requesting information or services, reporting service failures and making complaints. While people go online for information, work and entertainment, they are also able to view the latest announcements released by the government. The public can share their thoughts and suggestions for improvement on a government's Facebook page, and the government can collect this feedback rapidly to improve its subsequent work.

“French and US social media icons” by jenniferlweeks is licensed under CC BY-NC 2.0.

The government also imposes policies to protect the rights of users. In an environment of data transparency, a user of a free app provides many personal details. If this information were sold or misused by companies, it would cause serious social repercussions. The rules promulgated by the government may also affect the interests of certain groups. For example, a small number of platform companies that combine data collection, media distribution and advertising now have enormous power to shape behaviour. Companies such as Facebook and Google can apply artificial intelligence (AI) to optimize and target the information sent to us based on a complex profile of individuals' weaknesses, demands and tools, enough to change individuals' behaviour, from purchasing to dating and voting (Tambini, 2019). When people open their phones to browse the news or their email, they invariably receive advertisements placed by businesses; this is a capitalist's means of stimulating spending. However, if politicians fill their web pages with false slogans to attract voters, the online platform becomes a tool of political struggle.

The government, the platform and the user should all bear part of the responsibility when a deplorable video or comment is published. A harmonious online society is more likely when online users hold themselves to a high standard and are not interested in attracting mass attention by publishing vulgar content. If online platform companies improve their rules for regulating content and strengthen censorship, they can moderate terrible expression at its origin and help create a safe online environment. Governments could develop their own practical regulations based on international standards; this is an effective approach that may directly protect the rights and freedoms of internet users. However, such regulation could also become a tool of political struggle. The government should therefore regulate online content through methods that are both consistent with the common welfare and transparent.


References

Docquir, P. F., & Centre for International Governance Innovation. (2019). The Social Media Council: Bringing Human Rights Standards to Content Moderation on Social Media. In Models for Platform Governance (pp. 9–13). Centre for International Governance Innovation. http://www.jstor.org/stable/resrep26127.4

Etlinger, S., & Centre for International Governance Innovation. (2019). What’s So Difficult about Social Media Platform Governance? In Models for Platform Governance (pp. 20–26). Centre for International Governance Innovation. http://www.jstor.org/stable/resrep26127.6

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1

Muldoon, J. (2022). Platform Socialism: How to Reclaim our Digital Future from Big Tech. Pluto Press. https://doi.org/10.2307/j.ctv272454p

Magnusson, M., Bellström, P., & Thoren, C. (2012, July 29). Facebook usage in government: A case study of information content. ResearchGate. https://www.researchgate.net/publication/270449424_Facebook_usage_in_government_A_case_study_of_information_content

TV, B. (2022). Netizens think these idols are OVERDOING to get attention [Video]. YouTube. https://www.youtube.com/watch?v=2eF6z8_2D6g

Tambini, D., & Centre for International Governance Innovation. (2019). Rights and Responsibilities of Internet Intermediaries in Europe: The Need for Policy Coordination. In Models for Platform Governance (pp. 91–95). Centre for International Governance Innovation. http://www.jstor.org/stable/resrep26127.17