Content Moderation: The Eternal Challenge of Digital Platforms

"Internet A Series Of Tubes" by Jeremy Brooks is licensed under CC BY-NC 2.0.
“Social Media Logos” by BrickinNick is licensed under CC BY-NC 2.0.

With the advent of the Web 2.0 era and many complex external factors, people are increasingly accessing Internet content through social media platforms and applications (Flew & Suzor, 2019). These platforms are almost always stable and profitable, and large-scale, privately owned digital media continue to grow in economic and cultural power (Gillespie, 2019). Numerous well-known companies, including Facebook, Amazon, and TikTok, have developed as a result of technological advances and the widespread use of media platforms. As our public discourse, cultural production, and social interactions increasingly move online (Gillespie, 2019), one of the causes of the escalating worldwide “techlash,” the model of community communication in the modern Internet era has also altered considerably.


On these digital platforms, bullying, offensive speech, violent content, pornography, harassment, and other problematic material are gradually spreading, and these problems are largely caused by inadequate regulation by platforms and government agencies. This new set of state responses to the development of the Internet has given rise to a new shift in focus, the “regulatory turn” (Schlesinger, 2020). Content auditing has become critical and may shape the future of the public domain. Content auditing is the screening of information posted by online users to Internet sites, social media, and other online channels. While this screening is performed mainly by big-data algorithms, it would be an incalculably large job if done by humans alone. Although media platforms have multiple rules for the “management” of users, and governments regulate in turn, loopholes remain in platform regulation. Platforms therefore face a huge challenge: how to intervene in the spread of undesirable content and so maintain healthy communities that help the Internet develop sustainably.


Inadequate regulation of platforms?

Social media platforms create a space for the public to disseminate and exchange information, and a truly “open” platform resonates with utopian notions of community and democracy (Gillespie, 2019). With the public both disseminating and receiving information on these platforms, regulatory regimes face real difficulty (Flew & Suzor, 2019). In China, for example, where the number of Internet users approaches 1 billion, cases involving online media platforms are growing year by year; the number of cases received in 2020 rose 30.12% year on year (Beijing Internet Court, 2021). Globally, Internet users had climbed to 4.95 billion by the beginning of 2022, an Internet penetration of 62.5% of the world’s total population (Hannahcurrey, 2022). Beyond the issue of scale, the complex process of classifying user-uploaded material as acceptable or rejected is far beyond the capacity of software or algorithms alone (Roberts, 2019). The loopholes in platform regulation and the difficulty of regulating are thus evident: an enormous user base, immature technology, inadequate rules, and the complexity of individual cases, among other factors.


The status of platforms and their governance

There are numerous undesirable phenomena on media platforms, and platforms do have countermeasures; TikTok, for example, reviews videos and reported live streams and takes down offending content. Speech, however, has always been difficult to define and regulate in platform governance. Because content is user-driven, biased comments or malicious speech can cause psychological or cultural harm to the person targeted, and false speech can cause economic damage or mislead the public. There is also the proliferation of “fake news” on media platforms (Gillespie, 2019); “fake news” can mislead voters or profit from deluded users. During the 2016 United States presidential campaign, for example, there was interference in online messaging (“Russian interference in the 2016 United States elections,” 2022). Many concerns have been raised about platforms’ involvement in spreading “fake news” and allegedly manipulating elections for fraudulent ends. There have also been allegations of privacy breaches and data misuse by platforms, abuse of market power by certain agencies or hackers, and, more seriously, leaks of state secrets.

“Unwanted politicians: Clinton AND Trump” by Can Pac Swire is licensed under CC BY-NC 2.0.


Regulation begins with the media platforms themselves; as Gillespie (2019) notes, a platform that imposed no rules at all would be untenable. Platforms require significant manpower and resources to handle complaints, and establishing a security and moderation department presupposes expanded staffing and technical support. Media platforms also need better rules to govern users’ bad behavior, such as clear notices to users before they post and stronger penalties for those who disseminate harmful content. Defining objectionable content is itself a major challenge in platform governance, since even some shocking atrocities may contain defensible material. One of the biggest challenges for platforms is therefore to establish and implement a content review mechanism that addresses both extremes (Gillespie, 2019).


Government Involvement

Regulators tend to develop policies that supplement or give legal effect to laws, which carry authority and are better able to restrain wrongdoers. Current public concern about political communications that are “fake news” or “misinformation” is underlined by the significant economic losses such news has caused in Ukraine.

“Cloud Solutions – Creative Commons” by NEC Corporation of America is licensed under CC BY 2.0.

On February 15, 2022, the Ukrainian side criticized the Western media for spreading false information that Russia was preparing to “invade” Ukraine, stating that the resulting panic had caused significant losses to the Ukrainian economy, for instance by leading many exporters to refuse orders (Wesolowski, 2022). The operation of the regulatory process, and the principles that underpin it, are closely related to the forms of state and economic relations prevalent in any social order (Schlesinger, 2020). One strong EU response was the publication of the “Code of conduct on countering illegal hate speech online” (European Commission, 2016). Government departments should therefore strengthen the management of platforms and improve the law so that it works in practice. Preventing false reports that harm the country protects both citizens and the state.


A bad online environment: the role of individuals

Users are the source of every incident of undesirable content. Users are well aware of how social media platforms are regulated (Gillespie, 2019), and citizenship confers obligations as well as rights when we take on its trappings (Schlesinger, 2020). Individuals should therefore join in protecting the Internet environment by monitoring themselves and exercising good virtue and citizenship; this maintains a healthy online environment while greatly reducing the difficulty and workload of regulation for platforms and government departments. Most people would prefer that platform communities police themselves, but the better outcome is for users to refrain from posting objectionable content in the first place (Gillespie, 2019).

“Freedom of Expression” by littlestar19 is licensed under CC BY-NC 2.0.


Is moderation truly equitable?

The social media industry is built on the capitalist ideals of Silicon Valley

It’s no secret that the majority of tech platforms and companies have an agenda: a money-over-ethics mentality. When there is a conflict of interest, these businesses do not prioritize user protection. Even now, the public sphere remains a crucial setting for government involvement, as well as a key setting in which such companies develop their plans and methods.

For instance, one site recently removed a television drama that had passed review but drew complaints from viewers that its content was “disgusting,” “insulting to women,” and “lowbrow.” In the comments discussing the takedown, netizens mocked a review system under which a drama “insulting to women” could pass while an animated film with elaborate and beautiful special effects could be kept off the shelves. The rules imposed by social media platforms today respond to contemporary fears such as terrorism and bullying (Gillespie, 2019), but they should also be revisited so that every piece of media content is treated fairly.


Conclusion

The organization of media outlets into an online public provides new chances for interaction and communication with a wider range of people. A new, seemingly utopian online world has emerged, but there are also risks to be aware of, including pornography, obscenity, violence, criminal activity, abuse, and hatred (Gillespie, 2019).

“Social Media Influence” by Intersection Digital is licensed under CC BY-NC 2.0.

Media material is progressively flowing into digital media and communication platform firms in quest of new audiences (Flew & Suzor, 2019). Platform moderation of users is essential, and governments should enhance and improve their oversight of platforms and, more crucially, of users themselves.

As the user base grows and diversifies, platforms host people and entire communities with extremely varied value systems (Gillespie, 2019). We collectively look forward to reforms through which platforms regulate content and resolve disputes, and the government sector needs to find a balance that is fair and comfortable for the public while helping to stop the spread of undesirable content.


Reference list

Gillespie, T. (2019). All Platforms Moderate. In Custodians of the Internet (pp. 1–23). Yale University Press. https://doi.org/10.12987/9780300235029-001


Roberts, S. T. (2019). Understanding Commercial Content Moderation. In Behind the Screen (pp. 33–72). Yale University Press. https://doi.org/10.12987/9780300245318-003


Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7–8), 1545–1563. https://doi.org/10.1177/0163443720948003


Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1

Hannahcurrey. (2022, February 1). Digital 2022: Another year of bumper growth. We Are Social. Retrieved October 13, 2022, from https://wearesocial.com/cn/blog/2022/01/digital-2022-another-year-of-bumper-growth/

Beijing Internet Court. (2021, July 1). Year-on-year growth of cases involving online social media platforms. Retrieved October 13, 2022, from https://baijiahao.baidu.com/s?id=1701341837658941017&wfr=spider&for=pc


Russian interference in the 2016 United States elections. (2022, October 10). In Wikipedia. Retrieved October 14, 2022, from https://en.wikipedia.org/wiki/Russian_interference_in_the_2016_United_States_elections

Wesolowski, K. (2022, April 28). Fact check: Fake news thrives amid Russia-Ukraine war. DW. Retrieved October 13, 2022, from https://www.dw.com/en/fact-check-fake-news-thrives-amid-russia-ukraine-war/a-61477502

European Commission. (2016). Code of conduct on countering illegal hate speech online. Brussels: European Commission. https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en