Platform self-regulation requires the assistance of government supervision

Introduction

According to Flew et al. (2019), as the internet industry trends toward oligopoly, it is becoming increasingly urgent for actors of different kinds to strengthen the supervision of digital platforms. False information, privacy breaches, data abuse, and growing online hate speech and harassment are steadily eroding the vision that first drew people to these platforms. The scale and scope of this widely criticized content force us to consider whether existing media policies and regulatory forms can effectively address these problems, and whether platform self-regulation needs to be combined with government intervention and control as a third party. Because digital platforms lack clear rules of operation, user trust, a basic public resource, has to some extent been abandoned and traded away in the pursuit of self-interest (Cusumano et al., 2021). So in my opinion, the existence and continued expansion of online harms show that platforms cannot achieve complete self-regulation, or that self-regulation alone is not an effective measure and approach. More binding and credible government regulatory policies can serve as a tool that assists self-regulation, playing a positive auxiliary role in restoring public trust and reducing online harm.

Digital platforms bring individuals and organizations together so that they can innovate and interact using modern software and network technologies, and they enable multiple market participants to trade digital resources. In contrast to government regulation, self-regulation can be described as any rules imposed by non-governmental actors, and more specifically as steps taken by companies or industry organizations to pre-empt or supplement government rules and guidelines (Cusumano et al., 2021).

Self-regulation is not perfect

  • Platforms' self-positioning leads to loose and permissive moderation policies:

Platforms like to emphasize the wide, open dissemination of content, as well as the fairness of their regulatory processes. Reviewing and screening content for removal is a key step by which a platform self-regulates and protects its own and its company's interests. Yet whether performed by machines, by humans, or by a combination of the two, content moderation in social media and other applications that rely on user-generated content has always been a relatively opaque process that is never fully public. Many social media sites and other user-facing platforms treat detailed internal information about review timelines and policies as proprietary and decline to disclose it fully. As Gillespie (2018) observes, platforms typically define themselves as open, impartial, and non-interventionist: partly because their founders genuinely believe in that self-image, and partly to avoid otherwise unavoidable obligations and responsibilities. When their interests are at stake, these global digital media companies choose to delay, defend themselves, and look for management strategies. Platform management insists that the platform is merely a content intermediary or service provider, which creates difficulties when it must deal with the content it hosts. To make it easier for users to distribute content, communicate with one another, and operate in a more open online environment, the platform positions itself as a curator of and intermediary for online content, profiting from its terms and agreements as well as its commercial footing. Its growing responsibility for managing and monitoring user activity serves not only to meet legal and treaty requirements but also to avoid losing users who have been offended and harassed, while reassuring advertisers that their brands are associated with healthy online communities, thereby building and protecting the corporate image (Flew et al., 2019).

  • Long-standing cultural disputes make it difficult to define what is wrong:

Although social media platforms originally presented themselves as universal services suitable for everyone, the participation of communities and groups with different value systems means that platform content does not always suit people of different cultural backgrounds and tolerances. It is difficult to strike a balance, or to set clear standards, between promoting expression and protecting communities from inequality rooted in gender, race, or class. Those who find platforms too tolerant argue that the content is saturated with extremes such as obscenity, racial discrimination, and self-harm; those who criticize platforms for excessive intervention argue that culturally valuable material is also widely banned and forcibly deleted during review. Creating rules that apply to everyone in a self-governed online community is therefore a difficult and cumbersome task. The interplay between real life and online platforms drives the growth and diversity of the user base, which in turn shapes the integrity and cultural diversity of what the platform represents (Gillespie, 2018).

“Screenshot 2022-08-19 at 03.15.08” by Wangyibo Cr Yue Hua is licensed under CC BY-SA 4.0.

According to the analysis of Basu (2022), the “227 incident” broke out on Chinese social media platforms in February 2020. A devoted fan of the male celebrity Xiao Zhan had published a self-written fan novel featuring the star and another Chinese actor, Wang Yibo, on the fan-fiction platform AO3. Confronted with the widely circulated fan fiction (some of it explicit and pornographic), a large majority of fans of different types and allegiances could not accept this fabricated image of their idols, so they used their collective power to mass-report the platform, which was then permanently blocked for users in mainland China by the state's internet censorship system. The platform's loyal users erupted in outrage, triggering a large-scale and long-lasting episode of online violence that severely damaged Xiao Zhan's social media accounts and personal brand endorsements. In the words of C-entRoyals (2023), “Shamelessly, Xiao Zhan’s fans are trying to manipulate the public by accusing all those who hate and criticize xz are Wang Yibo’s fans, even though they claim there’re only 300 Wyb fans.” What can be felt directly here is that, in the face of differing values and standards, the results of a platform's self-moderation are prone to bias and to user dissatisfaction.

  • Employee diversity & algorithms:

The vast majority of platforms' full-time technical employees are white men, which can lead review teams to neglect minority perspectives when moderating content. The central problem in platform auditing is that reviewers cannot process information at scale: the volume and speed of user-generated content make pre-publication review by the platform effectively impossible. YouTube, for example, relies on machine-learning algorithms to screen risky content and process complaints, while also recruiting users to flag offensive material such as text and videos. Only when the automated audit fires, or enough user complaints accumulate, is the content escalated into the human review queue, as sketched below. The complexity of the content and the sheer workload make the human review process difficult and risky (Flew et al., 2019).
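To make this two-path escalation concrete, here is a minimal, purely illustrative Python sketch of the triage logic described above. The thresholds, the `ContentItem` fields, and the classifier score are assumptions invented for the example; they do not describe YouTube's actual system.

```python
from dataclasses import dataclass

# Illustrative thresholds -- invented for this sketch, not real platform values.
MODEL_SCORE_THRESHOLD = 0.85   # classifier confidence above which content is machine-flagged
REPORT_COUNT_THRESHOLD = 5     # user reports needed to trigger escalation

@dataclass
class ContentItem:
    item_id: str
    model_risk_score: float      # hypothetical ML classifier output in [0, 1]
    user_reports: int = 0        # complaints accumulated from users

def needs_human_review(item: ContentItem) -> bool:
    """Escalate only via the two paths described above: a machine audit
    fires, or enough user complaints accumulate. Everything else is
    never seen by a human reviewer."""
    return (item.model_risk_score >= MODEL_SCORE_THRESHOLD
            or item.user_reports >= REPORT_COUNT_THRESHOLD)

def triage(items: list[ContentItem]) -> list[ContentItem]:
    """Return the subset of uploads routed to the human-review queue."""
    return [item for item in items if needs_human_review(item)]

if __name__ == "__main__":
    uploads = [
        ContentItem("video-1", model_risk_score=0.10, user_reports=0),  # never reviewed
        ContentItem("video-2", model_risk_score=0.92, user_reports=0),  # machine-flagged
        ContentItem("video-3", model_risk_score=0.40, user_reports=7),  # user-flagged
    ]
    for item in triage(uploads):
        print(f"{item.item_id} -> human review queue")
```

The point of the sketch is the asymmetry it encodes: any content that neither the model nor enough users flag simply never reaches a human, which is exactly why pre-publication moderation at platform scale is described as nearly impossible.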

What about government regulation?

Cusumano et al. (2021) point out that government regulation can take direct forms, such as legislation and penalties for violations, as well as indirect forms, such as taxes, permits, and similar measures. That platforms need to be regulated is a view shared by the public and politicians alike. Credible government action can affect an entire industry: without pressure from government actors and public groups, firms are less likely to self-regulate. The prospect of more intrusive government regulation is believed to be capable of shaping industry-level rating systems and self-enforcement agreements. In the internet era, even moderate-cost and more proactive forms of government regulation can encourage platforms to self-regulate proactively.

Conclusion:

So I believe that complete platform self-regulation cannot be achieved or guaranteed, given platforms' self-positioning, persistent cultural disputes, and the limits of workforce diversity and algorithmic moderation. A credible government role as a second regulatory actor can help reduce the occurrence of harmful information and inequality on the internet.

References:

  • Gillespie, T. (2018). All Platforms Moderate. In Custodians of the Internet (pp. 1–23). Yale University Press. https://doi.org/10.12987/9780300235029-001
  • Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1
  • Cusumano, M. A., Gawer, A., & Yoffie, D. B. (2021). Can self-regulation save digital platforms? Industrial and Corporate Change, 30(5), 1259–1285.
  • Basu, S. (2022). The Xiao Zhan Controversy and the Case of Misplaced Fan Activism.
  • C-entRoyals [@centroyals]. (2023, May 29). During 227, based on incomplete statistics, there are 94,483 victims of #XiaoZhan’s fans [Tweet]. X. https://x.com/centroyals/status/1663157106663366657?s=20
  • B2Bwhiteboard. (2023, July 6). Government regulation or self-regulation for digital platforms? [Video]. YouTube. https://www.youtube.com/watch?v=eKd09_jciZg