Self-regulation? Government regulation? Co-regulation?

The dispute between self-regulation and government regulation of violating content resembles a trolley problem. Platform companies are among the world's largest technology firms, with hundreds of millions of users, and they can potentially influence socio-cultural, economic, political, and national-security outcomes. Content-sharing platforms emphasize their own regulatory efforts in order to keep new users circulating, protect their corporate image, and overstate their institutional ethics. Inevitably, the 'techlash' partially demonstrated that platforms are contentious and visibly confused about self-regulation (Schlesinger, 2020).
On the other hand, national and international governments are seeking strategies to regulate globalized platform companies. The 'complexity of regulating in a contested global arena' becomes a diverse and difficult problem when harmful content circulates through mass communication (Flew et al., 2019). Because government regulation lacks accountability when confronted with the internet's core value of 'freedom of speech', there is no clear balance point for deciding who is truly responsible for stopping the spread of negative content. This essay discusses the indispensable roles and accountability of both forms of regulation, focusing on a contemporary framework for how platforms and governments should tackle harmful content in different ways. The essay's stance is partially against government regulation, while acknowledging that government regulation can be effective in stopping the spread of negative content.
Benefits and Limitations of Self-Regulation
“Wordle Cloud of the Internet Marketing Blog – 08/15/08” by DavidErickson is licensed under CC BY-NC 2.0.
In the contemporary framework, the unprecedented scale of mass communication has enabled prosocial uses for users, private companies, platforms, and governments. Because platform companies operate their self-regulatory frameworks globally, their soft policies can be applied across most national regulatory contexts. In addition, platforms use algorithmic moderation to tackle content at scale. The growing role of AI moderation, for instance, offers several benefits for self-regulation (Darbinyan, 2022), as the simplified sketch after this list illustrates:
- Scalability and speed: Because human moderators struggle to keep pace with violating content, algorithmic moderation can detect and analyze user-generated content, rapidly processing enormous amounts of data.
- Automation and content filtering: Algorithmic moderation can filter and classify text, images, and video for violations. The development of AI efficiently replaces human moderators in keeping platforms safe and clean.
- Less exposure to harmful content: Harmful material such as sexual harassment, pornography, adult content, and cyberbullying circulates in everyday discourse, and exposure to it takes a psychological toll on human moderators; automated filtering reduces how much of it they must view directly.
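To make these benefits concrete, the sketch below shows, in plain Python, how an automated triage pipeline might score and route user-generated posts. The flagged terms, thresholds, and routing rules are entirely hypothetical and stand in for the far more sophisticated classifiers that real platforms deploy (Darbinyan, 2022); the point is only to illustrate how automation can process posts at scale and shield human moderators from the clearest violations.

```python
# Hypothetical sketch of automated content triage, illustrating algorithmic
# moderation at scale. The flagged terms, thresholds, and routing rules are
# invented for illustration, not drawn from any real platform's system.

from dataclasses import dataclass

# Toy list of flagged terms standing in for a trained classifier's signals.
FLAGGED_TERMS = {"harassment", "hate", "abuse"}


@dataclass
class ModerationDecision:
    post_id: int
    action: str    # "allow", "human_review", or "remove"
    score: float   # crude severity score in [0, 1]


def score_post(text: str) -> float:
    """Return a crude severity score: the share of words that are flagged."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in FLAGGED_TERMS)
    return hits / len(words)


def moderate(post_id: int, text: str) -> ModerationDecision:
    """Route a post: auto-remove clear violations, escalate borderline cases."""
    score = score_post(text)
    if score >= 0.5:
        action = "remove"        # high-confidence violation, no human exposure
    elif score > 0.0:
        action = "human_review"  # borderline content escalated to a moderator
    else:
        action = "allow"
    return ModerationDecision(post_id, action, score)


if __name__ == "__main__":
    posts = [
        (1, "Lovely weather in Sydney today"),   # allowed automatically
        (2, "pure hate and abuse harassment"),   # removed automatically
        (3, "report this harassment please"),    # escalated to human review
    ]
    for pid, text in posts:
        print(moderate(pid, text))
```

In this toy pipeline, only the borderline middle band ever reaches a human, which is how automation delivers both the scalability and the reduced moderator exposure described above.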
To an extent, the core of 'self-regulation' is the technology embedded in the infrastructure that shapes the dynamics of platform-driven sociality. Governments and public institutions rely heavily on private online infrastructures for their functioning, for control of user data, and for economic success, which is why corporately owned platforms cannot hand their administration directly to governments without endangering their private and public investments. Correspondingly, the techlash has critically reshaped policy development at the local, national, and global levels (Smith, 2018).
The benefits of self-regulation are also heavily undermined: the speed of fragmentation and polarization requires platforms to be unbiased. From a political perspective, the economic and military rise of China is creating challenges for America, whose own hegemonic ideology has been questioned; if governments controlled the platforms, international relations would fragment further. Most importantly, large technology and telecommunications companies and intermediaries play dominant roles in moderating harmful content today (Flew et al., 2019). These companies are required to follow their domestic laws and to protect other countries' business interests. Regulation is hard and unclear, especially in large online communities, where the core freedom ideology of the platform society holds that communities govern themselves through democratic processes and shared values. The meaning-making process and symbolic language are critically important in the ritual model of mass communication; under mass criticism, the skill of self-regulation lies in protecting fundamental moral values in policy and not letting issues go 'too wrong'. In a broader sense, intermediaries have become scapegoats that continue to maintain transparency for the internet's development.
Benefits and Limitations of Government Regulation
To an extent, because the social and economic processes designed into algorithms and business models are hidden, the platform society has also become more opaque in the American context. In contrast to American platforms, the rapid emergence of Chinese platforms has raised different concerns:
- Intrusive role of the state: state ownership of user data, an emphasis on efficiency, a lack of free speech, and omnipresent government regulation.
- Multiparty infrastructure (Kloet et al., 2019): entanglement between state and corporations, and rapidly improving infrastructures for everyday discourse, e.g. Alipay and WeChat Pay.
- Different political ramifications: America highlights individualism and democracy. China emphasizes collectivism and socialism.
The benefits of a country-as-platform strategy could potentially shape the digital economy and social order of the platform society at the macro level. Most of this regulation differs from America's freedom ideologies, and China has remained vague about the term 'platform capitalism' because of its political stance. Although China is not a 'capitalist' society, it is a patriotic one, like many other countries. For instance, the Tangshan attack in China was a recent case that provoked an unprecedented level of mass criticism (Mao, 2022). On Weibo, many accounts and posts were deleted as feminists voiced support for the victims, while content reflecting sexual harassment and toxic masculinity continued to be disseminated by the government, social media companies, and users. The conflict is tragic and ironic, since structural oppression allows such problematic content (sexual harassment) to persist. When facing civilizational collapse or unjust structural oppression, political authorities become stubborn, hegemonic, and relentless. Aggressive public expectations are often algorithmically monitored and left disappointed, and the Tangshan attack dramatically deepened political depression and the disappearance of dissent in the Chinese platform society. Nevertheless, insofar as government regulation is promoted collaboratively between the state and platform companies, the Chinese platform society has clearly succeeded in constructing and maintaining a stable social order.
Why not separate regulations in different countries?
At present, China's corporate platforms are indirectly controlled by the government, while most European countries prefer a platform society managed by government and citizens in cooperation with private companies, in order to protect public values. If platforms are to continue to self-regulate, the only viable path is to strengthen the institutional organizations and intermediaries that build users' trust in their responsibility and accountability (Gillespie, 2018). Platforms must moderate content to protect users and ethical values. TikTok, for instance, forbids children under 13 from using the application (McCallum, 2022), yet statistics show that 44 per cent of children aged eight to twelve use such apps, where content is often negative, addictive, and conducive to cyberbullying within private boundaries. Self-regulation suits America and other countries that want to retain neoliberal and capitalist values in the platform society. From a personal perspective, democracy is an enlightened ideology and a virtue that should be respected and maintained in the platform society. Although governments may wish to control platforms, the majority of Western citizens highly value freedom ideologies. While surveillance concerns also erode users' trust, the public expectation of regulating fake news, political depression, extremist content, cyberbullying, sexual harassment, and hate speech has become critically urgent. The American government is unlikely to intervene severely if the five largest platform companies create clear and efficient methods of self-regulation. Notably, users themselves should take responsibility for stopping the spread of negative content. However, within the framework of mass communication, platforms should increasingly take on the responsibility and accountability to answer public expectations and to monitor user activity algorithmically.
44% of 8-12 year olds in the UK use TikTok, despite its policies forbidding under-13s on the platform. It is crucial to have effective solutions in place to protect kids from harmful online content, especially while using unregulated platforms. https://t.co/0e1MN2rLAB #eSafety
— Netsweeper (@netsweeper) October 12, 2022
Conclusion
Inevitably, toxic media culture and structural oppression are mapped across all platforms and countries. Although China has constructed seemingly perfect and effective regulation, freedom of speech is heavily constrained and patriotic values are encapsulated in social media, which fails to protect vulnerable groups or to answer public expectations directly. In the American context, technology companies, telecommunications firms, and intermediaries play distinctive roles in protecting social order; American platform companies, however, are broadly accepted at the national level but not at the international level. Anglo-Saxon countries (except America) have strong public expectations around moderating violent content and prefer co-regulation between state and companies, partially similar to the Chinese country-as-platform strategy. No regulatory regime is absolutely correct or effective; platform regulation means seeking the ethical solution that is most acceptable in each socio-cultural context.
Reference List
Darbinyan, R. (2022, June 14). The growing role of AI in content moderation. Forbes. https://www.forbes.com/sites/forbestechcouncil/2022/06/14/the-growing-role-of-ai-in-content-moderation/?sh=1ad8f0264a17
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50. https://doi.org/10.1386/jdmp.10.1.33_1
Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. https://doi.org/10.12987/9780300235029
Kloet, J., Poell, T., Guohua, Z., & Fai, C. Y. (2019). The platformization of Chinese society: Infrastructure, governance, and practice. Chinese Journal of Communication, 12(3), 249-256. https://doi.org/10.1080/17544750.2019.1644008
Mao, F. (2022, June 23). Tangshan and Xuzhou: Fury and questions over China's treatment of women. BBC News. https://www.bbc.com/news/world-asia-china-61906803
McCallum, S. (2022, September 26). TikTok may be fined £27m for failing to protect children. BBC News. https://www.bbc.com/news/technology-63033263
Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7-8), 1545-1563. https://doi.org/10.1177/0163443720948003
Smith, E. (2018, January 20). The techlash against Amazon, Facebook and Google—and what they can do. The Economist. https://www.economist.com/briefing/2018/01/20/the-techlash-against-amazon-facebook-and-google-and-what-they-can-do