
The early, lightly regulated model of opening the Internet to users created a shared open space for free expression and technological experimentation. Barlow (2021) even argued that cyberspace should not be constrained by the regulatory measures or institutions of the physical world. However, this regime of online freedom has not produced a healthy, positive, utopian network world. On the contrary, bullying, harassment, violence, hate speech, pornography and other problematic content hidden behind that freedom appear endlessly on major digital platforms, harming and misleading users. It is therefore necessary for digital platforms and governments to be clear about why the spread of such content must be stopped, and to intervene appropriately at the right time.
How can digital platforms stop it?
1. Strengthen the sense of online responsibility
The political preferences formed through the media elite's long history of lobbying not only strengthen the technology giants' competitive advantage within the industry and their voice in the commercial sphere, but also give them a deep foothold in digital platforms, making them stakeholders alongside the platforms and the government. Platforms profit from the advertising revenue these large technology companies provide and from selling customer information to them, and so they relax oversight of content that favors those companies. This seriously distorts the balance of content recommended on digital platforms and, at the same time, creates opportunities for other problematic content to spread.

Digital platforms should therefore make clear that they moderate content out of responsibility to the public and to society, rather than positioning themselves as mere intermediaries hosting online content, ignoring review, or indulging the spread of problematic material.
2. Decentralize the power to review and moderate
At present, major digital platforms (such as Google, Amazon and Facebook) screen problematic content by employing large numbers of low-paid content moderators to manually label problematic material, which is then used to teach artificial intelligence algorithms the review standards. These moderators are treated exploitatively by the platforms, with low pay, poor working environments, high-intensity pressure and a lack of social care (Roberts, 2019). Platforms also use quality-control mechanisms to force employees to police their own moderation behavior according to platform guidelines (TED Talks, 2020). This long-term imbalance of power breeds negative attitudes toward the work, leading to low efficiency and to content being missed or wrongly judged.
To address this situation, a platform can set up a headquarters database that automatically filters the most common problematic content as a first pass, working together with regional databases that batch-process problematic content under different screening standards for countries or regions with different cultures. In addition, the platform can select, from different regions, high-quality content publishers or individuals whose values match those of local users to judge problematic content, building a citizen review community similar to hacker culture. This can both relieve the heavy pressure on professional content moderators and strengthen discussion of topics on the platform, preventing the spread of problematic content more efficiently and quickly.
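As a rough illustration of this idea rather than any platform's actual system, the sketch below shows such a two-tier pipeline: a hypothetical headquarters rule set removes universally problematic categories first, and a region-specific rule set is applied afterwards. All category names, regions and actions are invented for illustration.

```python
from dataclasses import dataclass, field

# Hypothetical two-tier moderation pipeline: a central ("headquarters") filter
# removes content that is problematic everywhere, then a regional filter applies
# local screening standards. All rules and categories are invented examples.

GLOBAL_BLOCKLIST = {"child_abuse", "terrorism"}            # always removed
REGIONAL_RULES = {
    "region_a": {"nudity": "remove", "hate_speech": "warn"},
    "region_b": {"nudity": "warn", "hate_speech": "remove"},
}

@dataclass
class Post:
    text: str
    labels: set          # categories assigned by moderators or a classifier
    region: str
    actions: list = field(default_factory=list)

def moderate(post: Post) -> Post:
    # Tier 1: headquarters database of universally problematic categories.
    if post.labels & GLOBAL_BLOCKLIST:
        post.actions.append("remove")
        return post
    # Tier 2: regional database with culture-specific screening standards.
    rules = REGIONAL_RULES.get(post.region, {})
    for label in post.labels:
        if label in rules:
            post.actions.append(rules[label])
    if not post.actions:
        post.actions.append("allow")
    return post

if __name__ == "__main__":
    post = moderate(Post("example text", {"nudity"}, "region_b"))
    print(post.actions)  # ['warn'] under region_b's standards
```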
However, a citizen review community may carry hidden dangers, such as conflicts caused by disagreements among volunteer reviewers. Platforms can therefore learn from the history of hacker culture and emulate its value system, which combines the joy of the work itself with reputation among peers (Castells, 2002). The platform can use its data infrastructure to record each volunteer reviewer's work and achievements, and provide reward mechanisms such as awarding badges to high-quality reviewers and, each year, appointing the volunteers with the greatest contributions as community administrators who help resolve internal disagreements and conflicts of opinion.
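A minimal sketch of how such a reward mechanism could be recorded is given below; the badge threshold, the rule that the top contributor becomes the annual administrator, and all names are illustrative assumptions rather than an existing system.

```python
from collections import defaultdict

BADGE_THRESHOLD = 100   # invented cut-off for a "high-quality reviewer" badge

class ReviewerLedger:
    """Illustrative tally of volunteer reviewers' accepted moderation decisions."""

    def __init__(self):
        self.accepted_reviews = defaultdict(int)   # reviewer name -> accepted decisions

    def record_review(self, reviewer: str, accepted: bool) -> None:
        # Only decisions accepted by peers (or upheld on appeal) count toward reputation.
        if accepted:
            self.accepted_reviews[reviewer] += 1

    def badge_holders(self) -> list:
        # Reviewers who have crossed the badge threshold.
        return [r for r, n in self.accepted_reviews.items() if n >= BADGE_THRESHOLD]

    def annual_administrator(self):
        # The year's highest contributor is proposed as community administrator.
        if not self.accepted_reviews:
            return None
        return max(self.accepted_reviews, key=self.accepted_reviews.get)
```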
What can the government do?
1. Encourage and help trade unions develop

As the supervisor of digital platforms, the government should actively encourage the development of trade unions, so as to restrain, to some extent, the technology giants' monopoly in the media field. At the same time, the government should provide training or financial support to help unions keep pace with the development of digital technology, so that when platforms act against the interests of content moderators or the public (for example, by allowing problematic content to be disseminated and discussed), unions can protect moderators' working rights in time and the government can respond with countermeasures that safeguard the public interest.
2. Establish platform standards centered on public value
As policy maker, the government should consider the shared interests of society, the public and the platforms, and create public-value standards with collective interests at their core. As Van Dijck et al. (2018) emphasize, institutional and legal frameworks can maintain mutual checks and balances rather than monopoly. The government should therefore engage with digital platforms actively as a user, and use its policy-making power reasonably to formulate written institutional guidelines for platforms: for example, requiring platforms to pay content moderators no less than the local minimum wage, to improve employees' working environment and welfare, and to give employees the right to exercise a degree of their own judgment.
Form a multi-stakeholder supervision model
Platforms may not produce public discourse, but they certainly generate and shape the form that public discourse takes (Gillespie, 2018). Given how large the technology giants' monopoly power has become, it is necessary to give the government a degree of supervisory power over platforms. Specifically, the government can participate appropriately in platforms' processes for flagging, warning about or forcibly removing problematic content in order to prevent it from spreading. However, to resolve disagreements between government and platform, and to prevent the government from suppressing content wholesale or abusing its powers, each citizen review community should be given the right to vote collectively on whether a piece of content constitutes bullying, harassment, violence, hate speech, pornography or another problem, whether it should be prevented from spreading to the public, and how it should be handled. At the same time, these non-profit citizen communities should themselves be supervised jointly by the public-value standards established by the government and by the platforms' guidelines, so as to ensure that decisions continue to safeguard collective interests.
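One hedged way to picture the community vote described above is a simple majority tally over each community's collective verdict, with the government and platform guidelines acting as an outer check on which actions are permitted at all. The permitted actions, verdict labels and default behavior below are assumptions for illustration only.

```python
from collections import Counter
from typing import Dict

# Outer check: only actions consistent with the agreed public-value standards
# and platform guidelines may be chosen, whatever the communities vote.
PERMITTED_ACTIONS = {"remove", "warn", "allow"}

def decide(community_verdicts: Dict[str, str]) -> str:
    """Return the majority action among the citizen review communities' verdicts."""
    votes = Counter(v for v in community_verdicts.values() if v in PERMITTED_ACTIONS)
    if not votes:
        return "allow"   # no valid votes: default to taking no action
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    verdicts = {"community_1": "remove", "community_2": "warn", "community_3": "remove"}
    print(decide(verdicts))  # -> "remove"
```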

The “Napalm Girl” photograph is a typical example: Facebook forcibly removed it because it showed a naked nine-year-old girl (Levin et al., 2016). Under pressure from the government, news media and the community, however, the photo was reinstated a week later, on the grounds that, despite its nudity and violence, it carries great historical significance. Forming a multi-stakeholder supervision model with mutual checks and balances among the state, the platform and the citizen community can thus maintain the balance between openness and oversight, and establish a platform system that develops in a healthy and sustainable way.
For the future
Preventing the spread of bullying, harassment, violence, hate speech, pornography and other problematic content is the shared responsibility of platforms, governments and citizen communities. Platforms can strengthen their own sense of online responsibility and delegate part of their moderation power to citizen review communities. Governments can strengthen the rights of trade unions and establish public-value platform standards centered on collective interests. Together, platforms, governments and citizen communities can implement a multi-stakeholder supervision model of mutual oversight and checks and balances, which makes for better decisions about problematic content.
Reference list:
Barlow, J. P. (2021). A declaration of the independence of cyberspace. Commonplace. https://doi.org/10.21428/6ffd8432.ea8cd895
Castells, M. (2002). The culture of the internet. In The Internet Galaxy (pp. 36–63). Oxford University Press. http://dx.doi.org/10.1093/acprof:oso/9780199255771.003.0003
Gillespie, T. (2018). All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1–23). Yale University Press.
Levin, S., Wong, J. C., & Harding, L. (2016, September 9). Facebook backs down from “napalm girl” censorship and reinstates photo. The Guardian. https://www.theguardian.com/technology/2016/sep/09/facebook-reinstates-napalm-girl-photo
Roberts, S. T. (2019). Understanding Commercial Content Moderation. In Behind the Screen: Content Moderation in the Shadows of Social Media (pp. 33–72). Yale University Press.
TED Talks. (2020). Content moderators: The gatekeepers of social media [Video]. YouTube. https://www.youtube.com/watch?v=ajjov8Ve4Ik
Van Dijck, J., Poell, T., & De Waal, M. (2018). Governing a responsible platform society. In The Platform Society (pp. 137–162). Oxford University Press. http://dx.doi.org/10.1093/oso/9780190889760.003.0008