The regulation of digital platforms can easily be regarded as a conflict with, and a challenge to, individual freedom of speech. But in fact, debate and disagreement are themselves part of freedom of speech. Through debate, people from different backgrounds and with different understandings eventually come to grasp each other's opinions, which ultimately contributes to the flourishing of knowledge.
In some cases, people oppose the content moderation of digital platforms for an interesting reason: they believe it unilaterally hinders some people's right to freedom of speech and artificially designates "winners" and "losers" in the debate over a social media topic. As a result, the "winners" feel vindicated, while the "losers" grow resentful and continue to spread this negative emotion across the platform, eventually creating a vicious circle in platform discourse. Viewed from another angle, however, the platform bears even greater pressure than its users: it can never know what its users will post next. More often than not, users' activities on digital platforms are not guided or prompted by the platform but are spontaneous behaviour. To a great extent, one of the biggest challenges digital platforms face is establishing and enforcing a content moderation regime that can set comprehensive community standards without risking the engagement of platform users (Gillespie, 2018).
In China, platforms must bear most of the responsibility for supervising users' speech, a situation known as "speech supervision and guidance". But interestingly, if speech can be precisely guided, where does the right to freedom of speech remain? As a result, "People are getting tired of the periodic 'public shocks' on digital platforms, and feel that the company's own response to these shocks is mostly pretentious. This also reflects the so-called 'platformisation of the internet', as online life is increasingly mediated through the likes of Google and Facebook" (Terry, 2019).
A recent "public shock" occurred on October 14, 2021, when Mercedes-Benz published a video on Sina Weibo, a Chinese social media platform. It was an advertisement shot jointly with Vogue to promote the latest Mercedes-Benz C-Class. In the advertisement, the well-known talk show comedian Yang Li, surrounded by staff, sat in a Mercedes-Benz amid a red carpet and camera flashes. Just a few hours later, however, the Weibo comment section was overwhelmed by angry comments, and the relevant advertising content was soon deleted from all social media platforms. Previously, in the third season of a Chinese talk show competition, Yang Li had sparked controversy with a remark widely criticised as sexist: "Why are men so ordinary, yet so confident?" Yang Li won that round and gained the support of many female viewers. However, some users argued that her example was untenable and suspected her of deliberately belittling men simply to please an audience with a larger proportion of women and to harvest votes for promotion, rather than genuinely speaking up for women.
These days, more and more governments are trying to play a greater role in enforcing content moderation restrictions on social media platforms. On November 1, 2021, the new Personal Information Protection Law of the People's Republic of China came into effect, imposing more norms and constraints on the content supervision of digital platforms and far stricter rules on the use and protection of personal information. However, when well-known Chinese digital platforms implement these government regulatory requirements, they face a greater challenge than ever before in balancing the protection of their users' rights against "government instructions" (de Kloet, Poell, Guohua & Yiu Fai, 2019). In October 2021, the China Performance Industry Association announced on Weibo that Li Yundi, a famous Chinese pianist known as the "Prince of the Piano", had been administratively detained by the Beijing public security authorities, in accordance with the law, on suspicion of soliciting prostitution, and that the association would initiate its industry ethics self-discipline evaluation procedure against him under its regulations. The post contained Li Yundi's name, the time involved, and other case details.
According to the comments of the moral self-discipline committee of the China Performance Industry Association, the association morally reprimanded performer Li Yundi for his illegal acts and required its subordinate units to boycott him in accordance with the Administrative Measures for the Self-Discipline of Performers in the Performance Industry. This was not the first time the association had issued a "moral reprimand" against an artist on social media: as early as August of the same year, it had reprimanded actor Zhang Zhehan for the improper behaviour of visiting the Yasukuni Shrine.
According to a news report by China Philanthropist, a trade association's "moral reprimand" belongs to the association's internal regulations: when applying to join, a member indicates willingness to be bound by these regulations, which have nothing to do with the law. In other words, a moral reprimand is an act of public condemnation on social media intended to display the industry's attitude. It is worth pondering, however, that such announcements made by industry associations on social media amount to deliberate social stigmatisation, which not only violates the law but also seriously infringes citizens' right to privacy, while the platform turns a blind eye. Because of the social stigma attached to prostitution, public opinion itself can carry greater consequences than the legal punishment. The "moral reprimand" of trade associations is not in line with China's constitution and damages citizens' basic rights. Under Chinese law, soliciting prostitution is a matter of personal privacy: it is understandable for the public security organs to punish the parties according to the relevant laws, but neither individuals nor industry associations have the right to publish such privacy-related information on social media platforms.
In this case, what needs to be discussed is not only how a social organisation could obtain citizens' personal information that should be protected by law, but also how digital platforms can implement their content moderation function when the judicial organs deliberately ignore such violations.
To summarise, digital platforms are being compelled by governments to enforce greater content moderation restrictions, and they are carrying out genuinely effective content regulation. However, the question remains: can content regulation still be fair and effective under government guidance and monitoring? Under government supervision, digital platforms stifle users' freedom of speech while succumbing to specific authoritarian organisations; they turn a blind eye to obviously illegal content published on their platforms and refuse to play their due role in protecting users' personal information. Governments and authoritarian organisations are turning content regulation into a tool for controlling public opinion, and the platforms can do nothing about it. Sadly, at the current stage, it can be seen that governments have played a highly negative role in enforcing content moderation restrictions on social media platforms.