Preventive policies for traditional digital “platforms”
In American information policy, the term “platform” has not gained much traction. Initially, “online intermediaries” better captured the concept: these services act as middlemen between users who create content and users who want it, which makes them similar to both traditional newspapers and search engines (Burgess et al., 2017). This intermediary position also allows platforms to audit content and prevent the spread of pornography, violence, and other illegal material in accordance with legal requirements. Content audit is a broad and complex social and technological phenomenon: by selecting and filtering acceptable content according to regulations, legal requirements, and cultural norms, it shapes the information exchange process and user behavior (Flew, Martin, & Suzor, 2019). In this way, a platform’s developers can deliver the open expression they promised and offer users a social opportunity to talk, communicate, and interact with a wider range of individuals (Burgess et al., 2017). Under the content supervision of traditional digital platforms, therefore, violence, pornography, and other harmful content can be properly addressed.
Modern digital platforms
Various sorts of platforms have steadily emerged with the expansion of media networks, but typical private information providers can be divided into four categories: publishers, broadcasters, retailers, and telecommunications (Burgess et al., 2017). These media platforms, as broadcasters and retailers, stand not only between users and other users, and between users and the public, but also between citizens and law enforcement, policy makers, and regulators. At the same time, they must coordinate and supervise contact among various groups and individuals. With the development of Facebook, YouTube, and Twitter, users now have more convenient and unrestricted ways to share information, while also benefiting from some services’ anonymity, encryption, and ephemeral connections. As a result, illegal content spreads easily across regional jurisdictions and has indirect or cumulative effects. In the mid-1990s, the United States shifted its targets from individual users to the Internet service providers that spread content for these unidentified “publishers” (Burgess et al., 2017). However, because social media is low-cost, far-reaching, and fast, any mishandling can have a negative social impact. Moreover, the rise of short-video platforms led by TikTok has repeatedly pushed the issue of platform supervision back into public debate.
TikTok and short video
People’s political and cultural activities have become increasingly integrated into a global ecosystem built on digital platforms as digitization spreads through society (Zeng & Kaye, 2022). People are willing to spend their spare time on political and cultural exchanges over the Internet. The short-video format promoted by TikTok can be picked up and put down at any moment, fitting an era in which citizens’ time is fragmented. In September 2021, TikTok reached 1 billion monthly active users worldwide (TikTok, 2021b). In addition, a 2021 TikTok usage report shows that 60% of young Americans (18–24 years old) use the platform every day (TikTok, 2021a). The younger generation has strong capacities for imitation and learning, as well as for self-shaping and role-shaping; young people socialize swiftly by adopting and assimilating various cultural elements (Erikson, 1968). During their development, premature or excessive exposure to violence, pornography, and other harmful content is likely to harm them psychologically or encourage them to imitate bad behavior. For example, in 2020 a video depicting a suicide spread rapidly on TikTok, causing serious social harm. Interviews also show that, for many respondents, a viral short video means sudden contact with new audiences who are more likely to leave abusive comments, and such comments multiply at the peak of a video’s circulation (Zeng & Kaye, 2022). Among countries with relatively successful short-video supervision, China, where TikTok’s parent company is based, offers particularly rich experience.
The Development and Supervision of Digital Media in China
As the boundary between platforms and public infrastructure becomes more and more blurred, platforms have increasingly become part of the daily life of Chinese citizens (de Kloet, Poell, Guohua, & Yiu Fai, 2019). By November 2016, China had become one of the largest Internet marketplaces in the world, with 710 million Internet users, ranking first globally and exceeding the combined user bases of India and the United States. Platformization is not a unified process but develops gradually along three dimensions: infrastructure, governance, and practice. In China, government policy is closely tied to technology, and the government’s commitment to infrastructure construction has been the main driver of platform development. The explosive growth of mobile Internet use and mobile phone ownership has accelerated the general process of urbanization, and Internet connectivity has significantly reshaped local culture, social organizations, and broader social change in rural areas (Ji, 2021).

Although the United States and Europe have experienced similar developments, the speed and intensity of this process are greater in China. As more people flock to platforms, the boundary between sender and receiver blurs: a sender can also be a receiver, which changes the economics of scale. As production increases, average cost keeps falling, and big data matches information from the appropriate senders to the corresponding receivers. This business model has created many jobs in China and promoted commercial development; for example, farmers have introduced their fruit to urban residents through short videos on TikTok, expanding agricultural sales channels (Ji, 2021). However, such convenient publicity also gives industries engaged in underground pornography and violence a channel to reach Internet users.
To prevent such problematic content from spreading on China’s Internet media, the Chinese government mainly restricts the dissemination of harmful information in two ways.
Restricted Access and Collective Sanctions
First, the Chinese government uses a “Restricted access” mechanism that requires moderators to censor, sanction, or ban inappropriate users (Wachhaus, 2017). This is the first line of defense in a social network: it limits who can enter the network and bans the accounts of offending users. Such a strict ban system can quickly deal with problematic content, but it cannot fully restrain the individual behind the account. Because this method requires fast and efficient content auditing, many Chinese applications maintain reputation point systems for users, rewarding high-reputation users and quickly filtering out low-reputation ones.
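The reputation-point mechanism described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not any platform’s actual system; the class name, starting score, penalty, and ban threshold are all illustrative assumptions.

```python
# Hypothetical sketch of a reputation-based "restricted access" filter.
# All names and thresholds here are illustrative assumptions, not a real system.

class ReputationModerator:
    def __init__(self, start_score=100, ban_threshold=0):
        self.start_score = start_score      # reputation granted to new accounts
        self.ban_threshold = ban_threshold  # score at or below which an account is banned
        self.scores = {}
        self.banned = set()

    def register(self, user):
        # New users start with the default reputation score.
        self.scores.setdefault(user, self.start_score)

    def report_violation(self, user, penalty=40):
        # Each confirmed violation lowers the user's reputation score;
        # crossing the threshold triggers an account-level ban.
        self.register(user)
        self.scores[user] -= penalty
        if self.scores[user] <= self.ban_threshold:
            self.banned.add(user)

    def can_post(self, user):
        # "Restricted access": only unbanned accounts may enter the network.
        self.register(user)
        return user not in self.banned
```

In this toy model, three confirmed violations exhaust a new account’s reputation and ban it, while untouched accounts keep posting, which mirrors the idea of quickly filtering out low-reputation users without reviewing every post.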
Such rapid account bans are possible on the Chinese Internet partly because of the “Collective sanctions” mechanism: protecting contact and communication by establishing norms of appropriate behavior and excluding members who fail to adhere to them (Wachhaus, 2017). In China, the government does not want problematic content from other countries to enter Chinese social media, so it has built an Internet “wall” between the country and the outside world. Under this paradigm, the capacity of an individual or group to carry out all necessary regulatory tasks without intervention from an external authority is referred to as “self-governance” (Esmark & Triantafillou, 2016). Meanwhile, China has cultivated a domestic macro-culture while maintaining network autonomy: almost all users speak Chinese and share the same cultural system. Citizens can often agree on shared values, goals, and directions for solving problems, which enables ordinary people to participate in the content moderation system and become an integral part of it.
For Western countries, which emphasize the social values of “freedom and democracy”, it is difficult to establish a Chinese-style “self-governance” system that ensures unity of language and ideology on social media. How to use “Restricted access” and “Collective sanctions” to protect the network environment, or to build a citizens’ macro-culture while accommodating diverse ideas and cultures, is therefore a key step toward limiting or preventing the spread of problematic content.
Burgess, J., Marwick, A., & Poell, T. (2017). The SAGE Handbook of Social Media (pp. 254–280). SAGE Publications.
de Kloet, J., Poell, T., Guohua, Z., & Yiu Fai, C. (2019). The platformization of Chinese Society: Infrastructure, governance, and practice. Chinese Journal of Communication, 12(3), 249–256. https://doi.org/10.1080/17544750.2019.1644008
Erikson, E. H. (1968). Identity: Youth and crisis (pp. 117–118). New York: W.W. Norton.
Esmark, A., & Triantafillou, P. (2016). A macro level perspective on governance of the self and others. In The Politics of Self-Governance (pp. 25–41). Routledge. https://doi.org/10.4324/9781315554259-2
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1
Ji, G. (2021). Kuaishou short video social network from the perspective of urban-rural cultural linkage: A field study on the mobile internet practices of young Monguor villagers in China’s Qinghai Province. International Journal of Anthropology and Ethnology, 5(1). https://doi.org/10.1186/s41257-021-00049-2
Mansell, R., & Steinmueller, W. E. (2020). Economic analysis of platforms. In Advanced Introduction to Platform Economics (pp. 35–54). Edward Elgar Publishing.
TikTok. (2021a, February 24). TikTok transparency report. TikTok Safety. https://www.tiktok.com/safety/resources/transparency-report-2020-2?lang=en&appLaunch=
TikTok. (2021b, September 27). Thanks a billion! TikTok Newsroom.
Wachhaus, T. A. (2017). Network governance as a mechanism for responding to internet violence. International Journal of Public Administration, 41(11), 888–898. https://doi.org/10.1080/01900692.2017.1300914
Zeng, J., & Kaye, D. B. V. (2022). From content moderation to visibility moderation: A case study of platform governance on TikTok. Policy & Internet, 14(1), 79–95. https://doi.org/10.1002/poi3.287