
"website is down" by Sean MacEntee is licensed under CC BY 2.0 .

With the development of Internet technology, the diversity of digital platforms has grown considerably, and more and more platforms are penetrating new areas of life with the help of technology. According to the Australian Competition and Consumer Commission (2019), almost half of the population uses digital platforms to obtain information. In particular, Internet giants such as Google and Facebook have virtually monopolized users' attention. Digital platforms have broken the boundaries of time and space, providing users with a free space to share information. However, freedom of expression on digital platforms does not mean only rich and positive information sharing. In the absence of mandatory regulations and adequate supervision, it is accompanied by the proliferation of problematic content such as violence, pornography, bullying, hate, and harassment, which constantly undermines the healthy development of the Internet. Faced with this situation, there is growing concern about how to curb inappropriate content effectively. As providers of content distribution services, digital platforms can hardly avoid responsibility for regulating the content they carry, despite their commercial nature. However, that same commercial nature, together with the diversity of the information involved, makes it impractical to rely solely on their self-regulation. In addition to platform self-regulation, governments should enact better regulations to govern the online behavior of platforms and users, and mandatory government enforcement can more effectively prevent problematic content from developing into further crime. Therefore, although digital platforms have an inescapable responsibility to regulate the content they disseminate, stopping the spread of undesirable information requires the cooperative efforts of digital platforms and the government.
Can digital platforms escape their responsibility to regulate the content they distribute?
As media providing content distribution services, digital platforms can hardly escape responsibility for regulating the content on their platforms. Although they give users the convenience of interacting with a wide range of people, harmful information such as violence, hate, obscenity, and abuse comes along with the utopia of free expression (Burgess, Poell & Marwick, 2017). Such content exploits the digital intimacy that platforms foster, and the mass analysis of user data, to inflict varying degrees of harm on platform users. According to Carah (2021), digital platforms provide users with new ways of socializing that increase digital intimacy between users, but this also increases the risk of privacy exposure. Privacy exposure, combined with inadequate platform privacy rules, raises the risk of online violence and harmful information and leaves users unable to rely on established norms to protect themselves (Gillespie, 2018). At the same time, when used negligently, platforms' preference data create more permissive conditions for the recommendation and spread of undesirable information. Faced with this situation, digital platforms need to harness positive technological developments to strengthen self-regulation. By improving privacy protection rules, platforms can reduce the risk of users being exhaustively profiled; by upgrading their algorithms and backend review standards, they can more effectively stop the spread of undesirable information. It is important to note that digital platforms, as commercial communication media, have their own competitive needs in the market. Moreover, the psychological trauma of backend moderators who face the dark side of the Internet alone deserves greater attention and protection (Newton, 2019). As a result, despite the unresolved issues of self-regulation, digital platforms need to take responsibility for the information they disseminate and act in an organized manner.
“How Facebook Feeds Your Outrage | Real Time with Bill Maher (HBO)” by Real Time with Bill Maher. All rights reserved. Retrieved from https://www.youtube.com/watch?v=cSj2SmQqNc8
Is self-regulation of digital platforms feasible? Regulating problematic content requires the help of the government.

However, the self-regulation of digital platforms needs to be supported and constrained by better government policies. Because of their commercial nature, the effectiveness of platforms' self-regulation is questionable, and government support is needed to stop the dissemination of problematic content. Self-regulation relies heavily on the collection of users' preference data and on advances in algorithms, yet these same tools are used to attract users and advertisers for greater commercial gain. In the absence of government regulation, users cannot be sure whether their data are being used by platforms to compete in the marketplace. Flew, Martin, and Suzor (2019) highlight political events in which digital platforms engaged in the dissemination of disinformation and the exploitation of user preference data for reasons such as advertising profit. Furthermore, algorithmic upgrades alone can hardly stop the spread of problematic content. On the one hand, the algorithms' screening criteria are relatively narrow, and it is difficult for platforms to block all risks through a single screening criterion (Flew, Martin & Suzor, 2019). On the other hand, platform algorithms have their own orientation: stimulating controlled and effective engagement through algorithmic changes has helped platforms consolidate their user base. Thus, self-regulation of digital platforms requires effective government policies on the protection of users' private data to regulate platform behavior. Furthermore, by providing space for multiple groups to interact, digital platforms have become one of the dominant public spheres of the day. Given the influence of freedom of expression and power relations in this domain, blocking problematic content must take into account the political elements of platform information and the influence of Internet geopolitics (Schlesinger, 2020). The two-way nature of rapidly developing Internet technology must also be considered: problematic content does not merely spread harmful information but can also breed new types of crime. For example, the scale and criminal intent of the Nth Room case in South Korea could not have developed in a short period; tracing the suspects' past Internet records shows that their plans to exploit technical loopholes and platform rules grew alongside the advance of Internet technology and the lag of regulation. Governments have more effective coercive enforcement power and political authority than platforms. While platform self-regulation involves political considerations and is limited by privacy rules, government enforcement and regulation can more effectively stop the spread of problematic content and curb the germination of new crime. Therefore, better regulations and government agencies are required to stop the spread of objectionable content on digital platforms.
Content regulation on digital platforms requires the joint efforts of the platforms themselves and the government

In conclusion, stopping problematic content on digital platforms requires not only the platforms' self-regulation but also more complete regulations to restrain the behavior of all parties. On the one hand, digital platforms provide content distribution services and bear an inescapable responsibility for regulating platform content. The development of Internet technology has given users an increasingly sophisticated space in which to interact, and users' freedom of expression and information-sharing technologies have produced the current diversity of information on the Internet. While this diversity attracts more users to the platforms, it also puts pressure on content regulation. Shaped by data algorithms and the digital intimacy generated by platform interactions, problematic content continues to breed and disrupt normal online communication. Digital intimacy breaks through the limits of time and space, offering users greater social space but also creating the potential for online violence. In this situation, digital platforms need stricter privacy rules to safeguard user experience and information security, and their backend manual review standards need to be further strengthened. However, the commercial nature of digital platforms means their self-regulation cannot completely escape market influence: users can hardly prevent platforms from leveraging collected preference data and algorithmic changes to compete in the marketplace. As a result, governments are required to put better policies in place to protect users and regulate the behavior of platforms and users alike. Furthermore, as a principal public sphere of expression, the problematic content of digital platforms is to a certain extent political and has the potential to develop into new types of crime. Preventing these outcomes requires coercive government enforcement. Therefore, both digital platforms and governments should act to stop the dissemination of inappropriate content. Platforms need to use advances in algorithms and platform codes to limit the spread of objectionable information, while governments need to improve regulations that protect the legitimate interests of platforms and users and curb the development of problematic content into ethical problems and new types of crime.
References
Burgess, J., Poell, T., & Marwick, A. E. (Eds.). (2017). The SAGE handbook of social media. SAGE Publications.
Newton, C. (2019). The trauma floor: The secret lives of Facebook moderators in America. The Verge. Retrieved October 13, 2022.
Australian Competition and Consumer Commission. (2019). Digital platforms inquiry: Final report.
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50.
Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
Real Time with Bill Maher. (2021). How Facebook feeds your outrage [Video]. YouTube. Retrieved October 13, 2022, from https://www.youtube.com/watch?v=cSj2SmQqNc8
Carah, N. (2021). Media and society: Power, platforms, and participation. SAGE Publications.
Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7-8), 1545-1563.
The Nth Room case: The making of a monster [Documentary on online sex crime in Korea]. (2021). [Video]. YouTube. Retrieved October 13, 2022.