
Introduction
With the development of technology, the Internet has become an indispensable part of people’s lives, and the rise of digital platforms has changed many aspects of how we live. Among these platforms, social media has brought the most significant changes. Social media platforms give users the opportunity to communicate and interact with a wider range of people and to organize them into an online public (Gillespie, 2018). Users can post about topics that interest them or comment on content posted by other users. Over time, however, the dangers of these platforms have been exposed: pornographic, obscene and violent content appears not just on one social media platform but on almost every digital platform. In this situation, governments, social media platforms and online users all share a responsibility to stop the spread of such content.

Harmful Content on Social Media Platforms
Digital platforms such as Google and Facebook have grown dramatically in usefulness and importance over the past decade and are now an integral part of most Australians’ lives (ACCC, 2019). At the same time, problematic content has proliferated online. One of the most common forms is hate speech and trolling: when browsing social media platforms, it is easy to find hateful comments posted beneath bloggers’ content, often by users who attack what is posted for no apparent reason. A second form is harassment, which frequently appears in the comment sections of female bloggers. On Instagram or Facebook, for example, women who post content are routinely met with abuse, including insults such as ‘bitch’ and ‘whore’. A third form is fraudulent content. Digital platforms offer many conveniences to businesses and consumers, but they also create opportunities for deception: consumers may receive goods that do not match the product images displayed, and social media carries many scam messages disguised as sweepstakes, which prey on people’s curiosity and lure them into clicking links and entering personal information. Beyond these three cases, questionable content of many other kinds circulates on digital platforms. Managing this situation requires a joint effort by governments, platforms and users.
Platform Governance Measures
When questionable content appears on social media platforms, the platforms themselves should bear the primary responsibility for stopping its spread. Platforms play a powerful intermediary role in regulating the flow of digital information, and digital companies should have a seat at the table when it comes to Internet regulation (Popiel & Sang, 2021). Platform governance has two focuses. The first is the management of platform users and the regulation of their behavior. When fraud, pornography, violence or other questionable content appears, platforms first deal with the users who post it, typically by penalising them, for example by suspending or forcibly cancelling their accounts. Beyond penalties imposed after content is published, platforms should also take appropriate measures at the registration stage. Many sites require users to provide information about their true identity when they register; Facebook, for example, has attempted to enforce a real-name policy to improve security and accountability. The second focus is reviewing the content on the platform. Those who manage social media platforms may use a combination of three primary methods of review: artificial intelligence-driven algorithms; paid, trained human content moderators; and unpaid peer users (Reuber & Fischer, 2022). Each method has its own benefits and drawbacks, and platform managers should not rely on any single one; as a platform grows and its content increases, combining all three allows content to be reviewed more effectively.
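To make this combination concrete, the following minimal Python sketch shows one way the three review methods might fit together in a single triage path: an algorithm acts alone on high-confidence cases, uncertain or user-reported posts are queued for paid human moderators, and everything else is published. All names, thresholds and the keyword-based “classifier” are hypothetical illustrations, not any platform’s actual system.

from dataclasses import dataclass
from collections import deque

@dataclass
class Post:
    post_id: int
    text: str
    user_reports: int = 0  # flags from unpaid peer users

def score_content(post: Post) -> float:
    """Stand-in for an AI classifier estimating the chance a post violates policy."""
    banned_terms = {"scam-link", "slur"}  # a real system would use a trained model
    hits = sum(term in post.text.lower() for term in banned_terms)
    return min(1.0, 0.5 * hits + 0.1 * post.user_reports)

def triage(post: Post, review_queue: deque) -> str:
    """Route a post: auto-remove, send to human moderators, or publish."""
    risk = score_content(post)
    if risk >= 0.9:  # high confidence: the algorithm acts on its own
        return "removed"
    if risk >= 0.4 or post.user_reports >= 3:  # uncertain or user-flagged
        review_queue.append(post)  # paid human moderators make the final call
        return "queued_for_review"
    return "published"

queue: deque = deque()
print(triage(Post(1, "Click this scam-link to claim your prize!"), queue))  # queued_for_review
print(triage(Post(2, "Lovely weather in Sydney today."), queue))            # published

The design point the sketch illustrates is the division of labour the paragraph describes: automation handles scale, human moderators handle ambiguity, and peer users supply signals the algorithm would otherwise miss.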
Government Governance Measures
In addition to the platforms’ own regulation, governments also have a responsibility to stop the spread of questionable content. Scholars recommend that governments mandate a specific national regulator able to address the issues arising from digital platforms in a comprehensive manner, and various government agencies have already acted in response to the proliferation of harmful content on social media. In the EU, “co-regulation”, “self-regulation” and “soft law” solutions have long been used to address policy issues involving companies, and the Internet is no exception. In 2010, for example, the Netherlands, the UK, Germany, Belgium and Spain sponsored a European Commission project called “Clean IT”, which set out to develop “general principles and best practices” to combat the spread of terrorist content and other illegal uses of the Internet (Gorwa, 2019). Beyond the EU, the Chinese government also has measures in place to manage digital platforms. The Chinese model of governing digital transformation is one of mass surveillance and full state control: the platforms are not under democratic control, and there is close cooperation between the state and the platforms (Schneider, 2020). Comparing the EU and China shows that countries differ not only in the measures they use to regulate digital platforms but also in the degree of government involvement in that regulation.
Online User Governance Measures
In addition to the efforts of governments and platforms, online users also have a responsibility to stop questionable content from being distributed on digital platforms. Users have far less direct power to manage digital platforms than governments or the platforms themselves, but they can take it upon themselves to build a healthier platform environment. First, users should check whether the content they post complies with the platform’s rules. Second, when users see content that conflicts with their own views, they may debate it, since digital platforms are a place for free speech, but they should be careful about what they say when arguing. Finally, when users encounter abusive or hateful content, they can report it and let the platform penalise the user who posted it, rather than hitting back with abusive comments of their own; a minimal sketch of such a report flow follows this paragraph. Overall, while online users cannot directly manage digital platforms, they can work together, each from their own position, to create a better platform environment.
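As an illustration of the report-then-escalate path described above, this self-contained sketch logs user reports and escalates a post to human moderators once an assumed threshold is reached. The function name, threshold and data structure are hypothetical, not any platform’s documented API.

REPORT_THRESHOLD = 3  # assumed number of reports that triggers human review

def handle_report(report_counts: dict, post_id: int) -> str:
    """Record one user report and decide whether the post needs human review."""
    report_counts[post_id] = report_counts.get(post_id, 0) + 1
    if report_counts[post_id] >= REPORT_THRESHOLD:
        return "escalated_to_moderators"
    return "report_logged"

counts: dict = {}
for _ in range(3):
    status = handle_report(counts, post_id=42)
print(status)  # 'escalated_to_moderators' after the third report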

Conclusion
In summary, as more and more people use digital platforms, the dangers of those platforms have been exposed: fraud, violent content, hate speech, pornography and other questionable content appear everywhere. If such content continues to circulate on digital platforms, it could harm the development of young people or incite hatred in others, with potentially serious consequences. To create a healthy Internet environment, governments, digital platforms and online users should work together, each contributing their share.
Reference List
Australian Competition and Consumer Commission. (2019). Digital platforms inquiry: Executive summary. https://www.accc.gov.au/publications/digital-platforms-inquiry-executive-summary
Gillespie, T. (2018). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE handbook of social media (pp. 254–278). SAGE Publications.
Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407
Popiel, P., & Sang, Y. (2021). Platforms’ governance: Analyzing digital platforms’ policy preferences. Global Perspectives, 2(1). https://doi.org/10.1525/gp.2021.19094
Reuber, A. R., & Fischer, E. (2022). Relying on the engagement of others: A review of the governance choices facing social media platform start-ups. International Small Business Journal, 40(1), 3–22. https://doi.org/10.1177/02662426211050509
Schneider, I. (2020). Democratic governance of digital platforms and artificial intelligence?: Exploring governance models of China, the US, the EU and Mexico. eJournal of eDemocracy and Open Government, 12(1), 1–24. https://doi.org/10.29379/jedem.v12i1.604