What is a platform? Users can express their opinions on a platform without the constraints of time and space, as if it were a small, self-contained community. But in this mini-society, are users regulated? Is content about bullying, pornography, crime, and similar harms moderated, and who moderates it? Are users still subject to the law? Facebook and other such apps are not only technology or media but also a community of people (Zuckerberg, 2017). Users who think these spaces are private and unregulated are mistaken. Platforms need community standards (Zuckerberg, 2017); new issues constantly emerge, and platform standards should evolve with the times. Yet such platforms are private companies, and the power to set the boundaries of public discourse is in the hands of a few stakeholders with huge investments (Gillespie, 2018). When platform moderation goes wrong, users condemn the intrusion of the platform; yet when problematic content such as bullying, harassment, violence, hate, or pornography appears on the platform, people feel the platform is absent (Gillespie, 2018). In "A Blueprint for Content Governance and Enforcement", Facebook's CEO writes that Facebook should create an independent "Oversight Body", update its internal guidelines, reduce subjectivity, and ensure that reviewers make consistent decisions (Zuckerberg, 2018, n.p.).

Who will stop the spread of content that needs to be moderated? How do they moderate or intervene? The platforms themselves?
There are many ways in which social media companies can improve the review of content on their platforms. In terms of human resources, they can increase the number of reviewers, improve community standards and systems, and be more transparent; on the software side, they can add better detection tools to catch bad behavior within the community (Gillespie, 2018). Through big data analysis, content that is taken down on one platform, such as Twitter, could also be flagged to other platforms, so that users' physical and mental health is protected across platforms. Content review by platforms is a relatively straightforward form of moderation; indeed, moderation is, in essence, part of the commodity platforms offer, and all platforms must moderate (Gillespie, 2018).
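As a rough sketch of that cross-platform idea, one simple mechanism (assumed here purely for illustration, not how any named platform actually implements it) is for platforms to share hashes of content they have already removed, so that peers can check new uploads against the shared list without exchanging the content itself.

```python
# Illustrative sketch of cross-platform sharing of removed content, assuming a
# shared list of content hashes (hypothetical; real industry hash-sharing
# programmes are far more elaborate than this).
import hashlib

shared_blocklist = set()  # fingerprints contributed by participating platforms

def fingerprint(content: bytes) -> str:
    """Hash the raw content so it can be matched without sharing the content itself."""
    return hashlib.sha256(content).hexdigest()

def report_removed(content: bytes) -> None:
    """A platform that removes content adds its fingerprint to the shared list."""
    shared_blocklist.add(fingerprint(content))

def already_flagged(content: bytes) -> bool:
    """Another platform checks an upload against fingerprints shared by its peers."""
    return fingerprint(content) in shared_blocklist

# Platform A removes an abusive image and shares its fingerprint;
# platform B can then catch an identical re-upload.
report_removed(b"<bytes of removed image>")
print(already_flagged(b"<bytes of removed image>"))  # True
print(already_flagged(b"<different content>"))       # False
```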
About child sexual exploitation, abuse, and nudity. There is a great deal of controversy about the extent to which nudity and pornography are policed by platforms. A well-known example is "The Terror of War", also known as the "Napalm Girl" photograph, taken by Nick Ut in 1972: Facebook removed the image under its nudity rules because it shows a naked child, and reinstated it only after public outcry, showing how rigid rules can misfire on content of clear historical and news value.
About adult sexual activities. For example, when my friend posted pictures of himself on TikTok, he was attacked with veiled pornographic comments from the opposite sex, and as a user he felt powerless in the face of them. Because the commenters never used the offending words, the system could not detect the comments, yet they were genuinely disturbing to the poster; their framing implied that he was inviting such attention simply because he had filmed and uploaded the content. There is good reason to wonder how these content detectors actually work. The "Adult nudity and sexual activities" section of TikTok's Community Guidelines does state: "Do not post, upload, stream, or share content that contains sexually explicit language for sexual gratification."
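To make concrete why such comments can slip through, the sketch below assumes a naive keyword filter of the kind implied above; the phrase list and example comments are invented for illustration. Veiled or obfuscated wording contains none of the listed terms, so exact matching returns no hit even though a human reader recognises the intent immediately.

```python
# Sketch of why exact keyword matching misses veiled or obfuscated wording
# (the phrase list and example comments are invented for illustration; real
# moderation systems rely on trained classifiers, not just word lists).

BANNED_PHRASES = {"sexually explicit phrase"}  # placeholder entries

def contains_banned_phrase(comment: str) -> bool:
    """Naive check: flag the comment only if a listed phrase appears verbatim."""
    text = comment.lower()
    return any(phrase in text for phrase in BANNED_PHRASES)

print(contains_banned_phrase("this is a sexually explicit phrase"))    # True: literal match
print(contains_banned_phrase("you know exactly what i want from you")) # False: veiled wording
print(contains_banned_phrase("so s3xy, dm me"))                        # False: obfuscated spelling
```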

About misinformation. We can often find false information on social platforms, and instead of moderating it, platforms use big data analysis to push the posts that attract the most interest and discussion from users. These platforms never promise to provide only reliable information, and they have no legal obligation to detect and eliminate such disinformation and fraud (Gillespie, 2018). But they do bear responsibility for the harm of fake news and can intervene; users become increasingly caustic in their use of the platforms, and the longer they use them the more problems they discover. Content moderation is a basic condition of the service platforms provide, making them responsible for at least part of the public discourse and for the purposes it serves (Gillespie, 2018). Platform guidelines are quite clear about what needs to be banned; on this point, Facebook's Misinformation guidelines include specific instructions for misinformation about vaccines. The legitimacy of the platform's control over content, transparency to the public, and uniform standards for content review are what matter most for now.
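To illustrate the dynamic described above, here is a minimal, hypothetical sketch of engagement-based ranking: when posts are ordered purely by how much interest and discussion they attract, a sensational false post can outrank a carefully sourced one. The scoring weights and post data are invented for illustration and do not describe any platform's actual algorithm.

```python
# Hypothetical sketch of engagement-based feed ranking: posts are ordered purely
# by reactions, comments, and shares, with no weighting for reliability.
# All numbers and weights below are invented for illustration.

posts = [
    {"text": "Carefully sourced report", "reactions": 120, "comments": 10, "shares": 5},
    {"text": "Sensational false claim", "reactions": 900, "comments": 400, "shares": 800},
]

def engagement_score(post: dict) -> int:
    """Score a post by engagement alone (the weights are arbitrary placeholders)."""
    return post["reactions"] + 2 * post["comments"] + 3 * post["shares"]

# Sorting by engagement alone pushes the sensational false claim to the top of the feed.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["text"])
```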
How can platform content management be supported? Government and laws?

Since Gillespie notes that platforms have no legal obligation to detect and eliminate false information or fraud, shouldn't governments update the relevant legal provisions for the internet as these problems emerge? With regard to sex trafficking and the grooming and sexual exploitation of minors, both must ultimately be prohibited by law, and in the case of the United States, US law imposes a clear and specific obligation on content and social media platforms to immediately report suspicious content and users to the National Center for Missing and Exploited Children (Gillespie, 2018). With three antitrust lawsuits against Google and Facebook in the US alone in 2020, and the spread of disinformation becoming commonplace on digital platforms (Flew, 2022), national governments need to enact laws for the relevant internet issues, and moderate intervention becomes important. At the same time, however, digital platform markets pose unique challenges for policymakers, not only because the global span of platforms makes policy oversight more complex, but also because any one country's regulation of a platform involves global stakeholders (Flew, 2022). Platform content management should be underpinned by laws and community regimes: crime on the internet is no less serious than crime in real life and should be treated equally.
Conclusion
It is difficult for both platforms and governments to judge what is right and wrong on the internet. Very often the line between intervention and non-intervention is blurred, and there is no specific standard. Platform systems are constantly improving and need to keep pace with the times.
References:
Child sexual exploitation, abuse and nudity | Transparency Centre. Transparency.fb.com. (2022). Retrieved 14 October 2022, from https://transparency.fb.com/policies/community-standards/child-sexual-exploitation-abuse-nudity/.
Flew, T., & Martin, F. R. (2022). Digital Platform Regulation: Global Perspectives on Internet Governance. Palgrave Macmillan.
Gillespie, T. (2018). CHAPTER 1. All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). New Haven: Yale University Press. https://doi-org.ezproxy.library.sydney.edu.au/10.12987/9780300235029-001
Gillespie, T. (2018). CHAPTER 8. What Platforms Are, and What They Should Be. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 197-214). New Haven: Yale University Press. https://doi-org.ezproxy.library.sydney.edu.au/10.12987/9780300235029-008
Gillespie, T. (2018). CHAPTER 3. Community Guidelines, or the Sound of No. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 45-73). New Haven: Yale University Press. https://doi-org.ezproxy.library.sydney.edu.au/10.12987/9780300235029-003
Misinformation | Transparency Centre. Transparency.fb.com. (2022). Retrieved 14 October 2022, from https://transparency.fb.com/policies/community-standards/misinformation/.
Zuckerberg, M. (2018). A Blueprint for Content Governance and Enforcement. Facebook. Retrieved 14 October 2022, from https://m.facebook.com/notes/mark-zuckerberg/a-blueprint-for-content-governance-and-enforcement/10156443129621634/.