Bullying, harassment, violent content, hate speech, pornography, and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

"Cyberbullying" by Infosec Images is licensed under CC BY 2.0. To view a copy of this license, visit https://search-production.openverse.engineering/image/3f4ea11c-0615-4658-a957-8fa1989cf4f2

Conflicts between early Internet concepts and regulation

Silicon Valley’s original vision for the Internet was a liberal world of “spontaneous order” (Gillespie, 2018). As the Internet business has shifted steadily toward platforms, content regulation has become a hot topic among the public. Social media platforms provide a zero-distance network space, allowing users to conduct social activities online without ever meeting (Gillespie, 2018). But although the Internet seems to be striving for a utopian cyberspace, too much freedom and too little restraint also bring risks: pornographic, obscene, and violent content, along with outright illegal activity, quietly breeds on Internet platforms.

The importance of regulatory intervention on content

For users, an Internet platform is the equivalent of a virtual society: users publish information to the platform and receive information from it. If material containing pornography, violence, and harassment is not filtered and deleted, it can harm users’ mental health. Pornographic content may distort teenagers’ developing understanding of healthy sexuality; premature exposure to sexual content, for example, has been linked to early pregnancy among young girls. Users who browse negative content for long periods are also more likely to suffer from depression or worse. Women and ethnic minorities are constantly harassed on social platforms; some users, for instance, obtain nude photos of women or minors through illegal means and spread them wantonly across platforms (Massanari, 2017). The mobility and reach of the Internet accelerate the spread of this toxic content, and cyberbullying cases have become more and more frequent: Australia’s Children’s eSafety Commissioner has investigated about 11,000 cyberbullying cases (Quodling, 2016). The Internet, however, is not a lawless place. Governments and platforms should supervise and intervene in this content, which also establishes a clear norm for society and for platform users: publishing bullying, harassment, violence, hate, pornography, and similar material online violates both the law and social ethics.

“Social Media Influence” by Intersection Digital is licensed under CC BY-NC 2.0.

Who should be responsible for this phenomenon

  1. The government should introduce relevant laws and policies for digital platforms and intervene appropriately.
  2. Platforms should be responsible for reviewing the content users publish and for screening, deleting, or hiding it as needed.
  3. Platform users should improve their awareness of online safety and promptly report illegal content they see to the platform.

The government has a responsibility and an obligation to maintain Internet safety. When a government issues relevant laws or regulations, it can effectively push platforms to strengthen content supervision. Internet content regulation is becoming more and more common internationally, but governments apply different standards and laws to content review and network security. Digital platforms must comply not only with the laws of their home country but also with the laws of other countries where they hold substantial commercial interests and assets. This has produced gaps in the moderation effort and resources that platform companies devote to different countries. Germany’s Network Enforcement Act (2017) and the European Commission’s Code of Conduct on Countering Illegal Hate Speech Online (2016) have both come into force, and Facebook accordingly employs a large team of content moderators to manually review content published by European users. As of January 2018, an estimated 16% of Facebook’s global content moderators were located in Germany, a country that accounts for only about 1.5% of Facebook’s global users (Turner, 2018).

“facebook” by pshab is licensed under CC BY-NC 2.0.

More and more media platforms have begun to supervise user behaviour and content proactively. This is driven not only by pressure from governments and public opinion, but also by the need to protect their corporate image and to retain users who would otherwise be driven away by offence and harassment (Gillespie, 2018). To that end, digital platforms review and screen user-published content for compliance with laws and regulations, keeping the platform hospitable to users who still want to participate and create there. Unlike publishers and broadcasters, which can review material before it is released, most digital platforms have user bases so large that manual review can only happen after content is published. Platforms can use AI and proprietary algorithms to judge whether content breaks the rules, but in Gillespie’s view there is a wide gap between the “data scale” at which automated systems operate and the “human scale” of localized, culturally specific interaction, which means AI still struggles to distinguish and classify the endless stream of content (Flew et al., 2019). Machines without feelings find it hard to react to indirect or suggestive violations, so platforms must also employ large numbers of human moderators.
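To make this division of labour between automated and human review concrete, here is a minimal, purely illustrative sketch of how such a hybrid pipeline might route posts by classifier confidence. Everything in it is hypothetical: the thresholds, the Post and Queues types, and the toy violation_score function standing in for a real machine-learning classifier.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative thresholds (not from any real platform): scores above
# AUTO_REMOVE are treated as confident violations, scores below
# AUTO_APPROVE as confidently clean, and everything in between is
# escalated to a human moderator.
AUTO_REMOVE = 0.95
AUTO_APPROVE = 0.20

@dataclass
class Post:
    post_id: int
    text: str

@dataclass
class Queues:
    removed: List[Post] = field(default_factory=list)
    published: List[Post] = field(default_factory=list)
    human_review: List[Post] = field(default_factory=list)

def violation_score(post: Post) -> float:
    """Toy stand-in for a real classifier: density of flagged words."""
    flagged = {"hate", "harassment", "abuse"}
    words = post.text.lower().split()
    if not words:
        return 0.0
    return min(1.0, 5 * sum(w in flagged for w in words) / len(words))

def route(post: Post, queues: Queues) -> None:
    """Route a new post according to classifier confidence."""
    score = violation_score(post)
    if score >= AUTO_REMOVE:
        queues.removed.append(post)       # confident violation: block automatically
    elif score <= AUTO_APPROVE:
        queues.published.append(post)     # confident clean: publish immediately
    else:
        queues.human_review.append(post)  # ambiguous: a human moderator decides
```

Posts the classifier cannot confidently judge land in the human_review queue; that queue is precisely the “human scale” work that automated systems cannot absorb.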

How to prevent the spread of this content

“Internet Safety a” by paul.klintworth is licensed under CC BY-NC 2.0.

In a broader sense, Internet governance is not simply government regulation but management and supervision of the Internet at multiple levels. At the first level, the government should introduce relevant laws and policies and publicize network security law and related regulations to the public; improving citizens’ legal awareness and digital literacy can, to a certain extent, reduce the publication of harmful or terrorist content. At the second level, platforms should promptly find and delete harmful material to prevent pornographic, violent, or otherwise negative content from spreading. In addition, platforms should keep upgrading their AI review systems and recruit larger review teams.

Today, the main work of Internet regulation is undertaken by large technology and telecommunications companies (Flew et al., 2019). The organization or platform responsible for content supervision filters, evaluates, classifies, approves, or deletes/hides user-published content according to its communication policies. To prevent harassment, bullying, violence, pornography, and hate from spreading wantonly online, a platform needs content standards and review policies that conform to the standards of the countries it operates in; these standards and policies can then be encoded into its automated systems. Content published by users is first filtered by AI, and only content that meets the requirements is published; the manual review stage then follows, in which human moderators catch illegal content that AI cannot identify and try to reduce the release of offensive and antisocial material. Because tens of thousands of pieces of content are posted to a digital platform every day, large-scale pre-publication review is difficult to achieve. To address this, a platform can build a complaint or reporting mechanism on top of the algorithm’s preliminary assessment (Flew et al., 2019). On Xiaohongshu (RED), a Chinese social media platform, users are given a reporting function: when they see content that makes them uncomfortable, they can report the video or post and indicate the type of, or reason for, the report. Letting users participate in content review not only reveals users’ preferences and tolerance but also reduces the burden on moderation teams.
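A reporting mechanism of the kind Xiaohongshu offers can be approximated by a simple escalation rule: once enough distinct users flag a post, it is queued for a human moderator. The sketch below is a hypothetical illustration only, not RED’s actual implementation; report_post, REPORT_THRESHOLD, and the queues are invented for the example.

```python
from collections import defaultdict
from typing import DefaultDict, List, Tuple

# Illustrative escalation rule: once a post collects this many reports
# from distinct users, it is queued for human review.
REPORT_THRESHOLD = 3

# post_id -> list of (reporter_id, reason) pairs
reports: DefaultDict[int, List[Tuple[int, str]]] = defaultdict(list)
review_queue: List[int] = []  # post_ids awaiting a moderator

def report_post(post_id: int, reporter_id: int, reason: str) -> bool:
    """Record a user report; return True if the post was escalated."""
    entries = reports[post_id]
    # count each reporting user at most once
    if all(rid != reporter_id for rid, _ in entries):
        entries.append((reporter_id, reason))
    if len(entries) >= REPORT_THRESHOLD and post_id not in review_queue:
        review_queue.append(post_id)
        return True
    return False
```

Requiring several independent reports filters out one-off or malicious flags while still surfacing genuinely troubling content quickly.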

Conclusion

The future direction of Internet platform governance will likely have to address national differences in content regulation and the cultural expectations placed on publishers. In the United States, the law grants platforms almost complete autonomy to formulate and enforce their own rules, and many Western countries follow this model; some platforms may even use American safe-harbour provisions, such as Section 230 of the Communications Decency Act of 1996, to evade responsibility (Electronic Frontier Foundation, 2022). By contrast, major jurisdictions such as the European Union, China, and Russia adopt a more interventionist approach, with governments taking responsibility for pushing platforms to review user-published content (Flew et al., 2019). Freedom of speech should not be a shield for cyberbullying, harassment, and violence; such problematic content should be removed. To keep the Internet safe, governments, platforms, and users must all make an effort and assume their corresponding responsibilities.

Reference List:

Electronic Frontier Foundation. (2022, October 9). Section 230 of the Communications Decency Act. https://www.eff.org/issues/cda230

European Commission. (2020, June 22). EU Code of Conduct on countering illegal hate speech online continues to deliver results. https://ec.europa.eu/commission/presscorner/detail/en/IP_20_1134

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1

Gillespie, T. (2018). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE handbook of social media (pp. 254–278). SAGE Publications.

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press.

Massanari, A. (2017). Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807

Quodling, A. (2016, August 29). FactCheck Q&A: What has the Children’s eSafety Commissioner done in its first year to tackle cyberbullying? The Conversation. https://theconversation.com/factcheck-qanda-what-has-the-childrens-esafety-commissioner-done-in-its-first-year-to-tackle-cyberbullying-64309

Turner, Z. (2018, January 10). Facebook, Google have a tough new job in Germany: Content cop. Wall Street Journal. https://www.wsj.com/articles/facebook-google-have-a-tough-new-job-in-germany-contentcop-1515605207