Bullying, harassment, violent content, hate speech, pornography, and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

The platform regulatory system needs continuous improvement

"Content regulation on social media platform" by Sanskriti IAS is licensed under CC BY 2.0.

Content distribution on digital platforms questioned

Dutton (2009) divides understanding of the internet into three parts: a technology that is constantly being updated; a space in which democratized discourse and government regulation enable and constrain each other; and a set of digital platforms whose operators act from a profit perspective. The internet was once a one-way channel for disseminating information, in which audiences received content but could not interact with it. The development of today’s Web 2.0, however, has made the internet central to people’s lives: it has made human communication frequent, providing not only access to information exchange but also services and technologies that have changed the way people do things (Dutton, 2009). That is, digital platforms connect people with information and technology, and they integrate common social and political views, not only carrying communication but also shaping the values of the world. The fluid, public nature of the web blurs the boundaries between the individual, the state, and politics; it increases ordinary people’s influence, opens up the discourse of journalists, experts, and politicians, and provides opportunities for personal and public benefit. Yet some argue that this development undermines traditional systems of consultative democracy in terms of public political participation, and undermines the quality of information in terms of social culture, because anyone can post scattered information that may be incorrect or biased (Dutton, 2009).

“Anonymous Attack” by HonestReporting.com is licensed under CC BY-SA 2.0.

Flew, Martin, and Suzor (2019) note the extent to which digital platforms are open to hateful, abusive, and extremist content, especially content targeting women and members of racial minorities, and argue that platforms remain permissive because they allow themselves to profit from such content, which has political, economic, and sociocultural implications. A social-shaping perspective reveals the impact of technological change by examining patterns of internet use and their effects over time, and it makes clear that oversight and regulation of platforms are necessary. As Gillespie (2018) argues, the regulation of platforms exists to protect the interests of users and remove harmful information, but also to deliver good information to new users and to society. Minimizing risk while preserving the openness of the internet is the ultimate goal that regulation must work toward.


Platform regulation and user supervision

First of all, information must be encoded before it is decoded by the public, so part of this responsibility falls on the shoulders of platform users. Content supervision relies on users to monitor themselves during content production, cooperating with the platform and self-examining their own work. Moreover, once information is published, it faces the supervision of other users, who can flag illegal content through the platform’s reporting mechanism as they consume it. The platform itself, however, occupies the central position in regulating public discourse, so it should review and screen the content its users post; this is the platform’s self-regulation. Content should be produced and distributed without violating laws and regulations or encouraging harmful behavior. The Chinese version of TikTok, for example, has rules designed to protect children and to reject illegal, harmful, offensive, or anti-social content while still allowing creative work. Each video a user posts is reviewed and monitored by the backend: no cigarettes, alcohol, nudity, or similar material may appear in a video, and content that violates these rules is taken down or blocked. Users are also required to register under their real names, and teenagers’ daily usage time is limited (Sarwar, 2021).
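A minimal sketch of how such a review pipeline might be structured is shown below. The topic labels, report threshold, and function names are hypothetical illustrations rather than TikTok’s actual system; only the 40-minute daily limit for minors comes from the cited source (Sarwar, 2021).

```python
# Hypothetical content-review pipeline: pre-publication screening,
# user-report handling, and a usage limit for teenage users.
# BANNED_TOPICS and REPORT_THRESHOLD are assumptions for this sketch;
# TEEN_DAILY_LIMIT_MIN reflects the 40-minute limit reported by Sarwar (2021).
from dataclasses import dataclass

BANNED_TOPICS = {"tobacco", "alcohol", "nudity"}  # assumed example categories
REPORT_THRESHOLD = 3        # assumed: reports needed to trigger a takedown
TEEN_DAILY_LIMIT_MIN = 40   # minutes per day for minors (Sarwar, 2021)

@dataclass
class Video:
    uploader_age: int
    topics: set            # labels from an upstream classifier (assumed)
    reports: int = 0
    visible: bool = False

def backend_review(video: Video) -> bool:
    """Pre-publication check: videos tagged with banned topics never go live."""
    video.visible = not (video.topics & BANNED_TOPICS)
    return video.visible

def handle_report(video: Video) -> None:
    """Post-publication check: enough user reports take the video down."""
    video.reports += 1
    if video.reports >= REPORT_THRESHOLD:
        video.visible = False  # held for human moderation

def may_keep_watching(age: int, minutes_today: int) -> bool:
    """Enforce the daily time limit for teenage users."""
    return age >= 18 or minutes_today < TEEN_DAILY_LIMIT_MIN

clip = Video(uploader_age=16, topics={"dance"})
print(backend_review(clip))        # True: no banned topics, video goes live
print(may_keep_watching(16, 45))   # False: over the teen daily limit
```

In a real system the fixed tag set would be replaced by machine classifiers and human moderators, but the three checkpoints, pre-publication review, user reporting, and usage limits, mirror the ones described above.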

“michael-nuccitelli-stop-online-child-pornography” by iPredator is marked with CC0 1.0.

In fact, this approach has reduced the amount of undesirable content compared with the platform’s early mode of operation. While platforms may not have a legal responsibility to regulate the content they host, doing so sends one of the clearest signals that they operate much like traditional media companies, regulating speech in the public and commercial interest (Flew, Martin, & Suzor, 2019).


Government intervention

Secondly, government supervision is also essential. As the “visible hand” in China’s socialist market economy, the government needs to supervise the relevant platforms and guide them toward a healthy and orderly internet environment. When illegal content or other unlawful behavior appears on a platform, government departments play a mandatory supervisory role. On WeChat, for example, bloody or violent content and content that undermines the political system is blocked, because the Chinese government prohibits the dissemination of undesirable content that could cause mass panic; this intervention has pushed platform companies to operate more transparently (Wechat Wiki, 2020). Government guidelines also raise companies’ obligation to quickly remove terrorist material and other problematic content, and they incentivize companies to collaborate on best practices, for example through the Global Internet Forum to Counter Terrorism. On top of that, now that companies like Apple, Google, and Microsoft have developed global spheres of influence, we are in an era of information monopolies, one that Nick Srnicek has called “platform capitalism” (Srnicek, 2017, as cited in Flew, Martin, & Suzor, 2019).

“Trust is the Key to Web 2.0” by kid.mercury is licensed under CC BY 2.0.

“Dimitris Avramopoulos addressing the 5th Ministerial Meeting of the EU Internet Forum” by European Commission. All rights reserved. Retrieved from https://www.youtube.com/watch?v=wQ2_6dXN-20

Another direction governments need to work on is balancing the digital platform market through regulation that encourages competition and reduces platform dominance, such as anti-monopoly measures that specify rules requiring firms to divest or split once their market share exceeds a certain threshold.
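To make the idea concrete, the sketch below applies the kind of quantitative test such a rule implies, using the Herfindahl-Hirschman Index (HHI), a standard measure of market concentration computed as the sum of squared market shares in percent. The 40% share cap and 2500-point HHI cutoff are assumptions chosen for this example; actual antitrust thresholds vary by jurisdiction.

```python
# Illustrative dominance check: flag firms whose share exceeds a cap
# in a market the HHI deems highly concentrated. Thresholds are assumed.

def hhi(shares_pct: list[float]) -> float:
    """Herfindahl-Hirschman Index: sum of squared shares (each 0-100)."""
    return sum(s ** 2 for s in shares_pct)

def flag_dominant(shares_pct: dict[str, float],
                  share_cap: float = 40.0,
                  hhi_cap: float = 2500.0) -> list[str]:
    """Return firms over the share cap in an over-concentrated market."""
    if hhi(list(shares_pct.values())) < hhi_cap:
        return []  # market not considered highly concentrated
    return [firm for firm, share in shares_pct.items() if share > share_cap]

# Hypothetical platform market.
market = {"PlatformA": 55.0, "PlatformB": 25.0, "PlatformC": 20.0}
print(hhi(list(market.values())))  # 4050.0 -> highly concentrated
print(flag_dominant(market))       # ['PlatformA'] would face divestiture rules
```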


Stakeholder Engagement

Finally, multi-stakeholder regulation can complement traditional regulation (Gorwa, 2019). Such regulation draws on the different competencies of its participants: non-governmental actors bring high expertise and business competence but need the cooperation of platform companies to put those competencies into practice, while the companies operate according to their own interests and must compromise so that the group’s combined competencies can function. For example, civil society groups can come together to issue principles that guide governments or companies, helping platforms understand the complex relationships among different actors’ preferences. Participants engage in negotiations and informal commitments, and the power relations of governance develop through these negotiations. Such arrangements are often backed by the threat of future legislation and by a shared understanding that non-compliance may lead to punitive measures and stricter regulatory outcomes (Gorwa, 2019). In this way, states can reduce bullying, harassment, violent content, hate speech, pornography, and other problematic content on digital platforms at a lower cost, contributing to the public good.


Are these programs working?

In fact, these approaches are ongoing and improving all over the world. Because regulation is complex, it is difficult to strike a balance between ensuring the openness of platforms and protecting users from abuse. Users and viewers hold different cultures, values, and ideologies, so it is difficult to ensure that regulation is proportionate: that it respects the contours of political discourse and cultural taste; fights gender, sexual, racial, and class inequality; and extends moral obligations across national, cultural, and linguistic boundaries (Gillespie, 2018). Taken together, a multi-stakeholder regulatory standard-setting program appears to be the best approach for balancing the interests of users and platforms, as it allows regulators both to learn more about users and to keep platform operations within the norms. Overall, society’s increasing desire for platform regulation, and people’s rising expectations and demands of the internet, will require new regulatory policies in response.

“Policies for platforms: supporting and regulating digital platforms” by ITU Pictures is licensed under CC BY 2.0.


Reference list

Dutton, W. H. (2009). The Fifth Estate emerging through the network of networks. Prometheus, 27(1), 1–15. https://doi.org/10.1080/08109020802657453

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press. https://doi.org/10.12987/9780300235029

Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407

Sarwar, N. (2021, September 21). China’s TikTok app now limits kids to 40 min per day, and that’s smart. Screen Rant. https://screenrant.com/tiktok-time-limit-app-china-restrictions/

Wechat Wiki. (2020, January 15). WeChat rules and marketing restrictions. https://wechatwiki.com/wechat-resources/wechat-rules-and-marketing-restrictions/