

Who should take responsibility?
Digital platforms are valuable communication tools in a globalised world. Not only do they allow ideas and information to be shared in unprecedented ways; they have also transformed how content is shared by their users. However, this ease of sharing communication and information has led to rampant problematic content. So who is responsible for stopping the spread of problematic content on digital platforms? Is it the users who upload and share it? Is it the digital platforms that own and operate the tools that make sharing possible? Or is it the governments and government agencies tasked with keeping public spaces, including digital spaces, safe for everyone?
This essay argues that while the ultimate responsibility for stopping the spread of problematic content lies with digital platforms, governments also have a critical mandate to support that effort.

What do digital platforms do, and are they really useful?
The responsibility for stopping problematic content lies with digital platforms, and digital platforms do moderate it. It is common for them to remove, ban, or censor problematic content under self-regulating rules that seek to protect users. For instance, in 2019, Facebook removed a video of a 16-year-old being sexually assaulted while unconscious at a house party in Providence, Rhode Island; police used the video to identify and charge eight men with sexual assault-related crimes (Griffith, 2020). Nevertheless, digital platforms rely on voluntary self-regulating policies riddled with double standards that cannot comprehensively stop problematic content. According to Gillespie, even a self-regulated online community faces the challenge of who should set the rules that will apply to everyone (2018, p. 11). For example, bullying videos are uploaded to YouTube and TikTok, and neither platform takes them down unless there is a public outcry and the videos are repeatedly flagged as harmful or violent. There is a clear lack of consistent rules and regulations among digital platforms that voluntarily self-regulate problematic content.
“The responsibility of Internet platform companies to mitigate harmful content online” by Cambridge Law Faculty is licensed under CC BY-NC-ND 2.0. Retrieved from: https://www.youtube.com/watch?v=KDHGp-wDS50
What can regulators such as governments do to help police the Internet?
This is where the government comes in, as an oversight authority and a primary stakeholder that users of digital platforms appoint to act on their behalf as a civil custodian. In this oversight capacity, the government helps digital platforms enact regulatory policies, consistent with the law, for dealing with problematic content. Governments would not only take part in co-governance; a collaborative, partnership-based model of regulation would also emerge, helping digital platforms stop problematic content.
Governments are inherently interventionist, however, and working too closely with digital platforms may signal that platforms are violating citizens’ rights to speech or association. Hence, one of the ways that digital platforms and the government can collaborate is by appointing self-regulation councils. A self-regulation council comprises independent members, organization-appointed members, and other stakeholders, and it does not require governmental authority to enforce regulations (Brown, 2020). These councils operate neither for the government nor for the digital platforms but for the users, who need protection from both within the online environment. National governments are increasingly challenging the power and influence of global digital media companies and seeking strategies to regulate their content and operations (Flew et al., 2019, p. 35). Self-regulating councils therefore offer the best remedy for the problem of problematic content facing digital platforms: they can formulate and implement policies that standardize moderation rules across platforms while ensuring that government-mandated rules are followed, thus protecting users.
Yet digital platforms have failed, or are unable, to stop problematic content. In mid-2022, Andrew Tate was removed and banned by several digital platforms for violating their policies on dangerous organizations and individuals (Paul, 2022). However, before his removal and ban, Tate had amassed millions of followers and earned thousands of dollars across different platforms. On TikTok alone, his videos had been watched more than 12.7 billion times by the time he was de-platformed (Paul, 2022). The delay in de-platforming him, and in removing problematic content more generally, shows that digital platforms are, to an extent, complicit in its spread. The revenue that platforms collect from problematic content is an incentive to delay stopping it. At the same time, digital platforms have amassed such global political, social, and economic power that they can display an indifferent attitude towards their social contract as actors within society.

How can problematic content be stopped from spreading on digital platforms?
Gorwa (2019) notes that firms are bound by profit-seeking motives and do not necessarily act in the public interest. Hence there is a need to oblige digital platforms to accept their responsibility and stop the spread of problematic content. Because of this conflict between profit and the public interest, governments must intervene and create ways to help digital platforms stop the spread of problematic content. Another way governments can help is by forming general-purpose regulators. Given the profit-centring of digital platforms, and the fact that problematic content is shared more widely than other content, it is clear that digital platforms need a regulatory agency. A general-purpose regulator is an independent government agency mandated to enforce the law against specific acts of wrongdoing within the business world. It would act much like the SEC or the FCC in the United States: consumer-centric agencies with broad mandates to protect the consumer. Digital platform businesses are not doing enough to ensure that their centrality to society is exercised ethically and morally. A general-purpose regulator would therefore focus on a tangible line of business or technology operated by the digital platforms (MacCarthy, 2019). Not only would it be easier to identify platforms that are failing to stop problematic content, but it would also be possible to financially penalize and even close down platforms that continue to share it.

Figure 1: A woman repeatedly slaps a man, reportedly a cab driver, following a car accident (Twitter, 2022).

The Twitter screengrab in Figure 1 shows a woman repeatedly hitting a man following a car accident (Twitter, 2022). While Twitter notes that the video is unverified, it illustrates the insidious nature of violence that is prevalent on digital platforms. The video was subsequently posted on Facebook and TikTok and viewed globally, as digital platforms have made content sharing seamless across locations. There is therefore a need for a global governance structure to stop digital platforms from spreading problematic content. Lewis (2013) notes that a global governance framework comprises representatives from all countries, international organizations, and stakeholders such as digital platforms and other global players. A global governance framework would not only standardize how digital platforms stop the spread of problematic content but also remove the problem of conflicting national regulatory frameworks that do not apply beyond their borders. At the same time, it would create a level playing field for all digital platforms, making the regulation and stopping of problematic content more accessible and multi-dimensional. At an international level, a global governance framework would place the public interest at the core of its mandate, ensuring that the spread of problematic content is stopped using global infrastructure while balancing the interests of governments, digital platforms, and users’ digital and human rights.

Conclusion
It is the mandate of both digital platforms and governments to stop the spread of problematic content. While digital platforms moderate and self-regulate, their actions have produced double standards: some problematic content is removed while other content is retained. Hence, government and digital platforms can work within a co-governance framework through a self-regulatory council that is independent of both entities. Governments can also form a general-purpose regulatory agency mandated to ensure that digital platforms comply with the rules governing the sharing of problematic content. Finally, governments and digital platforms can partner to form a global governance framework that sets global rules and regulations to stop the spread of problematic content.
Reference list:
Brown, N. I. (2020). Regulatory Goldilocks: Finding the Just and Right Fit for Content Moderation on Social Platforms. Texas A&M Law Review, 8(3), 451-494. https://doi.org/10.37419/LR.V8.I3.1
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50. https://doi.org/10.1386/jdmp.10.1.33_1
Gillespie, T. (2018). All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). Yale University Press.
Gorwa, R. (2019). The platform governance triangle: conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407
Griffith, J. (2020, September 3). Facebook video of sexual assault found by teenage victim’s mother leads to 7 arrests. NBC News. https://www.nbcnews.com/news/us-news/facebook-video-sexual-assault-found-teenage-victim-s-mother-leads-n1239204
Lewis, J. A. (2013). Internet Governance: Inevitable Transitions. The Centre for International Governance Innovation. https://www.cigionline.org/sites/default/files/no4.pdf
MacCarthy, M. (2019, October 22). To regulate digital platforms, focus on specific business sectors. Brookings. https://www.brookings.edu/blog/techtank/2019/10/22/to-regulate-digital-platforms-focus-on-specific-business-sectors/
Paul, K. (2022, August 19). ‘Dangerous misogynist’ Andrew Tate removed from Instagram and Facebook. The Guardian. https://www.theguardian.com/us-news/2022/aug/19/andrew-tate-instagram-facebook-removed
Twitter. (2022). A woman repeatedly slapped a man who reportedly works as a cab driver in Uttar Pradesh’s capital city Lucknow [Image]. Retrieved 6 October 2022, from https://twitter.com/httweets/status/1422143975570350088