Bullying, harassment, violent content, hate, porn and other problematic content circulates on digital platforms. Who should be responsible for stopping the spread of this content and how?

"Cyberbullying, would you do it?" by kid-josh is licensed under CC BY-NC-SA 2.0.

Introduction

In a contemporary world where digital media dominates our everyday expressions and experiences, constantly shaping how we perceive and comprehend our surroundings, the affordances and responsibilities of digital platforms are actively debated (Baccarella et al., 2018; Deuze, 2011). This article therefore discusses the complexity of social media and the past, present and future of digital content governance.

“Social Media Keyboard” by Shahid Abdullah is marked with CC0 1.0.

 

The Dark Side of Digital Platforms

While digital technologies contribute to the advancement of today’s society, their dark side remains a complex issue that requires urgent attention. A notable amount of problematic behaviour and deleterious content circulates on social media platforms, including hate comments, cyberbullying, trolling, harassment, fake news and more (Baccarella et al., 2018). These incivilities and this inappropriate content have severe impacts on our society; as the former senior Facebook executive Chamath Palihapitiya put it, social media is “ripping apart” society (Wong, 2017). In addition, studies indicate a strong correlation between social media use and mental and physical health: levels of stress, anxiety and depression increase as the time an individual spends on social media increases (Sheldon et al., 2019). However, it is crucial to understand the duality of digital platforms (Baccarella et al., 2018). Rather than oversimplifying social media as either good or bad, it is simultaneously both (Baccarella et al., 2018). In a similar sense, Kitchin and Dodge (2011), in “Introducing Code/Space”, discuss these two sides, suggesting that digital media can engender both opportunities and threats.

 

The Current State of Platforms’ Regulations

Current regulation of digital platforms around the world, especially in Western societies, is shaped by Section 230 of US telecommunications law (Gillespie, 2018). Section 230 is also known as the safe harbour for tech giants, as it shields platforms from legal responsibility so long as they position themselves as information distributors rather than content creators. Nevertheless, while many accept and support this regulation because its original intention was to protect and encourage technological innovation, others are concerned about the detrimental consequences of this “industry self-regulatory model”, such as systematic discrimination and the collection and monetisation of sensitive data by social media companies (Gillespie, 2018; Gorwa, 2019). This concern is further explored in the documentary The Social Dilemma (Orlowski, 2020), which reveals the surveillance capitalism of social media companies as they capitalise on users’ information and manipulate users’ behaviour through algorithms and big data.

In response to the heated public debate over Facebook’s safety issues following the European Parliament testimony (BBC News, 2018), Facebook’s CEO, Mark Zuckerberg (2021), published an article revealing his blueprint for the future of Facebook’s content governance and its enforcement methods (Gorwa, 2019). In the article, Zuckerberg (2021) proposed moderating problematic content by employing both human moderators and artificial intelligence to increase efficiency and reduce judgement errors. However, Zuckerberg (2021) fails to address a fundamental problem: without proper supervision and restriction, the authenticity and validity of the transparency that platforms claim to provide cannot be guaranteed, especially given the tendency towards oligopolisation in a technology sector driven by monetisation incentives (Almeida et al., 2021; Gorwa, 2019; Napoli, 2019). This contrasts with the ideological vision of Barbrook and Cameron (1996), in which the freedom of creativity and innovation is secured by a decentralised internet free of government intervention. Moreover, Facebook’s suppression of the Hunter Biden laptop story during the US presidential election (Carroll, 2022) further demonstrates the drawbacks of excluding government authorities or an unbiased third party from the policymaking process (Almeida et al., 2021; Gillespie, 2018; O’Hara & Hall, 2018).

“Penetrating media” by Kevin Dooley is licensed under CC BY 2.0.

 

Content Governance – A Double-Edged Sword

As stated in the previous section, government intervention is necessary to set legally binding regulations and enforce regulatory activity (Almeida et al., 2021). Nevertheless, Popiel (2018) questions the idea of government intervention, exploring how tech companies spend millions of dollars on lobbying every year in order to blur the boundaries between public benefits and corporate interests. Similarly, Ng (2020) explores how easily digital platforms can transform from an instrument that promotes democracy into a tool that derails it.

Furthermore, aside from questions of government and platform responsibility, Almeida et al. (2021) and Zuckerberg (2021) pose a key question regarding the limits and boundaries of content moderation, which is bound up with the conflict between freedom of speech and the public interest. For example, a surge of discussion about the priority of collective interest over the right to speak followed the permanent suspension of the former US president, Donald Trump, from Twitter (BBC News, 2021). In addition, as Zuckerberg (2021) notes, cultural norms shift rapidly and vary between countries. For these reasons, when building a global digital regulatory framework, the calibration of social values and the frequency of regulatory updates should be considered carefully (Almeida et al., 2021; Zuckerberg, 2021).

“41. The House of Commons sits for the first time in the new Parliament, following State Opening” by UK Parliament is licensed under CC BY-NC-ND 2.0.

 

The Solution

Given the complexity of a digital regulatory framework under the current political and sociocultural climate, the European Commission recommends a “multidimensional approach” to discussing and managing these issues (Baccarella et al., 2018). Similarly, and in opposition to the independent policy-making process that Zuckerberg (2021) promotes, Almeida et al. (2021) suggest that digital platforms should be governed by polycentric institutions. They introduce the digital content governance ecosystem and explore the roles of its three key actors (Almeida et al., 2021). The first is the state, which is responsible for establishing a consensual decision-making process that involves all stakeholders and for ensuring the transparency and accountability of those decisions (Almeida et al., 2021). The second actor is the market, which includes tech companies and associated businesses; their role is to provide technical services and support in maintaining a functional and healthy online environment. Most importantly, the last actor in the ecosystem is society, because in the era of Web 2.0 internet users are both receivers and creators of information (Almeida et al., 2021; Gillespie, 2018).

In addition, Gorwa (2019) outlines Abbott and Snidal’s conceptual model of the governance triangle, which highlights the same three groups of actors (firm, state and NGO) as Almeida et al. (2021) suggest. The difference between the two accounts is that Gorwa (2019) emphasises the “watchdog function” of NGOs, including civil society groups and academics, exercised through their advocacy and research on content governance practices. The centrepiece of the two concepts, however, is the same: the collaborative effort of government authorities, corporations and non-governmental organisations (Almeida et al., 2021; Gorwa, 2019).

“teamwork 4” by Daryl I is licensed under CC BY-NC-SA 2.0.

 

Conclusion

To sum up, given the increasingly influential status of digital platforms and the prevalence of problematic behaviour on social media, the affordances of and accountability for governing content must be carefully measured and discussed. The multidimensionality of the matter spans ethical, social, economic and political factors. Therefore, an appropriate method of regulating harmful content online involves three actor groups: states, markets and NGOs. In the regulation process, states are responsible for legislating and updating policies in line with social norms and consensus; markets are responsible for implementing the regulations and keeping the balance between freedom of speech and collective interest; and NGOs monitor the polycentric institutional system and defend the public’s rights.

 

This article is licensed under CC BY-NC 2.0.

 

References

Almeida, V., Filgueiras, F., & Doneda, D. (2021). The Ecosystem of Digital Content Governance. IEEE Internet Computing, 25(3), 13–17. https://doi.org/10.1109/MIC.2021.3057756 

Baccarella, C. V., Wagner, T. F., Kietzmann, J. H., & McCarthy, I. P. (2018). Social media? It’s serious! Understanding the dark side of social media. European Management Journal, 36(4), 431–438. https://doi.org/10.1016/j.emj.2018.07.002 

Barbrook, R., & Cameron, A. (1996). The Californian Ideology. Science As Culture, 6, 44–72. https://doi.org/10.1080/09505439609526455 

Carroll, P. (2022, August 27). Zuckerberg explains to Joe Rogan why Facebook censored the Hunter Biden laptop story. Foundation for Economic Education. https://fee.org/articles/zuckerberg-explains-to-joe-rogan-why-facebook-censored-the-hunter-biden-laptop-story/ 

Deuze, M. (2011). Media life. Media, Culture & Society, 33(1), 137–148. https://doi.org/10.1177/0163443710386518 

Gillespie, T. (2018). Regulation of and by Platforms. In The SAGE Handbook of Social Media (pp. 254–278). SAGE Publications, Limited. http://ebookcentral.proquest.com/lib/usyd/detail.action?docID=5151795 

Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407 

Kitchin, R., & Dodge, M. (2011). Introducing Code/Space. In Code/Space: Software and Everyday Life. MIT Press. http://ebookcentral.proquest.com/lib/usyd/detail.action?docID=3339248 

Orlowski, J. (Director). (2020). The Social Dilemma [Film]. Exposure Labs. https://www.netflix.com/jp-en/title/81254224?source=35

Twitter ‘permanently suspends’ Trump’s account. (2021, January 9). BBC News. https://www.bbc.com/news/world-us-canada-55597840 

Napoli, P. M. (2019). User Data as Public Resource: Implications for Social Media Regulation. Policy & Internet, 11(4), 439–459. https://doi.org/10.1002/poi3.216 

Ng, E. (2020). No Grand Pronouncements Here…: Reflections on Cancel Culture and Digital Media Participation. Television & New Media, 21(6), 621–627. https://doi.org/10.1177/1527476420918828 

O’Hara, K., & Hall, W. (2018). Four internets: The geopolitics of digital governance (CIGI Papers No. 206). Centre for International Governance Innovation. https://www.cigionline.org/publications/four-internets-geopolitics-digital-governance/ 

Popiel, P. (2018). The Tech Lobby: Tracing the Contours of New Media Elite Lobbying Power. Communication, Culture & Critique, 11(4), 566–585. https://doi.org/10.1093/ccc/tcy027 

Sheldon, P., Rauschnabel, P. A., & Honeycutt, J. M. (2019). Social Media and Mental and Physical Health. In The Dark Side of Social Media (pp. 3–21). Elsevier. https://doi.org/10.1016/B978-0-12-815917-0.00001-0 

Wong, J. C. (2017, December 12). Former Facebook executive: Social media is ripping society apart. The Guardian. https://www.theguardian.com/technology/2017/dec/11/facebook-former-executive-ripping-society-apart 

Zuckerberg’s European Parliament testimony criticised. (2018, May 22). BBC News. https://www.bbc.com/news/technology-44210800 

Zuckerberg, M. (2021, May 6). A blueprint for content governance and enforcement. Facebook. https://www.facebook.com/notes/751449002072082/?paipv=0&eav=AfaHmDFvOF3emyZp0R0POsa9pYNcXO2Kxta3xOBmLfz-kz9b6a4jNVEW8AgsWpWPKhQ