The “Rescue” of Digital Platforms

Violence, pornography and hate content on digital platforms urgently need to be addressed

“Social Media” by MySign AG is licensed under CC BY 2.0.
  • The “free” network of digital platforms

Social media platforms, or digital platforms more broadly, were founded on the intention of advocating freedom and enabling better social contact and communication (Gillespie, 2018). This has created a “utopian” online world that makes communication and interaction more convenient, and that is sometimes even unrestricted. However, the accompanying huge and complex user base, together with the interests and value systems of all walks of life, has made violence, pornography, hatred and other problems prevalent on digital platforms today. According to statistics, nearly half of Australian children aged 9-16 are regularly exposed to online pornographic images (Quadara & El-Murr, 2017), and nearly 40% of Facebook users often see hateful or disturbing content (McCluskey, 2021). It is therefore imperative to stop the spread of this dangerous content through appropriate means. This blog analyzes the problem from three angles: the digital platform itself, the government, and individuals and organizations.

 

  • The dilemma of digital platforms

“Social Media Keyboard” by Shahid Abdullah is marked with CC0 1.0.

As both the self-proclaimed defender of online freedom and the place where the problem itself lives, the digital platform has an obligation to stop the spread of harmful content, yet it faces real difficulty in restricting content and users through appropriate rules. Platform founders habitually present their products as free, open and fair, and because platforms’ basic business model is to profit from information and data, it is difficult for them to stop the spread of harmful content without harming their own interests (Gillespie, 2018). Apple’s removal of pornographic applications in 2010, and Jobs’ accompanying statement offering “freedom from porn” (Hutcheon, 2010), were duly criticized by technology media and stakeholders.

Meanwhile, Gillespie (2018) argues that digital platforms are constrained by politics and culture when they try to prevent the spread of violence, pornography and other harmful content. The differing needs of communities and countries produce a recurring phenomenon: deleting a post satisfies the wishes of one community while destroying the trust of another. Facebook’s deletion of the “Napalm Girl” photo, for example, proved controversial (Campbell, 2022): some groups considered the photo obscene and pornographic and supported Facebook’s measure, while some scholars argued that its historical significance meant it should not be deleted as pornography. Besides, differences in national laws and policies on different kinds of content force platforms to respond to state requests, which Gillespie (2018) calls the “authorized canary”. Facebook has cooperated with Pakistan and Thailand to delete content that the local governments considered violent or hateful, even though it carried none of these negative elements in the eyes of other countries.

 

  • Attempts and self-interest of digital platforms

Manual commercial content moderation is one of the digital platform’s more effective attempts to prevent the spread of harmful content such as violence or pornography. Rieza (2022) notes that content moderation not only greatly reduces the amount of disturbing content on the Internet, keeping published material under constant protection, but also helps platforms monitor user behavior and promptly ban users who violate the rules. Similarly, Roberts (2019) describes content moderation as a key part of protecting a website’s operation and hindering the spread of harmful content. However, Roberts (2019) also argues that content moderation is flawed, and that unfair labor exploitation and profit from information hide behind it. Echoing the platform limitations discussed above, moderators would need broad cultural knowledge and diverse language skills to cope with content uploaded by a vast and complex user base; since most moderators cannot meet such conditions, failures such as missed detections and delayed deletions of violent or pornographic content occur frequently. In addition, this kind of moderation squeezes workers through geographical dispersion and low hourly wages, which aggravates social inequality to some extent.
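
To make the context problem concrete, the following is a minimal sketch in Python of the kind of keyword pre-screening that is often imagined to sit in front of human moderators. Every name and word in it is a hypothetical illustration rather than any platform’s real system; the point it makes is that a mechanical filter flags the historically significant “Napalm Girl” photo discussed earlier just as readily as genuine pornography, because it cannot see context.

# A toy blocklist; illustrative only, not any platform's real word list.
BLOCKLIST = {"napalm", "nude", "attack"}

def pre_screen(post_text: str) -> str:
    """Flag a post for human review if it contains a blocklisted term."""
    words = {w.strip(".,!?'\"").lower() for w in post_text.split()}
    if words & BLOCKLIST:
        return "flagged_for_review"  # only a human can judge the context
    return "published"

# The filter cannot distinguish historical photojournalism from pornography:
print(pre_screen("The 'Napalm Girl' photo documents the Vietnam War"))  # flagged_for_review
print(pre_screen("Have a great weekend, everyone!"))                    # published

Even this toy example shows why the human judgment Roberts (2019) describes cannot be automated away: everything the filter defers to a person, weighing history, culture and intent, is precisely the demanding, labor-intensive part of moderation.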

Furthermore, this kind of review also means that digital platforms can collect large amounts of user data and track society’s mainstream values. As Van Dijck et al. (2018) note, platforms can create value from user data, content and attention, or sell them to advertisers. For example, in the absence of clear standards and policies, Facebook has been hesitant to deal with content involving violence, pornography, hatred or religious attacks when commercial value is at stake, and it often makes misjudgments (Fisher, 2018).

 

  • Efforts of individuals and governments

Government and individuals are the most important external forces acting on the digital platform. As a new kind of community that emphasizes novelty, openness and freedom from restriction, the digital platform not only gives individuals a greater voice but also creates an equal, global, inclusive and creative participatory culture belonging to the public (Gillespie, 2018). This culture is everyone’s common interest, and it is why individuals share the responsibility for preventing violent, pornographic or hateful content from spreading on digital platforms. In addition, Van Dijck et al. (2018) argue that the digital platform is shaping current lifestyles and the future structure of society; its economic and cultural value in effect involves a choice between political ideology and public interests, and the government or the state needs to obtain the power to safeguard those public interests.

“Social Media Week Milano :: Il Festival della rete” by br1dotcom is licensed under CC BY 2.0.

 

As the traditional regulator and defender of public interests, the government needs to prevent the spread of harmful content on digital platforms indirectly, by pressing the platforms themselves. Governments have established regulatory accountability systems: through legislation, supervision, taxation and other public channels, digital platforms are held responsible for lax content management. This takes the form of government intervention in platform operation and management, including scrutiny of data and algorithms, anti-monopoly laws, and standards for a healthy online environment; examples include Australia’s Online Safety Act, which took effect in 2022, and the code for technology platforms issued by the Singapore government (Lim & Bouffanais, 2022). However, this traditional means of supervision solves only one problem at a time, lacks comprehensive coverage, and cannot resolve the online safety problems caused by differences in structure and in common interests. Khan (2017) argues that government legislation cannot in practice adapt to, or keep up with, the ever-developing techniques and technologies of digital platforms, especially platforms that are large in scale and transnational, or that adopt different economic operating models (Moore, 2016), such as Facebook and YouTube. Besides, according to Van Dijck et al. (2018), the government is also a user and participant of the digital platform, which means it should not only play the role of supervisor but also take part in developing platforms’ core algorithms and technologies, for example by building central plug-ins into platform infrastructure. In the final analysis, violent, pornographic and hateful content on digital platforms reflects a lack of guiding public values, and governments urgently need to create standards for platform construction centered on public values and collective interests.

 

“network” by michael.heiss is licensed under CC BY-NC-SA 2.0.

Therefore, the multi-stakeholder model came into being. Its defining feature is that it gives non-governmental actors the same status as the state in supervising and participating in the governance of digital platforms (Mueller, 2017). As Cowhey and Aronson (2017) argue, governments, non-profit organizations and individuals, and companies should form a multi-stakeholder cooperative system centered on public values. This provides a channel for groups and strata from different communities and countries to share in that power, so that solving the problem of unsafe content on digital platforms is no longer limited to the platform itself. In addition, a model grounded in public values can resolve the geopolitical problem described above, namely that different countries impose different policies: the platform can handle harmful content by reference to unified value standards and core principles, without being constrained country by country. On the other hand, the multi-stakeholder model can also be misled; when too many similar groups participate, the result can be inefficiency and over-politicization.

 

  • Future prospects

In short, hateful, violent, pornographic and other problematic content still exists on digital platforms: the platforms themselves face real dilemmas, and the participation of governments and individuals remains immature. At the same time, however, the refinement of platforms’ content moderation and the experiments with new governance models offer a feasible way forward for the digital platform.

 

 

Reference List

Campbell, W. J. (2022, June 2). 50 years after ‘Napalm Girl,’ myths distort the reality behind a horrific photo of the Vietnam War and exaggerate its impact. The Conversation. https://theconversation.com/50-years-after-napalm-girl-myths-distort-the-reality-behind-a-horrific-photo-of-the-vietnam-war-and-exaggerate-its-impact-183291

 

Cowhey, P. F., & Aronson, J. D. (2017). Digital DNA: Disruption and the challenges for global governance. Oxford University Press.

 

Fisher, M. (2018, December 27). Inside Facebook’s secret rulebook for global political speech. The New York Times. https://www.nytimes.com/2018/12/27/world/facebook-moderators.html

 

Gillespie, T. (2018). All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1–23). Yale University Press.

 

Gillespie, T. (2018). Governance by and through platforms. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE handbook of social media (pp. 254–278). SAGE.

 

Hutcheon, S. (2010, May 17). A feisty Steve Jobs offers the world “freedom from porn.” The Sydney Morning Herald. https://www.smh.com.au/technology/a-feisty-steve-jobs-offers-the-world-freedom-from-porn-20100517-v7wl.html

Khan, L. M. (2017). Amazon’s Antitrust Paradox. The Yale Law Journal, 126(3), 710–805.

 

Lim, S. S., & Bouffanais, R. (2022, May 26). Fighting for data dregs – and losing the fight against digital violence. AsiaGlobal Online. https://www.asiaglobalonline.hku.hk/fighting-data-dregs-and-losing-fight-against-digital-violence

 

McCluskey, M. (2021, November 3). Why some people see more disturbing content on Facebook than others, according to leaked documents. Time. https://time.com/6111310/facebook-papers-disturbing-content/

 

Moore, M. (2016). Tech giants and civic power. Centre for the Study of Media, Communication and Power, Policy Institute, King’s College London. https://www.kcl.ac.uk/sspp/policy-institute/cmcp/tech-giants-and-civic-power.pdf

 

Mueller, M. (2017). Confronting Alignment. In Will the Internet Fragment?: Sovereignty, Globalization and Cyberspace (pp. 50–56). John Wiley & Sons.

 

Quadara, A., & El-Murr, A. (2017, December). The effects of pornography on children and young people. Australian Institute of Family Studies. https://aifs.gov.au/research/research-snapshots/effects-pornography-children-and-young-people

 

Rieza, E. (2022, August 26). Content moderation: Importance and benefits. NMS. https://newmediaservices.com.au/content-moderation-3-reasons-why-it-is-crucial-for-your-business/

 

Roberts, S. T. (2019). Understanding Commercial Content Moderation. In Behind the Screen: Content Moderation in the Shadows of Social Media (pp. 33–72). Yale University Press.

 

TomoNews US. (2019). The sad life of Facebook content moderators [Video]. YouTube. https://www.youtube.com/watch?v=DlcAjc0ka10

 

Van Dijck, J., Poell, T., & de Waal, M. (2018). The platform society: Public values in a connective world. Oxford University Press.