The main problems of circulation on digital platforms and their solutions

Abstract

With the rapid development of technology, digital platforms, of which the Internet is a typical representative, have gradually become integrated into everyday life and now occupy an increasingly large share of daily use. Users enjoy the convenience that digital platforms bring while simultaneously suffering from the problems that circulate on them. This essay analyzes three common problems of circulation on digital platforms in the context of Gillespie's and Gorwa's writings, focusing on who is responsible for these problems and on their solutions from the perspectives of platforms, governments and audiences.

Race and gender discrimination

Discrimination on the basis of race and gender is one of the issues circulating on the Internet. It occurs not only in society and the workplace but also spreads across digital platforms, where it is closely tied to Internet culture. According to news reports, Google offers Black employees lower-level jobs and pays them less than their white counterparts (The Guardian, 2022). Google, a major Internet company, was also reported by Wired magazine to have linked Black people to gorillas when its algorithms identified photos (Simonite, 2018). These examples are only a snapshot of how racism manifests on digital platforms. Gillespie (2017) argues that digital media platforms should now take more responsibility for such issues: 'safe harbor' should not be an excuse for platforms to avoid responsibility. One good measure against racial discrimination is to build inclusive online platforms (Senz, 2020), which presupposes that a platform's builders are aware enough of racial inequality to speak out against it. Gillespie (2018) argues that moderation is the essence of a platform, and that a platform without moderation cannot be called a platform; he concludes that one purpose of moderation is to manage users' experience of the platform. This view is persuasive: platforms should strengthen the moderation of discriminatory content by operating transparently and by calling on users to self-regulate. Without a doubt, platforms must assume responsibility for addressing racial discrimination, but governments and audiences should also play an active role, collaborating with platforms to manage the problem.

Black Lives Matter” by seikoesquepayne is licensed under CC BY 2.0.

Disinformation

The rapid spread of disinformation remains a common phenomenon and is among the problematic content that digital platforms face today. Audiences have an obligation to stop it from getting worse. The proliferation of misinformation and disinformation can conceal the truth about events and can be used to attack those who hold different views on important political issues (Bose, 2021). For example, an article published by Croakey during the federal election pointed to the potential for disinformation about health topics to spread (Barrett, 2022). The Australian Code of Practice on Disinformation and Misinformation, which digital platforms are asked to sign up to, is the result of joint government and platform efforts. At the individual level, the primary responsibility for addressing the spread of disinformation lies with audiences, who should judge disinformation through pragmatic scepticism (Kyriakidou et al., 2022). Specifically, they can verify the truthfulness of news by searching for information themselves or by seeking more reliable news sources; the ability to think critically also helps audiences distinguish true news from false. Platforms should be held accountable as well: another purpose of platform moderation, as Gillespie (2018) summarizes, is to remove restricted and illegal content, and timely moderation can stop the spread of disinformation. Taken together, audiences play an important role in stopping the spread of false news; only when they identify and flag false information can the rapid spread of disinformation on digital platforms be curbed. Stifling the circulation of false information ultimately requires tripartite collaboration.

“deception-online-disinformation-michael-nuccitelli 1” by iPredator is marked with CC0 1.0.

Negative content on the Internet

Violence, pornography and other illegal content circulating on the Internet are issues that governments urgently need to address. Such material has a negative impact on audiences, especially children. On 20 November 2017, a 14-year-old British girl, Molly Russell, passed away. A recent investigation of her social media accounts found that she had viewed a large amount of self-harm and suicide content on Instagram during her lifetime. These posts went far beyond what anyone her age should have to bear; they even caused discomfort among the staff and child psychiatrists present at the hearing (Milmo, 2022).

That a minor was exposed to so much content about suicide and self-harm must be linked to the platform's lax management. As the world's first case in which social media algorithms were cited as a culprit, Russell's case demonstrates the tragic consequences of condoning the spread of violence, pornography and a host of other illegal, negative content on digital platforms. Nor is the harm limited to children: studies have shown that cyber abuse can also have multiple mental, emotional and physical effects on adult users who are bullied (Unicef, n.d.). In response to this incident, the UK government has introduced a draft law requiring large social media companies to adopt new child-protection measures or face heavy fines (Milmo, 2022). This is the government's solution to such negative content. Although platforms should bear the main responsibility, legislation that forces them to act carries greater deterrence and is more effective in reducing similar incidents.

AntiBully” by Raban_Holzner is licensed under CC BY-ND 2.0.

The platform governance triangle

Governments, platforms and audiences have all taken measures to address the three typical problems of digital platform circulation discussed above. In each case, however, a single actor bears the main responsibility, which clearly cannot fully resolve this problematic content. Avoiding these problems should instead be a three-way collaboration between the government, the platform itself and the audience. Gorwa (2019) elaborates this triangle model, breaking it down into seven different combinations for governing digital platforms, of which tripartite collaboration is the most ideal. The government introduces laws that compel all parties to act. Audiences, like the NGOs in Gorwa's model, contribute independent thinking and a strong representative voice. Platforms' content moderation represents strong enforcement, but platforms are companies whose first goal is to make money, and, driven by profit, they will inevitably make concessions (Gorwa, 2019). With the three parties constraining one another, digital platforms must commit themselves to governance in order to maintain a healthy Internet environment.

Conclusion

In conclusion, three major problems circulate on digital platforms: racial discrimination; the rapid dissemination of disinformation; and illegal, negative content such as violence and pornography, accessible at very low cost. Faced with these problems, platforms, audiences and governments should all play their roles: platforms strengthen their moderation of content, audiences apply independent critical thinking to the information they encounter, and governments introduce strict laws to regulate the platforms. The three should cooperate to form a stable triangular model for regulating and governing digital platforms.

This work is licensed under a Creative Commons Attribution 4.0 International License

References

Barrett, A. (2022, June 1). What are digital platforms doing to tackle misinformation and disinformation? Croakey Health Media. https://www.croakey.org/what-are-digital-platforms-doing-to-tackle-misinformation-and-disinformation

Bose, S. (2021). Australian code of practice on disinformation and misinformation. https://digi.org.au/wp-content/uploads/2021/10/Australian-Code-of-Practice-on-Disinformation-and-Misinformation-FINAL-WORD-UPDATED-OCTOBER-11-2021.pdf

Gillespie, T. (2017). Regulation of and by platforms. In J. Burgess, A. E. Marwick, & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254–278). Sage. https://ebookcentral-proquest-com.ezproxy.library.sydney.edu.au/lib/usyd/reader.action?docID=5151795&ppg=277

Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1–23). Yale University Press. http://dx.doi.org/10.12987/9780300235029-001

Gorwa, R. (2019). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407

Kyriakidou, M., Morani, M., Cushion, S., & Hughes, C. (2022). Audience understandings of disinformation: Navigating news media through a prism of pragmatic scepticism. Journalism. https://doi.org/10.1177/14648849221114244

Milmo, D. (2022, September 30). ‘The bleakest of worlds’: How Molly Russell fell into a vortex of despair on social media. The Guardian. https://www.theguardian.com/technology/2022/sep/30/how-molly-russell-fell-into-a-vortex-of-despair-on-social-media

Senz, K. (2020, July 28). Racism and digital design: How online platforms can thwart discrimination. HBS Working Knowledge. https://hbswk.hbs.edu/item/racism-and-digital-design-how-online-platforms-can-thwart-discrimination

Simonite, T. (2018, January 11). When it comes to gorillas, Google Photos remains blind. WIRED. https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/

The Guardian. (2022, March 18). Google gives Black workers lower-level jobs and pays them less, suit claims. The Guardian. https://www.theguardian.com/technology/2022/mar/18/google-black-employees-lawsuit-racial-bias

Unicef. (n.d.). Cyberbullying: What is it and how to stop it. Unicef. Retrieved October 6, 2022, from https://www.unicef.org/end-violence/how-to-stop-cyberbullying#2