
An overview of problematic content circulating on digital platforms
Digital platforms emerged from the web’s exquisite disorder: many were created by individuals, inspired by the freedom the web promised, to host and expand participation, expression, and social interaction. Yet as these forums gained popularity, disorder and violence swiftly returned. The mechanics of the platform economy have expedited the exchange of information and generated enormous commercial value, while also increasing the circulation of harmful content (Gillespie, 2018). UNICEF has reported that one in three young people has been the victim of cyberbullying; in Australia that proportion rises to 53%, with the bullying most commonly targeting gender (19.5%), race (17.5%) and sexual orientation (16.5%) (Gjorgievska, 2022). As online discrimination and digital violence raise growing concern, the governance of digital platforms has moved onto the agenda accordingly.
“Dutch campaign against discrimination” by Niriel is licensed under CC BY-NC 2.0.
Online discrimination:
Online hate and discrimination are a growing concern in the development of digital platforms, with radicals mindlessly broadcasting racial or religious opinions and stigmatising marginalised groups, escalating into wars of public opinion. The spread of hate speech has led to large-scale conflict incidents, with social media playing a role in driving the dissemination of discriminatory ideas. Users’ online experiences are mediated by algorithms designed to maximise engagement, which frequently, if unwittingly, promote extreme content. YouTube’s autoplay feature, in which the player automatically loads a related video when one ends, can be particularly harmful: the algorithm directs users towards videos that promote conspiracy theories or are otherwise “divisive, deceptive, or incorrect” (Laub, 2019). Similar discrimination has occurred at Google, whose photo-tagging system mistakenly labelled Black people as gorillas and whose restricted mode filtered keywords related to ‘LGBTQ’, fuelling the belief that queer content was inherently offensive (Bishop, 2018).
“Is YouTubes ALGORITHM RACIALLY BIAS?” by Clueless Tips. All rights reserved. Retrieved from: https://www.youtube.com/watch?v=3Jx_NTNUz0U
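The feedback loop Laub describes can be made concrete with a deliberately simplified sketch: a recommender that ranks candidate videos purely by predicted engagement. The `extremeness` scores and the engagement formula below are illustrative assumptions for this post, not YouTube’s actual model.

```python
import random

# Illustrative toy model (not YouTube's real system): autoplay picks
# whichever candidate video is predicted to keep the viewer engaged
# longest, with no penalty for how divisive the content is.
random.seed(1)

videos = [
    {"title": "calm gardening tutorial", "extremeness": 0.1},
    {"title": "balanced news report", "extremeness": 0.3},
    {"title": "outrage-bait conspiracy clip", "extremeness": 0.9},
]

def predicted_engagement(video: dict) -> float:
    # Assumption for the sketch: provocative content earns more watch
    # time and comments, so engagement rises with extremeness.
    return 0.2 + 0.7 * video["extremeness"] + random.uniform(0, 0.1)

def autoplay_next(candidates: list[dict]) -> dict:
    # Pure engagement maximisation selects the most provocative clip.
    return max(candidates, key=predicted_engagement)

print(autoplay_next(videos)["title"])  # the outrage-bait clip wins
```

Because nothing in the objective penalises harm, each autoplay step nudges the session towards whatever provokes the strongest reaction, which is the amplification dynamic the paragraph above describes.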
Pornography is frequently disseminated on social media platforms and can be profoundly addictive. On Facebook, child pornography and revenge porn are frequently reported, while on Instagram 150 million fake profiles, many operated by porn bots, have been discovered, demonstrating that sexually explicit material is among the most problematic content on digital platforms (Hedges, 2015). Owing to the lack of content regulation, the internet platform has become a breeding ground for the dissemination of pornographic material, and deepfakes have drawn everyone into this crisis, with platforms’ information-distribution mechanisms potentially amplifying the risk. Additionally, reports of revenge porn increased during the pandemic: repeated lockdowns changed intimate relationships and pushed people towards online sexual interaction, laying the groundwork for sexual blackmail, a harm that falls most visibly on society’s marginalised groups (Meakins, 2022).
Read More: ‘Pornography Is What the End of the World Looks Like’
“Porn” by ephidryn is licensed under CC BY 2.0.
Who should be responsible for stopping the dissemination of problematic content?
In light of the pervasiveness of problematic content, Abbott and Snidal (2009) conceptualised a platform governance triangle that maps the multi-stakeholder framework of digital platform governance and emphasises collaboration among firms, states and NGOs.
“The model of platform governance triangle” by Abbott and Snidal is licensed under CC 2.0
Additionally, the significance of platform users in governance is increasingly highlighted. Inspired by decentralised autonomous organisations (DAOs) in the crypto space, Web 3.0 emphasises a decentralised model in which users play a growing role in constructing their communities; their ubiquitous presence online can thus help clean up the content ecology (Mesquita & Hall, 2022).
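As a rough illustration of what such decentralised, user-driven moderation could look like, here is a minimal sketch in the spirit of a DAO vote. The `ModerationProposal` class, the quorum rule and the one-account-one-vote assumption are hypothetical, not drawn from Mesquita and Hall.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of DAO-style moderation: community members vote
# on whether a reported post should be removed; a quorum and a simple
# majority decide, rather than a central platform owner.

@dataclass
class ModerationProposal:
    post_id: str
    reason: str
    votes: dict = field(default_factory=dict)  # user_id -> True (remove) / False (keep)

    def cast_vote(self, user_id: str, remove: bool) -> None:
        self.votes[user_id] = remove  # one account, one vote; re-voting overwrites

    def outcome(self, community_size: int, quorum: float = 0.1) -> str:
        if len(self.votes) < quorum * community_size:
            return "no quorum"  # too few participants to decide
        removals = sum(self.votes.values())
        return "remove" if removals > len(self.votes) / 2 else "keep"

proposal = ModerationProposal("post-42", "reported as hate speech")
proposal.cast_vote("alice", True)
proposal.cast_vote("bob", True)
proposal.cast_vote("carol", False)
print(proposal.outcome(community_size=20))  # -> "remove"
```

The design choice worth noting is that the decision rule is public and user-executed; the open question, discussed under “Users” below, is whether enough members actually participate to reach quorum.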
Firms:
In the operation of digital platforms, the technology giants have accumulated an abundance of user data and a high degree of discursive power within the communities they have constructed; hence, these enterprises should bear the primary responsibility for preventing the propagation of harmful content. First, firms should develop advanced machine-learning systems to identify and remove problematic material, and, for the image and video content that present technology struggles to monitor, expand their teams of human moderators (Cusumano et al., 2021). Digital platforms are also responsible for classifying restricted content, strictly controlling access for users under the required age, and identifying the speculative behaviour of fake accounts. Meanwhile, greater transparency, such as a one-person-one-account mechanism, would make users more cautious about their online behaviour and help eliminate bot accounts.
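To ground the machine-learning suggestion, the sketch below shows the kind of text classifier such moderation pipelines build on, using scikit-learn. The toy training data, the 0.9 auto-removal threshold and the human-review fallback are illustrative assumptions, not any platform’s real configuration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data (illustrative only); real systems learn from
# millions of human-labelled posts and far richer signals than text.
posts = [
    "I hope you have a great day",
    "thanks for sharing this resource",
    "go back to your country, you are not wanted",
    "people like you deserve to be hurt",
]
labels = [0, 0, 1, 1]  # 0 = benign, 1 = harmful

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

def triage(post: str, remove_threshold: float = 0.9) -> str:
    """Auto-remove only when the model is very confident; otherwise
    route borderline cases to a human moderator, as argued above."""
    p_harmful = model.predict_proba([post])[0][1]
    if p_harmful >= remove_threshold:
        return "remove"
    return "human review" if p_harmful >= 0.5 else "keep"

print(triage("you are not wanted here"))
```

The confidence-based routing reflects the division of labour the paragraph proposes: machines handle the clear-cut volume, while ambiguous and image- or video-based cases fall to human moderators.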
States and NGOs:
Amid the firewalls and filter bubbles constantly being constructed on the Internet, the splinternet has posed additional challenges for Internet administration, as the controversy between France and the United States over online intellectual property rules and freedom of expression illustrates. It is therefore imperative that NGOs promote coordination across jurisdictions, while states use legislation to govern content production and the responsibilities of content platforms, restraining digital platforms that give priority to corporate over social interests. Accordingly, Section 230 in the United States should be updated to require Internet companies to assume legal responsibility for their community operations, and the preconditions of free speech should be emphasised so that protecting individuals’ commentary does not compromise the wellbeing of others (Smith & Alstyne, 2021).
Users:
In a centralised governance structure, platform owners have sole authority over governance, determining both the process and its outcomes; this can disadvantage and alienate platform participants, as owners may prioritise their own interests over those of their stakeholders (Chen et al., 2021). In the Web 3.0 process, users gain a sense of ownership over their content and of belonging to the communities they have contributed to, which spurs their expectation of a better platform environment. Consequently, digital users can serve as “supervisors” of the online community and contribute to building a healthier internet environment. However, user governance is currently underdeveloped: many individuals remain lurkers when the discrimination is not aimed at themselves, and even when problematic content is reported, platforms respond far less often than anticipated (only 27% of reported revenge porn is removed). The administration of digital platforms therefore remains a work in progress, and users’ digital literacy needs to be enhanced.
Conclusion:
Digital platforms do generate enormous economic value for society, but algorithms’ discriminatory stereotypes and the ubiquity of porn and hate speech highlight the absence of governance in the platforms’ development. Given the vast volume of content and the complicated sections of digital platforms, governance will rely on multi-stakeholder cooperation: firms will use technology to their advantage; states and NGOs will use legislation and their grassroots reach to strengthen public opposition to problematic content; and users will take on an owner’s stake in the platform and be courageous enough to extend a helping hand to those being discriminated against. With a mature governance model within reach, freedom of speech need no longer be mistaken for a licence for harassment and hate speech.
Reference List:
Abbott, K. W., & Snidal, D. (2009). Strengthening international regulation through transnational new governance: Overcoming the orchestration deficit. Vanderbilt Journal of Transnational Law, 42, 501.
Bishop, S. (2018). Anxiety, panic and self-optimization: Inequalities and the YouTube algorithm. Convergence, 24(1), 69–84. https://doi.org/10.1177/1354856517736978
Chen, Y., Richter, J. I., & Patel, P. C. (2021). Decentralized governance of digital platforms. Journal of Management, 47(5), 1305–1337.
Cusumano, M. A., Gawer, A., & Yoffie, D. B. (2021). Can self-regulation save digital platforms? Industrial and Corporate Change, 30(5), 1259–1285.
Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press. https://doi.org/10.12987/9780300235029
Gjorgievska, L. (2022, April 15). 20+ Cyberbullying Statistics in Australia. Take a Tumble. Retrieved October 9, 2022, from https://takeatumble.com.au/insights/lifestyle/cyberbullying-statistics/
Hedges, C. (2015, February 19). ‘Pornography Is What the End of the World Looks Like’. Canadian Dimension. Retrieved October 9, 2022, from https://canadiandimension.com/articles/view/pornography-is-what-the-end-of-the-world-looks-like
Laub, Z. (2019, June 7). Hate Speech on Social Media: Global Comparisons. Council on Foreign Relations. Retrieved October 9, 2022, from https://www.cfr.org/backgrounder/hate-speech-social-media-global-comparisons#chapter-title-0-3
Meakins, T. (2022, June 21). Disturbing revenge porn trend soars among young Aussies. Yahoo News. Retrieved October 9, 2022, from https://au.news.yahoo.com/disturbing-online-trend-young-aussies-082850674.html
Mesquita, E., & Hall, A. (2022, May 4). Platforms Need to Work with Their Users – Not Against Them. Harvard Business Review. Retrieved October 9, 2022, from https://hbr.org/2022/05/platforms-need-to-work-with-their-users-not-against-them
Smith, M., & Alstyne, M. (2021, August 12). It’s Time to Update Section 230. Harvard Business Review. Retrieved October 9, 2022, from https://hbr.org/2021/08/its-time-to-update-section-230