Cyberharm and Content Audit in the Digital Age: Weighing Freedom of Expression, National Sovereignty and Global Standards

Abstract

In the digital age, cyberspace offers both opportunities and challenges, and the proliferation of cyberharms and harmful content has become a major source of concern. Bullying, harassment, violent material, bigotry, misinformation, and other online dangers pose a serious threat to users’ mental health as well as to social and cultural ecosystems. Content auditing has therefore attracted considerable attention as a crucial tool for preserving a safe and respectful environment on digital platforms. However, content moderation raises questions about the balance to be struck between censorship and freedom of expression, as well as tensions between national laws and values and international social media platforms. This essay examines the current state of online harm on digital platforms, the allocation of responsibility, the significance of content audits, and strategies for balancing free speech, national sovereignty, and international standards.

Cyberharm and Mental Health

Cyberbullying, harassment, and violent content cause serious mental health problems. Victims of online bullying and harassment may suffer from depression, anxiety, and low self-esteem. These problems not only interfere with victims’ daily lives but can also have far more severe consequences. According to a 2021 survey by the Cyberbullying Research Centre, up to 15% of 12- to 18-year-olds have encountered cyberbullying at some point in their lives. More alarmingly, over 25% of 13- to 15-year-olds reported experiencing cyberbullying in 2021 alone (Bottaro, 2022). This indicates that cyberbullying has become a significant social issue that primarily affects young people.

Data from the Cyberbullying Research Centre show that youth of all genders, sexual orientations, and races can become victims of cyberbullying, with LGBTQ youth making up a disproportionately large share of victims (31.7%). The severity of cyberharm is further underlined by the risk that these mental health problems can lead to self-harm and suicidal behaviour (Bottaro, 2022).

The Importance of Content Audits

Digital platforms use content audits as a critical tool for maintaining a safe and respectful environment. According to the Pew Research Center’s analysis, 70% of individuals believe that the proliferation of false information online constitutes a serious threat to their country. This sense of threat is consistent with the widely held view that the Internet and social media have made it easier to spread false information and rumours; about half or more of respondents in each country surveyed agreed with this statement. In places such as the Netherlands, Australia, and the UK, nearly 90% of individuals believe that people have become more susceptible to manipulation (Greenwood, 2022). This indicates that the spread of misinformation has grown into a global issue that threatens both social stability and national security.

Combating cyberharm therefore requires governments and social media companies to work together. For instance, during the 2011 riots in the United Kingdom, a large volume of violent and inflammatory messages circulated on social media, encouraging bystanders to join the rioters and sparking further violence. The British authorities warned Facebook, Twitter, and Research In Motion (RIM), the maker of the BlackBerry, urging them to take greater responsibility for the material posted on their networks. The unrest fuelled by inflammatory online content was ultimately brought under control through cooperation between the government and social media companies (Halliday and Garside, 2011).

Trade-offs Between Freedom of Expression and Content Censorship

Content vetting has sparked controversy over the balance between censorship and freedom of expression. While content vetting is crucial to halting the spread of violent and destructive content online, there are concerns that overzealous vetting may result in information censorship and speech restrictions that violate people’s right to free expression. Freedom of expression is recognised as a fundamental right that safeguards individuals’ ability to express themselves, encourages pluralism, and advances society. The governance of online content involves collaboration between governmental and non-governmental organisations, with social platforms acting as online regulators (Hanming and Bing, 2021). Twitter’s decision to permanently suspend Trump’s account is a case in point (Clayton, 2021): the move sparked controversy about freedom of expression and the power of social media companies, and the lack of transparency around the decision raised questions, as the platform did not provide detailed criteria explaining why the action was taken. A balance between content control and freedom of expression must therefore be struck, and decisions about content censorship must be fair and transparent (Barendt, 2005).

International Challenges and Conflicts with National Sovereignty

Resolving the conflict between international social media platforms and national regulations can be difficult and complicated. Companies such as Facebook and Twitter operate globally, yet the laws governing content restriction and freedom of speech vary substantially between regions. As a result, social media firms’ efforts to comply with the legal requirements of one country may collide with the laws and values of another. Online defamation illustrates this: the rules applied in the UK differ from those of the European Court of Justice (ECJ). In the eDate Advertising case, for instance, the ECJ recognised how difficult it was for victims who suffered reputational harm abroad to recover their losses, because they had to file multiple claims in different jurisdictions, adding complexity and expense (Mills, 2015). Likewise, even though a communication may be protected under US First Amendment free speech standards, a US publisher that uploads content to a US website may find that a UK injunction orders it to remove that content. Laws around the world have thus become intricately intertwined (Mills, 2015).

Solutions and Future Prospects

There is a tension between content auditing and national sovereignty, but some argue that it can be overcome through multiparty cooperation and trade-offs, in three main ways. The first is international cooperation: international communities and multinational social media firms should collaborate to create a global set of content review criteria to address issues such as cyberterrorism and misinformation. Such cooperation would reduce conflict and confusion between nations while helping to preserve stability in global cyberspace. The second is respect for national laws and values: global norms are important, but content review decisions should take into account a country’s cultural, legal, and political environment so that national sovereignty is respected. The third is openness and transparency: social media firms can increase transparency and address the content review paradox by setting out their review criteria and rationale in detail, which reduces mistrust and helps people understand why particular content has been blocked or removed. The conflict between content vetting and national sovereignty is real, but it is not insurmountable; its impact can be lessened, while preserving the stability and security of global cyberspace, through international cooperation, respect for national laws and values, and greater transparency.

In short, the tension between content screening and national sovereignty can be managed through multiparty collaboration and trade-offs: globally coordinated vetting criteria for issues such as cyberterrorism and misinformation, content review decisions that respect each country’s cultural, legal, and political background, and clearly stated standards and justifications for vetting decisions that reduce mistrust and make it easier for users to understand why content has been banned or restricted. Transparency, fairness, and international collaboration must all be prioritised in addressing this problem and preparing for future cyberharm. By upholding freedom of expression while taking the appropriate precautions, we can jointly create a safer and more prosperous digital future.

References

Barendt, E. (2005). Freedom of Speech (2nd ed.). Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199225811.001.0001

Clayton, J. (2021, January 9). Twitter “permanently suspends” Trump’s account. BBC News. https://www.bbc.com/news/world-us-canada-55597840

Greenwood, S. (2022, December 6). Views of social media and its impacts on society. Pew Research Center’s Global Attitudes Project. https://www.pewresearch.org/global/2022/12/06/views-of-social-media-and-its-impacts-on-society-in-advanced-economies-2022/

Hu, J. (2010). Decree of the President of the People’s Republic of China: Law of the People’s Republic of China on the Laws Applicable to Foreign-Related Civil Relations. http://conflictoflaws.net/News/2011/01/PIL-China.pdf

Halliday, J., & Garside, J. (2011, August 11). Rioting leads to Cameron call for social media clampdown. The Guardian. https://www.theguardian.com/uk/2011/aug/11/cameron-call-social-media-clampdown

Mills, A. (2015). The law applicable to cross-border defamation on social media: whose law governs free speech in “Facebookistan”?. Journal of Media Law, 7(1), 1–35. https://doi.org/10.1080/17577632.2015.1055942

Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press. https://doi.org/10.12987/9780300235029