The Code To Curb Hate Speech

Social platforms in Australia

“Countering illegal hate speech online”. Image: Copyright European Commission 2020

Introduction

This article argues that Australia should legislate to oblige social platforms to remove hate speech and other illegal speech. As Germany’s network enforcement law and the European Commission’s 2016 Code of Conduct on Countering Illegal Hate Speech Online demonstrate, platforms and social media companies such as Facebook, Twitter, and YouTube can be required to take responsibility for combating the spread of illegal online hate speech. Hate speech harms not only the groups it targets but society as a whole. Social media platforms can also be regarded as publishers of online content, and they should be responsible for what appears on their platforms. In Australia, online hate speech on social platforms has likewise had damaging social effects and has even incited crime and violence. It is therefore necessary to legislate against hate speech and illegal speech, requiring social platforms to fulfil their obligations by monitoring platform content and removing illegal and hateful material in a timely manner.

 

“Hate speech is not free speech”. Image: Sadia Naeem

Definition and historical background of the obligation of social platforms to regulate speech

The obligation of social platforms to regulate speech means requiring platforms, through national legislation and other measures, to remove online hate speech and illegal speech (such as racist, sexist, or violent content). As the influence of social platforms on people’s lives has grown, society has paid increasing attention to the harm caused by the hate speech they spread. In recent years, Germany has passed a speech law (Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken) that punishes social platforms failing to delete illegal or hateful content in a timely manner (Flew, Martin, & Suzor, 2019). In 2016, the European Union agreed a Code of Conduct on Countering Illegal Hate Speech Online with online platforms and technology companies including Facebook, Microsoft, Twitter, and YouTube. More and more countries have recognised the negative impact of illegal online hate speech and, working with social media platforms, have introduced policies to contain it — a sign that restricting illegal hate speech is a growing trend in the media world.

The title of the first show in SBS’s “Face Up to Racism” week, Image: DIANE FIELDES

Hate speech in Australia

Europe’s 2016 Code of Conduct on Countering Illegal Hate Speech Online notes that the spread of hate speech not only discriminates against and attacks the people it targets but also harms the rest of society. The Internet now pervades almost every part of our lives, and media platforms have the ability to control the information available to their audiences (Bucher, 2012); they should therefore be responsible for the content we receive. The Australian Internet regulator eSafety reported in Online Hate Speech (2019) that about 14% of adults in Australia believed they had been targeted by online hate speech, and 58% of those affected said it had a negative impact on them. People identifying as LGBTQI or as Aboriginal or Torres Strait Islander experienced such attacks at twice the national average. This shows that a very large number of Internet users encounter hate speech every day, with very damaging social effects. Yet compared with the rapid response of some media platforms to content that may raise copyright disputes, their response to hate speech is relatively slow. A large part of the reason platforms are unwilling to monitor hateful content is that they believe it will not affect their finances (House of Commons, cited in Flew, Martin, & Suzor, 2019). If social platforms turn a blind eye to hate speech, they are likely to become hotbeds of terrorism, racism, and sexism, given how rapidly online content spreads. Social platforms should not enjoy the profits generated by user activity while taking no responsibility for the illegal hate speech on their platforms.

Experience of online hate speech in the 12 months to August 2019. By eSafety

 

 

The powerful influence of social platforms

Flew, Martin, & Suzor (2019) point out that digital media and communication platforms often describe themselves as communication intermediaries rather than media companies, positioning themselves as mere distributors of content; in fact, their influence is so great that we cannot assume they are not publishers of that content. Moreover, platforms use algorithms to determine the priority of the content displayed to users, selectively ignoring some content and emphasising other content (Gillespie, 2014), so social platforms can also be regarded as pushers of online content. Kolbert (2017) observes that technological change has given a handful of Internet companies the power to control the flow of information: Google controls nearly 90% of search advertising, and Facebook controls nearly 80% of mobile social traffic. Some social media platforms invoke freedom of speech to evade the responsibility that traditional media accept for editing and reviewing content. As publishers or promoters of the content on their platforms, they are obliged to review what they deliver and to eliminate hate speech that may cause harm. Media platforms that have the power to control the content people receive also have the obligation to restrict the spread of illegal hate speech.

 

Implications for ordinary internet users

Eliminating illegal and hateful speech online would undoubtedly benefit ordinary Internet users. In 2019, an Australian-born white supremacist carried out a massacre at a New Zealand mosque, killing at least 49 people; the killer had posted extensive hate speech on Facebook and filmed the assault (Hunt, Rawlinson, & Wahlquist, 2019). Had social media platforms been alert when the murderer first published hate speech, the tragedy might have been averted. By supervising and eliminating illegal hate speech, social platforms can not only reduce its influence and harm but also help maintain social stability.

Conclusion

In summary, hate speech poses a serious threat to the entire society. It attacks the people it targets, incites negative emotions, and brings unrest to society. Australia should learn from Germany and the European Union and require social platforms to supervise the content posted on them, deleting illegal speech and hate speech in a timely manner. Social media companies should consider the wide-ranging social impact of their products; since they have the power to control the content users receive, they should also fulfil the obligation to monitor online content. They should work with the government to determine which content constitutes hate speech and which content is illegal. Only by combating online hate speech while safeguarding freedom of speech can our online world become healthier. Reasonable supervision of platform content can protect more people from hate speech and help eliminate racism, gender discrimination, and violence in the online world.

 

 

Reference list:

European Commission. (2016). Code of Conduct on Countering Illegal Hate Speech Online 

 

The German government. (2017). Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken.

 

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50.

 

Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. https://doi.org/10.1177/1461444812440159

 

Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies: Essays on Communication, Materiality, and Society (pp. 167–193). Cambridge, Massachusetts: The MIT Press.


 

eSafety. (2019). Online hate speech. Retrieved 28 October 2020, from https://www.esafety.gov.au/sites/default/files/2020-01/Hate%20speech-Report.pdf

 

Kolbert, E. (2017, August 28). Who owns the Internet? The New Yorker. Retrieved 28 October 2020, from https://www.newyorker.com/magazine/2017/08/28/who-owns-the-internet

 

Hunt, E., Rawlinson, K., & Wahlquist, C. (2019, March 16). ‘Darkest day’: How the press reacted to the Christchurch shootings. The Guardian. Retrieved 28 October 2020, from https://www.theguardian.com/world/2019/mar/16/darkest-day-how-the-press-reacted-to-the-christchurch-shootings

 

About the author: Shang Wang, Bachelor of Arts (Digital Cultures and Marketing)