This short hypertextual essay discusses why an obligation for social platforms to remove hate speech and illegal speech should be adopted in the Australian context, despite the possible drawbacks it may have.
Before discussing whether Australia should impose an obligation on social platforms to remove hate speech and illegal speech, we first need to understand what hate speech and illegal speech are, and the historical basis of such an obligation.
Hate speech and illegal speech
According to Watanabe, Bouazizi and Ohtsuki (2018), hate speech is a specific form of offensive language used by people on the basis of stereotypes or racist or extremist backgrounds. In other words, hate speech conveys a malicious message to someone because they belong to a specific group. Illegal speech is a broader term that covers all exceptions to free speech.
The historical basis of obligation
Legislation against hate speech has a long history in Australia. The Racial Discrimination Act 1975 states that it is “unlawful for a person to do an act, otherwise than in private, if the act is reasonably likely, in all the circumstances, to offend, insult, humiliate or intimidate another person or a group of people”. However, unlike Germany and the European Union, federal law in Australia does not clearly state digital platforms’ responsibility to remove hate speech and illegal speech. Germany began regulating online hate speech on 1 January 2018 (Rohleder, 2018, p. 34). Similarly, the European Commission, together with major digital platforms, agreed on a “Code of Conduct on Countering Illegal Hate Speech Online” in May 2016. Australia lags behind in protecting its citizens from hate speech and illegal speech on the internet. This essay explains why Australia needs to legislate to require digital platforms to remove hate speech and illegal speech.
Reasons for Australia to apply the obligation
Statistics show that the incidence of racism and problematic behaviour on digital platforms is increasing in Australia (All Together Now, 2020). The COVID-19 Coronavirus Racism Incident Report survey (2020) found that around 10% of COVID-19-related racist incidents happened online. The Australia Institute (2019) estimates, based on its research, that about 8.8 million Australians have experienced harassment or hate speech online. An obligation for social platforms to remove hate speech and illegal speech should be adopted to protect Australians’ mental health, provide a remedy to victims of hate speech and illegal speech, unify regulations online and offline, and prevent possible hate crimes and cyberterrorism.
Protect Australians’ mental health
Of the estimated 8.8 million Australians who have experienced harassment or hate speech online, The Australia Institute (2019) suggests that 744,000 may have had to see a health professional afterwards. The eSafety Commissioner (eSafety) (2019) reported that 58% of participants who experienced hate speech online said it had a negative impact on their lives: 37% reported mental or emotional stress, 14% relationship problems, and 10% damage to their reputation. By The Australia Institute’s low estimate, which includes only medical costs and reported income losses, online harassment and cyberhate have cost Australians a total of $330 million. Australia must therefore enact legislation requiring digital platforms to take responsibility for removing hate speech and illegal speech, to protect Australians from possible mental disorders and economic costs.
[Figures: Economic costs of online harassment and cyberhate, 2019, pp. 17, 20 and 22.]
Provide a remedy to victims
Gelber and McNamara (2015) claimed that hate speech laws can provide a remedy to victims. Such laws give people who have suffered hate speech and illegal speech a chance to take their case to court and help bring online haters to justice. It is crucial for the government to declare that hate speech violates the law: first, it assures disadvantaged groups that they are protected from discrimination; second, precedents can serve as direct evidence to educate Australian society not to express prejudice against disadvantaged communities on social platforms. Past judgments finding online hate speakers guilty may dissuade potential extremists from acting online, and this cannot happen without hate speech laws on the statute books. Australia should adopt legislation obligating social media to remove hate speech and illegal speech, to improve social justice and deter potential hate speakers.
Unify regulations online and offline
As mentioned in the previous paragraph, Australia legislated against hate speech back in 1975. Hate speech is prohibited in Australia, and there should be no exception for online hate speakers; they deserve the same punishment as people who express hate speech in the physical world. As Rorive (2009) argues, “what is illegal offline is illegal online”. With the development of technology and the prosperity of digital platforms, the online community is now arguably as important as the offline community; media invisibly shape our lives and our reality, and we are living a media life (Deuze, 2011). Cyberbullying, catfishing, fake news, online harassment, and other kinds of hate speech and illegal speech online have the same negative impacts on people as similar activities offline. Social platforms even encourage users to participate in immoral activities, since there are few consequences (Massanari, 2017). Australia should unify its regulation of hate speech and illegal speech online and offline; otherwise, hate speakers and terrorists will be encouraged to move online to escape punishment under the law.
Prevent possible hate crimes and cyberterrorism
All Together Now (2020) observed that right-wing extremists in Australia increasingly use social platforms to recruit young people and to promote anti-democratic, xenophobic, racist, misogynistic, homophobic and transphobic activities. Social networks have also played an important role in organizing hate crimes and cyberterrorism.
[Figure: “Role of online social networks for destructive activities”, from Chetty and Alathur, 2018.]
Online social networks give terrorist organizations a platform to post illegal or harmful content, propagate criminal activities to online users, communicate with other members of the organization, collect information, organize terror events, and attack the internet through information dissemination (Chetty & Alathur, 2018). For example, Twitter was maliciously used to spread fake news during the 2012 Hurricane Sandy disaster to cause public panic (Chetty & Alathur, 2018). Removing hate speech and illegal speech can nip extremism, hate crime and cyberterrorism in the bud: it can stop extremists and terrorists from recruiting members, propagating criminal activities, and causing panic online.
[Image: one of the fake news tweets @ComfortablySmug posted when Sandy pummeled New York.]
Some people argue that applying such an obligation will damage freedom of speech. However, hate speech and illegal speech are not protected as free speech, and free speech cannot justify online haters inciting racial discrimination, hatred, and violence (Berman, 2015). Nevertheless, there is a legitimate concern that the obligation may privatize law enforcement by asking social platforms, rather than judges, to decide what hate speech and illegal speech to remove (Rohleder, 2018). The Australian government should therefore supervise and review these judgments when the obligation is put into practice.
In conclusion, Australia should introduce legislation obligating social platforms to remove hate speech and illegal speech, to improve social justice and protect human rights in Australia. On the bright side, applying such legislation can protect Australians’ mental health and prevent potential economic costs, provide a remedy to victims of hate speech and illegal speech, unify regulations online and offline so that extremists cannot move online to escape the punishment they deserve, and prevent possible hate crimes and cyberterrorism. Facebook recently took action to ban Holocaust denial and distortion (Oboler, 2020). Germany and other European countries legislated years ago to improve law enforcement on the internet, prevent the circulation of illegal hate speech, and fight hate crime more effectively (Rohleder, 2018). Australia already lags behind European countries and should follow this trend by legislating against online hate speech and illegal speech now. The Australian public is seeking legislative change to protect them from hate speech and illegal speech online: a survey shows that a majority of Australians (71%) agree there should be a new law to regulate the growth of online hate speech and illegal speech, and even more (78%) believe social media companies should take responsibility and do more (eSafety Commissioner, 2019). The federal government should therefore adopt new laws against online hate speech and illegal speech, and supervise social media platforms in removing such content.
Watanabe, H., Bouazizi, M., & Ohtsuki, T. (2018). Hate Speech on Twitter: A Pragmatic Approach to Collect Hateful and Offensive Expressions and Perform Hate Speech Detection. IEEE Access, 6, 13825–13835. https://doi.org/10.1109/ACCESS.2018.2806394
Chetty, N., & Alathur, S. (2018). Hate speech review in the context of online social networks. Aggression and Violent Behavior, 40, 108–118. https://doi.org/10.1016/j.avb.2018.05.003
Racial Discrimination Act 1975 s. 18C (Austl.). Retrieved from http://www6.austlii.edu.au/cgi-bin/viewdoc/au/legis/cth/consol_act/rda1975202/s18c.html
Rohleder, B. (2018). Germany Set Out To Delete Hate Speech Online. Instead, It Made Things Worse. New Perspectives Quarterly, 35(2), 34–36. https://doi.org/10.1111/npqu.12140
Gelber, K., & McNamara, L. (2015). The Effects of Civil Hate Speech Laws: Lessons from Australia. Law & Society Review, 49(3), 631–664. https://doi.org/10.1111/lasr.12152
Rorive, I. (2009). What can be done against cyber hate? Freedom of speech versus hate speech in the Council of Europe. (Symposium: Comparative Law of Hate Speech). Cardozo Journal of International & Comparative Law, 17(3).
Deuze, M. (2011). Media Life. Media, Culture and Society, 33(1), 137–148. doi:10.1177/0163443710386518
Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.
Berman, A. (2015). Human rights law and racial hate speech regulation in Australia: reform and replace? Georgia Journal of International and Comparative Law, 44(1).
The EU Code of conduct on countering illegal hate speech online (2019). European Commission. Retrieved from https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en
Cyberbullying. (n.d.). eSafety Commissioner. Retrieved October 28, 2020, from https://www.healthdirect.gov.au/cyberbullying
The eSafety Commissioner (eSafety) in Australia. (2019). Online hate speech report. Commissioned by the Australian Government.
All Together Now. (2020). Right-Wing Extremism and COVID-19 in Australia.
The Asian Australian Alliance & Chiu, O. (2020). COVID-19 Coronavirus Racism Incident Report: Reporting racism against Asians in Australia arising due to the COVID-19 coronavirus pandemic.
The Australia Institute. (2019). Trolls and polls – the economic costs of online harassment and cyberhate. Commissioned by independent journalist and researcher Ginger Gorman.
Oboler, A. (2020, October 23). Tackling Holocaust denial and distortion will be a challenge for Facebook. Retrieved October 28, 2020, from https://ohpi.org.au/tackling-holocaust-denial-and-distortion-will-be-a-challenge-for-facebook/
Catfishing. (n.d.). eSafety Commissioner. Retrieved October 28, 2020, from https://www.esafety.gov.au/young-people/catfishing
Fake news. (n.d.). eSafety Commissioner. Retrieved October 28, 2020, from https://www.esafety.gov.au/young-people/fake-news
Gross, D. (2012, October 31). Man faces fallout for spreading false Sandy reports on Twitter. Retrieved October 28, 2020, from https://edition.cnn.com/2012/10/31/tech/social-media/sandy-twitter-hoax/index.html