This web essay argues that Australia must implement an obligation to remove hate speech and illegal speech online, given the immediacy of online platforms and the permanence of their messages. Hate speech will be outlined within a global context, with reference to German law and Europe’s 2016 Code of Conduct on Countering Illegal Hate Speech Online. Through a case study of online hate directed at Aboriginal and Torres Strait Islander people, it will become evident that hate speech regulations similar to those of Germany and the EU would benefit Australian society.
Hate Speech Defined
A definition of hate speech is often contested. Broadly, hate speech refers to discourse that attacks individuals or particular groups on the basis of identity, with the intent to disrespect or hurt (Chetty & Alathur, 2018), but its specific meaning varies across jurisdictions: each country, and even states within countries, maintains its own distinctive definition. For the purpose of this essay, I will define hate speech in alignment with the Australian Anti-Discrimination Act 1998, which prohibits “any conduct which offends, humiliates, intimidates, insults or ridicules another person” on the basis of race, disability, sexual orientation or religious belief.
When Hate Speech Meets Social Platforms
The nature of communication has transformed significantly since the foundation of the Internet in 1983, and today’s public narrative is shaped by the rapid growth of social platforms. Facebook, Twitter, YouTube and Instagram are now part of everyday life, changing orthodox notions of ‘community’ by establishing new forums for interpersonal communication (Holschuh, 2014). Advancements in technology allow people to connect with others on opposite sides of the globe in a matter of seconds; letters, telegrams, phone calls and television all seem like relics of the past.
Whilst this new mode of communication has benefited society enormously, the links between social media and hate speech are undeniable. The technological innovations of the Internet have established platforms for radicals and hatemongers to proliferate their rhetoric (Banks, 2010) and new forums for the dispersal of hate, fear and intimidation (Holschuh, 2014). Anonymity, invisibility, community and the instantaneous nature of the online world are four features that Brown (2018) suggests make online hate speech more harmful than its offline counterpart.
This begs the question: given these advancements, why hasn’t Australia established online hate speech laws in alignment with its pre-existing offline hate speech laws?
German Law and Europe’s 2016 Code of Conduct
In 2017, Germany introduced strict laws regarding online hate speech. Under the Network Enforcement Act, Internet companies operating in Germany are obligated to remove hate speech within a 24-hour timeframe or face financial penalties running to millions of euros (Ross, 2018). This national legislation led Facebook Germany to employ over 1,000 moderators, who assess masses of flagged posts against twenty illegal categories of speech (Ross, 2018).
Similarly, the European Union (EU) established a Code of Conduct in 2016 to counteract the spread of illegal hate speech online. The Code was initially an agreement with Facebook, Twitter, YouTube and Microsoft; Instagram and Snapchat joined in 2018, and TikTok announced its participation in September 2020 (European Commission, n.d.).
The Removal of Hate Speech as a Necessity, Not a Choice
Many arguments against the removal of online hate speech stem from the fundamental right to freedom of expression. In the United States, hate speech laws do not exist because such speech is protected under the constitution’s First Amendment, which prevents the government from legislating against freedom of speech (Gelber & McNamara, 2015). However, it must be acknowledged that hate speech both tests the limits of free speech and takes advantage of it. Indisputably, freedom of expression has become a foundation for hate speech (Chetty & Alathur, 2018), and this must be assessed within Australia’s framework.
Within Australia, the Aboriginal and Torres Strait Islander community has long been subject to online hate. The immediacy and reach of social platforms have allowed the harmful dichotomy of ‘us’ and ‘them’ to be perpetuated under the guise of humour. In 2014, a Facebook page named ‘Aboriginal Memes’ appeared, and it took over two weeks for the page to be removed after Facebook initially maintained that it did not violate its Community Guidelines. The page propagated a vilifying view of Indigenous Australians through jokes and illustrations.
The Australian Competition and Consumer Commission (ACCC) released its ‘Digital Platforms Inquiry’ in June 2019, which outlined the dominance of social media platforms within the Australian media sphere. Each month, 17 million Australians access Facebook, giving it the largest social media presence in the country, and the company holds a global market capitalisation of US$517.6 billion (ACCC, 2019). It seems ethically dubious for Internet companies to earn gargantuan profits from technology that enables harmful and illegal hate speech. Facebook must be required to take responsibility by quickly removing and disallowing such content (Brown, 2018); an obligation of this kind would have seen the ‘Aboriginal Memes’ page taken down much earlier.
Social media platforms such as Facebook and Twitter transport information with immediacy. Whilst this speed has its benefits, the instantaneous nature of online communications encourages more spontaneous and unconsidered forms of cyberhate (Brown, 2018). The ability to ‘share’ pages and posts on Facebook also creates multiple threads of permanence on the platform. The Online Hate Prevention Institute (2014) addressed this in its briefing on the ‘Aboriginal Memes’ page, noting that even once the page had been removed, other Facebook pages had already re-posted some of its content, exposing it to larger audiences.
If Australia implemented laws similar to Germany’s Network Enforcement Act, this hate speech against Aboriginal people would not have endured for so long. Andre Oboler, CEO of the Online Hate Prevention Institute, has argued:
The longer content stays available, the more damage it can inflict on the victims and empower the perpetrators. If you remove the content at an early stage, you can limit the exposure. This is just like cleaning litter, it doesn’t stop people from littering but if you do not take care of the problem it just piles up and further exacerbates. (UNESCO, 2015)
Oboler’s examination of the longevity of hate speech echoes the rationale of the Network Enforcement Act, demonstrating that implementing similar laws would help curb online hate speech.
Online hate speech is also more abundant than offline hate speech because of anonymity: people feel less responsibility for their online interactions. Drew Boyd, Director of Operations at The Sentinel Project, has said that facelessness “is what makes online speech so unique, because people feel much more comfortable speaking hate as opposed to real life when they have to deal with the consequences” (UNESCO, 2015). The person running the ‘Aboriginal Memes’ page, for example, had no intention of revealing their identity; had they been comfortable with others seeing who they were, they could have posted the content from their personal profile. The ability to hide behind a fake persona thus amplifies the effects of online hate speech.
Limiting the harmful effects of hate speech and its endemic presence on social platforms can only be achieved if the Australian government implements appropriate legislation. Additionally, social media platforms operating in Australia must re-evaluate their Community Guidelines to align with current offline hate speech legislation. Ross (2018) offers an apt analogy, comparing the need for hate-speech-free social platforms to requiring car manufacturers to fit seatbelts for safety. I am hopeful that such measures will soon be implemented within the Australian framework.
ACCC. (2019). Digital Platforms Inquiry. Australian Competition and Consumer Commission.
Banks, J. (2010). Regulating hate speech online. International Review of Law, Computers & Technology, 24(3), 233-239. doi: 10.1080/13600869.2010.522323
Brown, A. (2018). What is so special about online (as compared to offline) hate speech? Ethnicities, 18(3), 297-326. doi: 10.1177/1468796817709846
Chetty, N., & Alathur, S. (2018). Hate speech review in the context of online social networks. Aggression and Violent Behavior, 40, 108-118. doi: 10.1016/j.avb.2018.05.003
European Commission (n.d.). The EU Code of conduct on countering illegal hate speech online. Retrieved from https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en
Gelber, K., & McNamara, L. (2015). The Effects of Civil Hate Speech Laws: Lessons from Australia. Law and Society Review, 49(3), 631-664. doi: 10.1111/lasr.12152
Holschuh, J. (2014). #Civilrightscybertorts: utilizing torts to combat hate speech in online social media. University of Cincinnati Law Review, 82(3), 953-977. Retrieved from https://library.sydney.edu.au/
Online Hate Prevention Institute. (2014). Briefing: Aboriginal Memes 2014. Retrieved from http://ohpi.org.au/briefing-aboriginal-memes-2014/
Ross, K. (2018). Hate speech, free speech: The challenges of the online world. Journal of Applied Youth Studies, 2(3), 76-81. Retrieved from https://library.sydney.edu.au/
UNESCO. (2015). Introduction. In Countering online hate speech (pp. 11-17). Paris, France: UNESCO.