The distribution of pornographic and violent content has proven extremely harmful in Australia’s online media landscape, placing responsibility for the spread of this content on a variety of institutional bodies (Henry & Powell, 2015).
This essay will argue that online platforms such as Facebook and Instagram should be responsible for preventing the dissemination of this content, as they hold the power to govern online user behaviour. However, users bear a responsibility themselves, as their own values and behaviour shape the way they act online. Ultimately, platform regulators are responsible on one hand, but it is equally up to users’ interactions with the platform for this content to be stopped.
Increasingly, the interrelationship between digital platforms and violent content in today’s largely virtual world has had problematic effects on societies and individuals. Social media has become a tool that allows individuals to disseminate actions and beliefs to a wide audience at great speed.
An example of violent content that should have been removed from the internet within seconds, but was not, is the 2019 Christchurch mass shooting. In March 2019, a mass shooting at a Christchurch mosque was livestreamed to Facebook by the shooter, highlighting the viral dissemination of extremely violent content.
1News reports moments after the shooter was arrested outside the Linwood Islamic Centre.
The shooter, who acted alone, was able to weaponise digital platforms, and Facebook in particular, to project a sense of power. In response, Facebook reacted too slowly and did not have effective measures in place to prevent the mass sharing of this violent content. The livestream function on digital platforms is a relatively new application, and some researchers believe that livestreaming promotes violence, as it is a fast and widespread approach to capturing an audience (Every-Palmer, 2020). Moderators of these platforms tried to delete the original content; however, the video was duplicated and moved between sites such as YouTube and Reddit. ‘Facebook said it removed 1.5 million copies of the video’ (Every-Palmer, 2020).
Screenshot of Facebook Newsroom commenting on the issue and its actions.
There is no question that the video should have been stopped and taken down within minutes of the livestream starting. For many, the speed at which content can be shared and circulated across platforms, and the number of people it can impact, is a growing concern. Societies react to issues in varying ways, and in this episode of extreme violence, viewers found the footage confronting to watch.
Comments from loved ones, showing how the attack has impacted them.
Ultimately, the shooter’s motive was for the livestream to go viral and reach a large population, and this motive was achieved through the ability of content on social media platforms to spread almost instantly. A news article published by PBS draws on stories from victims of the attack and how it has affected not only them but their families and the wider New Zealand community.
Charlie Warzel of The New York Times expressed to PBS the premeditated nature of the crime, as the shooter intentionally set up multiple Twitter accounts prior to the shooting to show pictures of the weapons used (Press, 2019). Hence, not only was the attack livestreamed, but there was also a trail for people to follow the events, designed to provoke and engage the media. Moreover, Joan Donovan, Director of Social Change at Harvard University, commented on how the media interacts with white supremacy. Donovan states that a “journalist’s responsibility has been heightened in this environment by the fact the shooter was able to control the narrative by distributing this content online, ultimately shifting the responsibility to platform companies” (Press, 2019).
The gunman of the mosque shooting was ultimately able to exploit social media in an attempt to gain infamy and alter people’s views, raising societal issues, promoting a sense of collective responsibility for change, and highlighting that harder digital regulation needs to occur. In reality, perhaps the question is whether governments have the ability to intervene effectively. Certainly, governments have attempted to limit the consumption of these problematic materials through compulsory settings and privacy measures on platforms. However, this is limited by the liberalised nature of the internet, which allows for anonymity (Boyd, 2019). Thus, Facebook and other social media platforms need to strengthen users’ control over the settings that determine the extent of privacy on their accounts.
In contrast, censorship in an internet context is regulated by algorithms rather than on a case-by-case basis. While this allows for mass regulation of content, there are many examples where the algorithm fails, leading to evident inconsistencies and prejudice.
Freedom House discusses the global drive of algorithms on platforms.
Across social media there is a movement of women and non-binary people attempting to fight the prejudice that allegedly arises from automated censorship (Poulsen, 2021). A prime example of this is pregnant mothers posting photos of their ‘bump’ in their everyday clothes or swimwear and having the photo taken down by platforms almost instantly. This is particularly evident on Instagram and TikTok.
Many high-profile influencers have taken to mainstream media to allege that Instagram’s community guidelines are fostering an “epidemic of misogynist abuse” (Paul, 2022). While Instagram has over time developed its community guidelines to reject “gender-based hate or any threat of sexual violence”, influencers and social media users are reporting inconsistencies in policies that allow for the protection of online abusers. The Guardian reported that “there is an epidemic of misogynist abuse taking place in women’s DMs” and that “Meta and Instagram must put the rights of women before profit” (Paul, 2022).
Furthermore, an article published by the University of London investigated ‘How Instagram’s algorithm is censoring women and vulnerable users but helping online abusers’. The article states, “Gendered policing is a double-edged phenomenon. In addition to harassment, women’s bodies, nudity, sex and sexuality also bear the brunt of social media’s algorithmic censorship, replicating the male gaze online” (Are, 2020). This example raises questions about what defines ‘porn’ and ‘nudity’, especially on social media platforms, and contributes to a movement attempting to counteract ‘online systemic misogyny’.
Amnesty International dives deep into the impact of online misogyny, hearing from marginalised women.
In contrast to the experiences of women and minority groups with censorship reported by the aforementioned sources, nudity and pornographic images seem to have an entirely different definition and standard for the cis men of the internet. In August 2022, Tommy Lee, a relatively well-known drummer from the heavy-metal band Mötley Crüe, uploaded an explicit ‘selfie’ of his genitals to his Instagram account, which at the time had a following of 1.5 million people. The post remained on the platform for in excess of three hours while inevitably being distributed to other popular platforms including Twitter, Facebook and TikTok (Burns, 2022).
Mamamia published an article in response to Tommy Lee’s post stating that, “despite Instagram’s strict no-nudity policy and tendency to immediately remove photos of women”, the photo was allowed to stay up on the social media platform for hours. Overall, this is proof of a double standard and a blurred definition of what constitutes porn and nudity on online platforms, and proof that even social media platforms themselves cannot regulate the distribution of pornographic images.
‘The Spill’ podcast discusses the double standard of porn on Instagram in relation to the Tommy Lee scandal.
The solution to stopping the spread of harmful media is thus proven to be complex, and there are certainly many avenues that organisations including schools, companies and governments have trialled and failed with, or through which they have achieved only mildly improved results.
Presently, a movement arising from the current socio-political climate has pushed the conversation about regulating the distribution of harmful media to the forefront. One alternative approach is being pioneered by leading contributors such as Chanel Contos, founder of Teach Us Consent.
The education that Contos has crafted targets young people in particular, and could work alongside more personalised moderation from platforms. This means moving away from the current algorithmic model and towards a more humanised collaboration with the online world, because social media will always attempt to recreate the socio-political climate of the real world.
In conclusion, it is evident that online platforms have a responsibility to prevent the circulation of problematic content, together with the users of those platforms. Through a combination of reformed educational programs implemented by schools, companies and government organisations, and a more humanised censorship system on platforms, the spread of problematic content can be minimised.
Are, C. (2020). How Instagram’s algorithm is censoring women and vulnerable users but helping online abusers. Feminist Media Studies, 741-744. https://doi.org/10.1080/14680777.2020.1783805
Boyd, D. (2019). Stop the Presses? Moving From Strategic Silence to Strategic Amplification in a Networked Media Ecosystem. Sage Journals, 231-244. https://doi.org/10.1177/0002764219878229
Burns, B. (2022, August 24). “I went on a motherf**king bender, bro.” Why everyone is talking about Tommy Lee… again. Mamamia.
Henry, N., & Powell, A. (2015). Embodied Harms: Gender, Shame, and Technology-Facilitated Sexual Violence. Violence Against Women, 21(6), 758-779. https://doi.org/10.1177/1077801215576581
Paul, K. (2022, April 6). High-profile women on Instagram face ‘epidemic of misogynist abuse’, study finds. The Guardian. https://www.theguardian.com/technology/2022/apr/05/high-profile-women-on-instagram-face-epidemic-of-misogynist-abuse-study-finds
Poulsen, A. (2021). Gendering Algorithms in Social Media. Digital Library, 24-31.
Press, A. (2019, March 17). Stories of the victims of the New Zealand mosque attack. PBS News Weekend. https://www.pbs.org/newshour/world/stories-of-the-victims-of-the-new-zealand-mosque-attack
Every-Palmer, S. (2020). The Christchurch mosque shooting, the media, and subsequent gun control reform in New Zealand: a descriptive analysis. Psychiatry, Psychology and Law, 28(2), 274-282. https://doi.org/10.1080/13218719.2020.1770635