INTRODUCTION
In the age of information technology, the world faces a transformational opportunity known as "digitalization," and Internet-based digital platforms continue to grow and expand their reach. Digital platforms serve as venues where producers and consumers exchange information, goods, and services; successful examples include Facebook, Instagram, and TikTok. These platforms take on the responsibility of connecting two or more independent users in the form of communities. Although platform builders set out to create free spaces for information exchange and communication, an "Internet utopia," that very openness has allowed obscene, pornographic, violent, abusive, bullying, and hateful content to emerge on digital platforms (Gillespie, 2018).
As more and more users flock to digital platforms, uncomfortable content such as violence, bullying, and harassing messages is also increasing. Cyberbullying incidents are frequent on digital platforms: roughly one-fifth occur through social media, and most adolescents have experienced cyberbullying in some form. Closely related to cyberbullying is cyber harassment, the sending of rude, offensive, or threatening content to victims via the Internet or mobile phones, often targeting characteristics such as gender, race, and sexual orientation (Van Royen et al., 2017).
Online harassment on social platforms is becoming increasingly visible. As reported in the news, young women in Australia experience higher levels of online harassment than the global average, yet neither digital platforms nor the public take the problem seriously, which points to a lack of public and social awareness of online harassment. In this article, we discuss who should be responsible for the regulation of digital platforms and how a communication session on these issues could be organized.
Who should step in to stop it?
1. Platforms
Digital platforms have long assumed the role of providing users with open communication, and many websites, inspired by the freedom of the Internet, have emerged to host and expand user participation, expression, and communication. Social media platforms provide a bridge for users to communicate and interact on a broad scale, allowing more people to reach one another. However, this exposure can also have negative consequences.
As Gillespie (2018) notes, platforms must be moderated in some effective form: to protect users from one another, to remove illegal or harmful content, and to present their best face to new users and the public. Platforms should therefore maintain the health and harmony of their content. Consider Trump's social media ban: Facebook and Twitter blocked Trump's accounts in response to the false and inflammatory messages he posted, because those posts spread across the Internet, causing undue influence and public panic. By the same logic, platforms should restrict bullying, harassment, violence, hate, pornography, and other problematic content to maintain the platform environment.
2. Self-regulation by users
The public, as citizens, holds rights and obligations that are not merely a matter of personal choice (Schlesinger, 2020). The development of the public sphere has led society toward an Internet-dominated media system in which politics gradually transitions into a post-public sphere. Users, as citizens, should be civic-minded and committed to keeping the public sphere safe. Inappropriate information posted on digital platforms can cause distress to recipients and other users; therefore, its posting and dissemination should also be prevented through users' self-regulation.
3. Government or State
Information can be disseminated through many channels, including paper media and digital media. Unlike traditional paper-based media, whose distribution requires various editorial procedures, digital platforms allow information to be published and spread easily and quickly: everyone can be both a creator and a receiver of information, a process no longer gated by traditional industries. As the Australian Parliament's paper "Can the Internet be regulated?" notes, the Internet is global, encrypted, and difficult to control, and the state has a responsibility to protect the safety of its citizens; accordingly, the Australian government has established regulations for monitoring online content. The global nature of the Internet means that effective oversight requires cooperation between states, and government regulation should keep cyberspace open even as it polices it.
How should it be stopped?
Any online platform has the responsibility and obligation to stop the spread of problematic content. "The illusion of a truly 'open' platform is powerful and resonates with deep utopian notions of community and democracy" (Gillespie, 2018).
1. Platforms should limit the dissemination of inappropriate information in a timely manner and with sensitivity to context. For example, the Associated Press photographer's 1972 photo "The Terror of War" was flagged as underage nudity when it appeared in a Facebook post, and Facebook removed it without notifying the editor who shared it or the Norwegian Prime Minister who later reposted it. Because users in different countries and regions perceive the same content differently due to cultural differences, problematic content should be handled by limiting its distribution rather than by blanket removal.
2. Platforms have a responsibility to punish accounts that post harmful content such as bullying and harassment. Under New South Wales law, victims of cyberbullying should retain the evidence and seek help. While government departments deal with bullying through legal channels, platforms should also act against the accounts of bullying users, for example by banning them or suspending account services.
3. The government or relevant state departments should deal with harmful information on digital platforms. Content that endangers national security or insults state institutions may appear on digital platforms; in response to such incitement, the relevant departments should handle the corresponding content.
4. Users, as the main participants on the platform, should review and monitor the content they post. Platforms and regulators should also educate users about platform content rules and the dangers of spreading false information, strengthening the quality of the content users produce.
CONCLUSION
In conclusion, evolving digital platforms have a positive impact on society and on users' lives and entertainment. Through digital platforms, people communicate across long distances with people from different countries and regions, spreading and sharing interesting moments from their lives. At the same time, as the news has reported, many people have been bullied, harassed, and racially discriminated against on Internet platforms, and some have even harmed themselves as a result. According to the Australian government's review, although the censorship of online pornographic content for teenagers has not raised many red flags, there is still much information on digital platforms that is unsuitable for teenagers to access. Timely content moderation on digital platforms to stop the spread of problematic content is therefore urgently needed, and content moderation combined with punitive measures can protect the rights of every digital platform user.
References
Can the Internet be regulated? (n.d.). Parliament of Australia. Retrieved October 14, 2022, from https://www.aph.gov.au/about_parliament/parliamentary_departments/parliamentary_library/pubs/rp/rp9596/96rp35
Cyberbullying. (n.d.). NSW Police Public Site. Retrieved October 14, 2022, from https://www.police.nsw.gov.au/safety_and_prevention/crime_prevention/online_safety/online_safety_accordian/cyberbullying
Delkic, M. (2022, May 10). Trump’s banishment from Facebook and Twitter: A timeline. The New York Times. https://www.nytimes.com/2022/05/10/technology/trump-social-media-ban-timeline.html
Gillespie, T. (2018). All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1–23). Yale University Press.
Hermant, N., & Kent, L. (2020, October 4). Young Australian women cop more online harassment than global average, report finds. ABC News. https://www.abc.net.au/news/2020-10-05/young-australian-women-online-abuse-harassment-planinternational/12725286
Online platforms: The right solution for your organisation – OEB insights. (2021, October 14). https://oeb.global/oeb-insights/online-platforms-the-right-solution-for-your-organisation/
Schlesinger, P. (2020). After the post-public sphere. Media, Culture & Society, 42(7–8), 1545–1563. https://doi.org/10.1177/0163443720948003
Van Royen, K., Poels, K., Vandebosch, H., & Adam, P. (2017). “Thinking before posting?” Reducing cyber harassment on social networking sites through a reflective message. Computers in Human Behavior, 66, 345–352. https://doi.org/10.1016/j.chb.2016.09.040
Vogels, E. A. (2021, January 13). The state of online harassment. Pew Research Center: Internet, Science & Tech. https://www.pewresearch.org/internet/2021/01/13/the-state-of-online-harassment/