Who should control harmful content on media platforms, and how should it be regulated?

“Social Media Tools” by jrhode is licensed under CC BY-NC-SA 2.0.

The transformation of the internet has had a significant impact on society. Today, social media platforms give producers and their users more space for expression and communication, connecting people more quickly and effectively and providing a broader range of opportunities for interaction (Gillespie, 2018). This more equal mode of communication has changed the old fixed roles: the relationship between those who publish information and those who receive it has become more balanced, and the boundary between the two more ambiguous. Participants can engage with more content, and everyone has a voice when publishing. But as platforms have gained more users, and as people have become more willing to post opinions and share their lives online, harassment, violence, hatred, threats, pornography, bullying and other harmful content have grown day by day. When such negative content appears on digital platforms, it is important and necessary for the platforms or the government to exercise some degree of control.


Governance of the platform

Platform governance is being taken seriously, and Facebook has begun to introduce content governance measures to ensure the safety of its users. Davey (2020) reported that “in Australia, 65% of 1000 respondents had experienced online abuse and one in five feared for their personal safety”. Online violence against women is exceptionally high in more than two dozen countries, with incidents occurring most frequently on Facebook. Women who had been harassed or threatened suffered mental and emotional distress and constantly feared for their safety. In the survey, 59 percent of the harm was caused by abusive language, while body shaming and threats of sexual violence accounted for 39 percent. Even the platform’s origins reflect the problem: before Facebook, founder Mark Zuckerberg created Facemash, a site that let college students rank photos of women on campus. The platform is a double-edged sword.

On the one hand, too little moderation drives users away because they do not want to be immersed in a toxic environment; on the other hand, too much control also drives users away, and that balance is very difficult to strike. Many platforms are profit-centric and do not see themselves as media companies. They see themselves as merely hosting content posted by their users and do not want to take any legal responsibility for it.

Platforms must not only enable users to interact with each other but also protect some groups of users from others and remove offensive, despicable or illegal content. Although platforms hope that users will restrain themselves and publish appropriate content, the fact is that users are very hard to regulate. The platform must set normative rules, act as the decision maker after an incident, and bear responsibility for the outcome. In 2018, Mark Zuckerberg published “A Blueprint for Content Governance and Enforcement” (Gorwa, 2019) and proposed an independent oversight body to which users could refer content for review. International regulatory issues are becoming more and more common, and the traditional “command and control” approach to regulation now operates alongside platform self-regulation (Wagner, 2013). Whatever the platform, online services must protect their users’ safety on the network and create community standards that protect users’ human rights (Mudgway & Jones, 2022).


Government regulation

“Stop Bullying” by Indrid__Cold is licensed under CC BY-SA 2.0.

The federal government has begun to regulate online content on media platforms. Social media platforms provide new opportunities for users to communicate, but a small number of people create serious problems: online bullying, posting indecent content, threatening others, and so on. The federal government plans to introduce world-first legislation to deliver justice and prevent cyberbullying (Hickey & Nedim, 2020). The law would give safety officials the power to block websites in cases of cyberbullying or where users post terrorist and other harmful material, thereby reducing the impact on other users. At the same time, websites that strengthen their own monitoring can better protect internet users and inhibit abusive online language and behavior. The government has also set a fine of $555,000 for platforms that break the rules, which will push platforms to control online content more effectively (Hickey & Nedim, 2020).

In addition, countries do not regulate social media platforms in a uniform way. In the United States, social media companies bear heavy responsibility for protecting their users, self-regulation is common, and some companies have even asked for more regulation from the government. In India, the government is tightening controls on harmful content, while platforms are reluctant to take responsibility. By contrast, Russian social media are controlled by the government, which places certain restrictions on user activity. The Chinese government also controls the internet strictly: many websites that the West can access are not allowed in China, and the government monitors and controls Chinese people’s social media platforms (Siripurapu & Merrow, 2021).

Online gender-based violence is increasing, yet some governments exercise too little control or do not take it seriously. Many governments have started to take measures against online sexual violence, but in many countries victims still cannot obtain legal assistance quickly, which leaves them helpless. Once harmed online, victims are easily hurt again because content spreads so fast on the internet. For example, a photo of a woman taken secretly and posted online in the past may later be re-shared by others. At this point the government needs to act to protect people from harm and mistreatment and to establish regulations that maintain order on the network (Barr, 2021).

Australia passed a law to stop the online spread of violent content after the Christchurch massacre

Griffiths (2019) reports that after the Christchurch massacre, which left 50 people dead, Australia passed a sweeping new set of laws regulating tech giants such as Facebook and Google and forcing them to remove violent content. The attack was live-streamed to viewers, and the platforms could neither stop the footage from spreading nor put urgent protections in place to shield users from the violent scenes created by the terrorist, so the government stepped in with legislation. In many cases, the government can be the strongest backstop; it needs to step forward to resolve serious issues like this.

In general, the development of the internet has made social media platforms closely bound up with people’s lives. People share their opinions and experiences on the platforms, but at the same time much harmful content appears, which brings bad experiences and negative impacts to users. It is therefore necessary for both the platforms and the government to control this content, and they need both the authority and the methods to deal with it.

“No Hate Speech” by AshMarinaccio is licensed under CC BY-NC-SA 2.0.

Reference list

Barr, H. (2021). As online gender-based violence booms, governments drag their feet: World Report 2021. Human Rights Watch. Retrieved October 14, 2022, from https://www.hrw.org/world-report/2021/essay/online-gender-based-violence-booms-governments-drag-feet

Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1–23). Yale University Press.

Griffiths, J. (2019, April 4). Australia passes law to stop spread of violent content online after Christchurch massacre. CNN. https://edition.cnn.com/2019/04/04/australia/australia-violent-video-social-media-law-intl/index.html

Guardian News and Media. (2020, October 5). Online violence against women ‘flourishing’, and most common on Facebook, survey finds. The Guardian. Retrieved October 14, 2022, from https://www.theguardian.com/society/2020/oct/05/online-violence-against-women-flourishing-and-most-common-on-facebook-survey-finds

Jakubowski, V. (2020). Federal Government seeks greater control over online content. Sydney Criminal Lawyers. Retrieved October 14, 2022, from https://www.sydneycriminallawyers.com.au/blog/federal-government-seeks-greater-control-over-online-content/

Mudgway, C., & Jones, K. (2022, September 21). As use of digital platforms surges, we’ll need stronger global efforts to protect human rights online. The Conversation. Retrieved October 14, 2022, from https://theconversation.com/as-use-of-digital-platforms-surges-well-need-stronger-global-efforts-to-protect-human-rights-online-135678

Wagner, B. (2013). Governing internet expression: How public and private regulation shape expression governance. Journal of Information Technology & Politics, 10(4), 389–403. https://doi.org/10.1080/19331681.2013.799051