Media regulation: who should be responsible?

Social Media Logos

“Social Media Logos” by BrickinNick is licensed under CC BY-NC 2.0.

Introduction

Since the rise of Web 2.0, people have had platforms for sharing their ideas and for posting images or videos with those close to them. As O’Hara and Hall (2018) describe, the Internet began as a free-speech platform with seemingly endless freedom. Gradually, because of the excessive freedom platforms offered users, the risk of cyber conflicts such as hate speech, harassment, and cyberbullying has grown.

Who should be responsible for this?

  1. Platforms, as the essential regulatory supervisors, should self-regulate to guarantee better protection.
  2. Governments should develop regulatory policies that do not conflict with platforms.
  3. Users should strengthen their own self-regulation.

 

Platform Intervention

Platforms should proactively screen and remove undesirable content. Platforms have repeatedly adjusted and concealed their policies to meet users’ needs, form new business models, and differentiate their culture from traditional media. Moreover, platforms’ revenues depend on content makers, who often publish content that is not entirely positive in order to attract viewers (Gillespie, 2018).

The image known as “Napalm Girl” shocked the whole internet in 1972: it was brutal, violent, and visually disturbing (Stockton, 2017). Even now, the photo has not been removed from all platforms, because their policies are lenient, the photo does not cross the platforms’ bottom line, and the growing number of clicks generates revenue for them. The Internet is touted as a utopian world because it is free and users’ privacy is adequately protected by platforms. However, this freedom has also given people opportunities to insult and harass others online, causing psychological harm to those on the receiving end and even threatening their safety, because such speech and behaviour carry no accountability. Velmo (cited in Gillespie, 2018) called the internet a cool place, but until a strong regulatory mechanism exists, people will continue to verbally attack individuals or groups through anonymous accounts.

“Fleeing Trang Bang, June 08, 1972 – Phóng viên ảnh Nick Út và bé Kim Phúc sau lúc ném bom nhầm tại Trảng Bàng” by manhhai is licensed under CC BY-NC 2.0.

According to InDaily (Gardner, 2022), men have used dating platforms that lack strict controls to commit sexual abuse or violence, such as performing sexual acts without the other party’s consent. After such incidents, the lack of strict controls works to the perpetrator’s advantage: he can easily change his name, or simply delete his account and let the matter go.

Japanese Pornography
“Japanese Pornography” by balaam is licensed under CC BY-NC 2.0.

This example clearly shows that although users’ privacy is important, such poorly regulated rules can cause a decline in social well-being and happiness, increase anxiety and panic among users of all platforms, and raise the risk of victimisation. Platforms should therefore adopt stricter authentication as well as censorship of explicit images. For instance, YouTube’s latest moderation mechanism uses a cumulative strike policy: after three violations, the account is permanently closed. Content creators must also select their target audience group, and once it is determined, they cannot publish content irrelevant to that audience, or they face heavy penalties (Yampolsky, 2020). This indicates that some platforms are gradually developing more effective mechanisms to protect users. Gillespie (2018) agrees that improvements in platforms will benefit the public’s physical and mental health. Indeed, improvement and regulation of platforms show that they are taking responsibility for protecting users’ physical and mental health, and with increased regulation, users’ experience of digital culture should improve. Although the platform is an essential part of protecting online users, the government should act as an assisting party to improve regulation.

How my LEGO hobby became a YouTube business

“How my LEGO hobby became a YouTube business” by BRICK 101 is licensed under CC BY-NC 2.0.

 

Government intervention

The government should assist platforms in regulating content. With the freedom the Internet affords, people have become more dependent on communicating through social platforms. However, the excessive liberalisation of platforms undermines the government’s efforts to maintain social order. An enormous amount of content is published through platforms every day, and this overabundance leaves review mechanisms unable to detect and remove undesirable content quickly and accurately. Platforms may also refuse to increase the manpower for content review because of costs and commercial interests, rendering the whole review system null and void.

In Germany, a 2019 shooting was fully recorded and live-streamed on a platform, where it remained available for a whole day. With inadequate regulation, the platform did not remove the violent and bloody content, and the government had no right to intervene, which later triggered huge social discussion and unrest (Lever, 2019). Hence, the best way for governments to regulate harmful media content is to enact strong laws that ensure platforms censor content more efficiently and safeguard users.

'FRIVOLOUS GUNSHOT'

“‘FRIVOLOUS GUNSHOT’” by Bresciani Emanuele Virtual Photographer is licensed under CC BY-NC 2.0.

For example, the latest law issued by the UK government in 2022 requires platforms to guard strictly against violations such as terrorism, sexual abuse, and fraud; if platforms violate the law and fail to address undesirable content in time, the government has the right to seek a fine (Seal, 2022). Laws issued by the government are the most direct and effective way to regulate platforms and media content. However, involving the government in content regulation is also contradictory. Platforms are now used as intermediaries for politicians to deliver messages to the public: in the 2016 U.S. presidential election, for instance, Trump used social media to build his reputation and deliver his message (Joseff et al., 2020). This shows that when the government interferes excessively with the content a platform publishes, the interests of both the platform and the government are affected. Moreover, when users register an account, they agree to the platform’s terms and conditions, which obligate the platform to protect users’ privacy from third parties, so the chances of government intervention are small. Thus, the most critical link in regulating undesirable content is the platform itself, but users also need self-regulation to keep the platform secure.

 

Users’ self-regulation

Media users should comply with self-regulation. Platform users are the foundation of the Internet and of social media operations: without users’ browsing and content contributions, platforms cannot profit. However, the more users enter the internet, the more undesirable content spreads. To maintain a good internet experience, users should monitor themselves, for example by voluntarily refusing to view or post objectionable content. According to one estimate, when users view pornography on a platform, its directors and investors earn $3,000 per second (Ecofunomics, 2020); the more users spend on pornography, the more money the industry has to shoot new content, forming a vicious circle. The most direct way to stop organisations from creating undesirable content is for users to spontaneously pause their browsing: without revenue and a steady audience, content makers cannot continue creating. There is no doubt that asking users to self-regulate is contradictory, because the internet is popular precisely because its content satisfies instinctive human curiosity, so users’ self-regulation cannot be guaranteed. Stopping the release and dissemination of undesirable content still has to rely on the platform itself.

Playing on the computer

“Playing on the computer” by fd is licensed under CC BY-NC 2.0.

Conclusion

With the development of the internet, diverse information dissemination is inevitable, and because of this diversity, society is exposed to more and more engaging content. But everything has two sides: the internet’s development has produced disadvantages we cannot ignore. Gradual globalisation has complicated the dissemination of information, and the internet’s freedom does not make it a place where one can speak completely freely without considering the consequences. If the Internet is not strictly controlled and undesirable content remains free to spread, digital culture will gradually decay, and society will become discordant.

All in all, maintaining the internet depends on cooperation among platforms, governments, and users.

 

References

Ecofunomics. (2020, September 26). Pornography and economics: The rare talked topic. Eco-Fun-Omics. https://ecofunomics.com/economics/673/

Gardner, S. (2022, October 3). Risk of sexual violence for dating app users. InDaily. https://indaily.com.au/news/2022/10/04/risk-of-sexual-violence-for-dating-app-users/

Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. https://doi-org.ezproxy.library.sydney.edu.au/10.12987/9780300235029

Joseff, K., Goodwin, A., & Woolley, S. C. (2020). Social media influencers and the 2020 U.S. election: Paying “regular people” for digital campaign communication. Center for Media Engagement. https://mediaengagement.org/research/social-media-influencers-and-the-2020-election/

Lever, R. (2019). Germany shooting livestreamed despite efforts by tech firms (Update). TechXplore. https://techxplore.com/news/2019-10-halle-shooter-video-twitch-livestream.html

O’Hara, K., & Hall, W. (2018). Four internets: The geopolitics of digital governance (No. 206). Centre for International Governance Innovation. https://www.cigionline.org/publications/four-internets-geopolitics-digital-governance

Seal, T. (2022, July 4). UK to force internet companies to curb foreign disinformation. Bloomberg. https://www.bloomberg.com/news/articles/2022-07-04/uk-to-force-internet-companies-to-curb-foreign-disinformation

Stockton, R. (2017, June 6). The true story behind “Napalm Girl.” All That’s Interesting. https://allthatsinteresting.com/napalm-girl

Yampolsky, Y. (2020). What are YouTube’s 2021 guidelines? Cincopa. https://www.cincopa.com/blog/youtube-guidelines/