Bullying, harassment, violent content, hate speech, pornography, and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

Legal requirements or moral constraints?

"cyber threat defense.jpg" by CyberHades is licensed under CC BY-NC 2.0.

Introduction

The rapid growth of social media in recent years has given a wide range of people opportunities to communicate and interact. The online world represented by social media is gradually intermingling with the real world: people increasingly bring their real-world agendas online to resolve them, and in the process the undesirable problems of the real world spread rapidly across digital platforms. Gillespie (2018) argues that a great deal of offensive, violent, abusive, illegal, and discriminatory content is posted on websites every day. Freedom of expression and the communicative nature of digital platforms are preconditions for these problems, and the creators of and participants in this content are difficult to regulate and punish. This paper discusses what actions people should take in response to these issues on digital platforms, and which groups should be responsible for them, in order to regulate and, to some extent, prevent harmful content online.

Government? Digital Platforms? Users themselves?

The government bears primary responsibility for a healthy environment on digital platforms. As the state apparatus, the government is expected to regulate matters that take place within the country's borders (Riedl et al., 2021). Each country's government and legislature makes laws for, and regulates, the Internet according to national conditions. At present, the laws governing digital platforms are imperfect and their boundaries vague, and many platforms and groups operate on the edge of the law in pursuit of their own interests. Gorwa (2019) notes that the Australian government has put in place some necessary regulations for digital platforms, requiring social media giants like Twitter to pay for content on their sites. Undesirable content on digital platforms may not only cause mental harm to users but may also threaten their personal safety, property, and privacy. Internet scams are now common; some organisations create phishing websites to cheat victims out of large amounts of money.


"The End Of The Government Shutdown 2013" by Stephen D. Melkisethian is licensed under CC BY-NC-ND 2.0.

What happens on digital platforms affects the real world, and governments should adopt more regulations for the Internet to protect their citizens. Digital platforms themselves also need to control this harmful content. As communication tools, platforms can act on content directly, which is more efficient than government supervision, and platform emergencies can be handled relatively effectively. For instance, during acts of terrorism, terrorists have live-streamed killings on social media. Flynn (2019) mentions that, according to Facebook's statistics, 1.5 million videos related to the New Zealand mosque shooting were removed within 24 hours, of which 1.2 million were immediately blocked by the platform. While digital platforms use a range of algorithmic tools to help moderate content, Dutton (2009) argues that, unlike traditional media, digital platforms operate in a more open and much larger community with enormous daily traffic, so current technology is still not sufficient to stop harmful content from circulating on the Internet. As participants on digital platforms, users themselves should comply with relevant laws and platform rules. At the legal level, users should not create or disseminate offensive content; at the moral level, they should assist government and platform supervision by reporting harmful content and its creators, working together to maintain a healthy online environment.


"Microsoft Live platform" by niallkennedy is licensed under CC BY-NC 2.0.

What should people do?

The essence of government regulation of digital platforms is intervention in the online environment, but the degree of intervention varies from country to country. Some Eastern countries have interfered with online speech in ways that outsiders see as limiting freedom of expression, while some governments and platforms have jointly created self-regulation councils that seek to regulate without outside interference. Brown (2020) argues that such a council's members, the organisations that choose them, and the stakeholders who can strengthen regulations without official permission are all autonomous. In recent years, governments around the world have been seeking more reasonable ways to supervise digital platforms (Flew et al., 2019). China has a dedicated Internet police force to supervise content circulating on the network, and even people's private chats can be viewed in the background of platform servers. However, neither self-regulation committees nor tough government control can fully handle harmful content on the Internet. In China, certain politically sensitive words are restricted, so people avoid discussing them openly online and instead use pronouns or symbols to stand in for those words. What people want to communicate and create cannot easily be regulated and restricted on the Internet, and the Internet offers anonymity: people can log on to digital platforms from different locations simply by switching domain names, and it is difficult to punish them after they spread immoral or even illegal material. Gorwa (2019) argues that the profit-seeking nature of social platforms means they do not necessarily act in the public interest. Because of the great influence digital platforms have on society, however, governments have to make laws to restrict their operation.


"Anonymous Hacker" by dustball is licensed under CC BY-NC 2.0.

QVOD was a Chinese media player that differed from traditional playback engines by applying P2P technology and supporting mainstream video formats such as MKV, RMVB, and MPEG. The software had 300 million installations in China in 2013, when there were only 530 million Internet users in the country. In 2014, however, QVOD was shut down by the Chinese government for spreading obscene content. As a for-profit enterprise, QVOD gained many users by tacitly allowing the spread of obscene material and pirated videos, which violated both law and morality; many of QVOD's users were minors, and the spread of this material seriously damaged the physical and mental health of young people. In a good online environment, users improve their moral quality and enter a virtuous circle, uniting to monitor harmful messages on digital platforms. Butsch (2009) argues that people with strong civic cultures are able to fight against social and cultural injustices. Both the government and the platforms should do their part to foster a culture of good citizenship.


"Bootlegged Kink.com Videos in Copenhagen" by Melissa Gira Grant is licensed under CC BY-NC-ND 2.0.

Conclusion

The government, the platforms, and the users themselves are all responsible for what happens on digital platforms. The problems that exist there need to be controlled at their roots and throughout the dissemination process, within a legal framework and under moral constraints, while good online literacy is cultivated among citizens to prevent them. The government should develop a sound legal framework to limit the activities of platforms and users; platforms need to formulate rules under that framework and regulate harmful information with algorithms and other technologies; and users themselves need good self-discipline, uniting spontaneously to maintain a healthy online environment.


References

Brown, N. I. (2020). Regulatory Goldilocks: Finding the Just and Right Fit for Content Moderation on Social Platforms. Texas A&M Law Review, 8(3), 451-494. https://doi.org/10.37419/LR.V8.I3.1

Butsch, R. (2009). Media and public spheres. Palgrave Macmillan.

Dutton, W. H. (2009). The Fifth Estate emerging through the network of networks. Prometheus, 27(1), 1–15. https://doi.org/10.1080/08109020802657453

Flynn, M. (2019). No one who watched New Zealand shooter’s video live reported it to Facebook, company says. Washington Post. Retrieved October 12, 2022, from https://www.washingtonpost.com/nation/2019/03/19/new-zealand-mosque-shooters-facebook-live-stream-was-viewed-thousands-times-before-being-removed/

Gillespie, T. (Ed.). (2018). Regulation of and by Platforms. In The SAGE handbook of social media (pp. 254–278). SAGE Publications.

Gorwa, R. (2019). The platform governance triangle: conceptualising the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407

Riedl, M. J., Naab, T. K., Masullo, G. M., Jost, P., & Ziegele, M. (2021). Who is responsible for interventions against problematic comments? Comparing user attitudes in Germany and the United States. Policy & Internet, 13(3), 433–451. https://doi.org/10.1002/poi3.257