Stopping the Spread of Harmful Content on Digital Platforms: Who Is Responsible, and How?

Tuesday 1pm–3pm, RE Group 14. Tutor: Chika

"The Internet told me to do it..." by id-iom is licensed under CC BY-NC 2.0 .


“Social Media Logos” by BrickinNick is licensed under CC BY-NC 2.0.

The unprecedented success of the internet has been built on a model that is accessible to a global audience; this unique model and its open standards have allowed the internet to evolve at a rapid pace (Internet Society, 2014).

The proliferation of social media applications has made digital platforms extremely difficult to regulate. Their highly public, free, and transparent model has allowed the circulation of unhealthy content, some of it bullying, some violent hatred, some racial abuse and other issues. Such content not only undermines a healthy internet environment but also poisons other online users. This essay will briefly discuss who is responsible for stopping the spread of this content and how to stop it.


“Masked Becca” by Billy Wilson Photography is licensed under CC BY-NC 2.0.

Why Is It Spreading?

In the real world there are not just black zones and white zones, and the vast world of the internet inevitably contains grey cases; these grey cases are clearly a reason why such harmful content is still in circulation. A good example is the enactment of Section 230 in the United States: a regulation that, even as it was introduced to protect platforms, opened the door wider to objectionable content that should have been locked out (Patterson, 2020).

Who Should Stop the Spread?

The question of who is responsible for stopping the spread of such content is a highly controversial one, as everyone has their own interpretation of what is good or bad.

Self-regulation of platforms has now become a cliché. For the time being, the major social media platforms and apps have their own content reviewers, and as Gillespie’s (2018) study notes, self-regulation by the platforms has become a familiar and accepted way of dealing with the digital cultural landscape. Regulators also consider this industry self-regulation to be effective, believing that they themselves need only develop certain policies or supplement certain regulations and give them the corresponding legal backing. Is this correct?

Here’s a good example that might counter this argument: the case of the Facebook photo, deleted at the time, of a naked girl named Kim Phúc, better known as the “napalm girl”.


“LIFE Magazine June 23, 1972 – The Beat of Life” by manhhai is licensed under CC BY-NC 2.0.

At the time, the photo caused a great deal of controversy. On the one hand, Facebook’s reviewers considered the photo obscene because it showed a nude figure; on the other hand, it was considered a powerful demonstration of how much damage chemical weapons could do to people, and a historically significant image (Gillespie, 2018, p. 4). In the end, Facebook’s reviewers gave in to the voices of users and the photo was allowed to circulate on Facebook, which seemed a reasonably normal ending at the time. But the video below may provoke further thought.

Video by DW News: https://www.youtube.com/c/dwnews

In this video the girl, now an adult with children of her own, recalls how embarrassed she still is that her nude photo circulated on social media, which in a way suggests that Facebook’s reviewers were actually right. This is a good example of the grey area.

It must not be overlooked that when platforms claim to judge what users post impartially and fairly, they often define moderation as “non-interference”, which is essentially a way of avoiding their obligations and regulatory responsibilities (Gillespie, 2018, p. 19). Moreover, in today’s society media platforms have been largely privatised, and platforms have relaxed or “ignored” their own policies in response to popular appeals in order to attract more users.

This example leads to the obvious conclusion that in many cases:

  • Regulation by the platform alone is not effective in stopping the spread of harmful content.
  • Digital platforms therefore need third-party intervention to balance the delicate relationship between users and their own regulation.

Based on the above discussion, the Chinese government’s authoritarian model of online regulation may be worth considering. The Chinese government’s control over the internet and digital platforms is extremely strong (O’Hara & Hall, 2018, p. 12). In the case of China’s regulation of microblogs, for example, the government can block or shut down a topic under discussion within minutes, or quickly push a topic onto the public’s pages in a very short time. The Winnie the Pooh case is a typical example of government regulation of the internet, backed by China’s public security laws and the Great Firewall project.

“Subdude street art, Shoreditch” by duncan is licensed under CC BY-NC 2.0.

For some reason (perhaps the resemblance), Winnie the Pooh was used to refer to Xi Jinping, the leader of the Chinese government, and subsequently all Winnie the Pooh-related hashtags and posts were blocked. Remarkably, it took less than half a day for every related hashtag to be blocked in China, where Weibo has over 100 million users.

Political power is a decisive factor in internet regulation (Rogerson & Thomas, 1998).

Internet regulation with government interference is indeed very effective, because governments have more power to mobilise human and material resources and can quickly concentrate them to get something done in a short time. But this inevitably changes the internet’s unique transparent, open, and free environment: people’s online behaviour is exposed to government control, and much of their freedom of expression is lost. Levi-Faur’s (2011) study likewise shows that centralised government control is neither the only nor the best form of governance, but rather a trend that is gradually becoming dominant.

How to Stop It?

“Sefer Torah” by VCU Libraries is licensed under CC BY-NC 2.0.

Based on the above discussion, this essay argues that stopping the spread of such harmful content on digital platforms requires a joint effort between the digital platforms and the power of the government. The government, as the setter of rules, should do more than merely improve and supplement legal instruments, and platform owners, as operators, should not abandon moderation in order to attract customers. Ideally, all posts on digital platforms might be divided into two categories: those about people’s everyday lives, and those of a political nature. Political posts would be submitted to a dedicated government department for review, while the rest would be left to the platform, so that a separation of powers could harmoniously balance freedom and regulation while avoiding a monopoly.

Apart from this, all internet users are citizens of society. Only if people themselves are aware of the dangers of posting objectionable content can the spread of such content be reduced at all. This can be achieved by organising awareness sessions in schools, starting with children, and by organising weekend presentations in communities to remind people to resist harmful content.

Conclusion

In summary, this essay has discussed who is responsible for stopping the proliferation of objectionable content, and how to stop it. The napalm girl photo and the Winnie the Pooh incident in China show that relying solely on digital media platforms to self-regulate, or solely on government power to govern, does not work; a harmonious internet environment can only emerge when there are checks and balances among all parties.

References

Disturban History. (2021, August 22). The harrowing story of napalm girl [Video]. YouTube. https://www.youtube.com/watch?v=Jui542Kfezc

DW News. (2019, February 14). “Napalm girl” from iconic Vietnam War image: Phan Thị Kim Phúc [Video]. YouTube. https://www.youtube.com/watch?v=Dr7bUdS_rWw

Internet Society. (2014). Who makes the Internet work: The Internet ecosystem.

Levi-Faur, D. (2011). Regulatory networks and regulatory agencification: towards a Single European Regulatory Space. Journal of European Public Policy, 18(6), 810–829. https://doi.org/10.1080/13501763.2011.593309

O’Hara, K., & Hall, W. (2018). Four internets: The geopolitics of digital governance (Paper No. 206, pp. 1–20). Centre for International Governance Innovation.

Patterson, D. (2020, December 16). What is “Section 230,” and why do many lawmakers want to repeal it? CBS News. https://www.cbsnews.com/news/what-is-section-230-and-why-do-so-many-lawmakers-want-to-repeal-it/

Rogerson, K. S., & Thomas, G. D. (1998). Internet regulation process model: The effect of societies, communities, and governments. Political Communication, 15(4), 427–444. https://doi.org/10.1080/105846098198821

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press.