Bullying, harassment, violent content, hate speech, pornography, and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

Cyberbullying / Responsibilities and Obligations of Online Platforms

"People Using Phones" by Tom Coates is licensed under CC BY-NC 2.0

With the spread and development of the Internet, more and more content appears on online platforms, and harmful behaviors from real life now also occur online. Controlling problematic content has therefore become a pressing concern. Under the International Covenant on Civil and Political Rights (ICCPR), the right to freedom of expression must be exercised with respect for the rights and reputations of others (Australian Human Rights Commission, 2017), so bullying, violent content, and hate speech fall outside the protection of free expression. Online platforms are public places in the same way that shopping malls are in the real world. Bullying, violent content, and hate speech should therefore not be tolerated or ignored by online platforms.


What is problematic content?

Bullying and hate speech, among other categories, are clearly problematic content. Bullying is a form of interpersonal violence with short-term and long-term effects on a victim's physical and emotional wellbeing (Vivolo-Kantor et al., 2014). Bullying on online platforms is called "cyberbullying" and refers to harassment carried out via the Internet, text messages, and email (Vivolo-Kantor et al., 2014). Content of this kind can usually be recognized and blocked on sight. However, there is also more ambiguous problematic content.

“Stop-Bullying-Concept” by freepik is licensed under CC BY-ND 2.0

Although the ICCPR spells out the responsibilities individuals carry when exercising freedom of expression, some content on online platforms remains difficult to classify. For example, platform users hold different views on an event: when the Internet gathers people from all over the world to discuss it, the attention and participation produce a great deal of debate. People's perceptions depend on their education, life circumstances, and identity, so diverse participants will inevitably disagree. Ideas that the public does not understand or accept may then be labeled alternative speech, yet it is unfair to classify alternative speech as problematic content merely because of a difference of opinion. The ICCPR therefore also states that protecting public health and morals is an obligation attached to freedom of expression (Australian Human Rights Commission, 2017). When someone's expression harms public health or morals, it too falls under problematic content, and restricting it is not a restriction of the ordinary right to free expression.


Impact of problematic content

It is important to understand why the spread of bullying, violence, and similar content must be stopped. Taking bullying as an example of problematic content, online abuse has long-term negative effects on victims' mental health, and perpetrators suffer consequences as well: young people who bully are more likely to exhibit aggressive, antisocial, and criminal behavior in adulthood (Price & Dalgleish, 2010). When perpetrators escalate to more serious and vicious incidents, the social impact grows. Bullying in the teenage years is a time bomb for the future social environment. Stopping problematic content therefore protects victims and checks perpetrators, and at the same time it safeguards the wider platform environment and helps prevent future malicious events.

According to Price and Dalgleish (2010), the reported prevalence of online bullying ranges from 9% to 49%, while the prevalence of offline bullying is as high as 70%. Online bullying was therefore less prevalent than offline bullying at the time, but Price and Dalgleish predicted that it would increase, given the characteristics of digital media as a communication medium. The affordances of digital media thus cut both ways.

  • Persistence means that content expressed online is automatically recorded (Boyd, 2011). Bullying may therefore be preserved by default, which not only harms the victim a second time but can also deepen the perpetrator's sense of pleasure in bullying and reinforce their antisocial psychology.
  • Scalability means that online content has the potential to become highly visible (Boyd, 2011), so cyberbullying can spread far more widely. Stopping the dissemination of problematic content on online platforms therefore demands attention, because the Internet, as a communication medium, can amplify events in ways that quickly become uncontrollable.


Who should be responsible for stopping the spread of this content?

“Cyberbullying” by paul.klintworth is licensed under CC BY-NC 2.0.

Online Platforms

Online platforms are already closely connected to a wide range of public activities, such as news, education, and transportation (Helberger, Pierson, & Poell, 2018). Platforms are the places where problematic content lives, so the providers and administrators who run them need to take responsibility for stopping its dissemination. For the stakeholders who operate these platforms, more discussion of an event means more views and exposure, which secures stakeholder profits and platform value. While making profits in this way, platforms therefore also have an obligation to maintain public health. Because problematic content is disseminated through the platform itself, platforms and their stakeholders bear direct responsibility for stopping its spread.

Users

The rights of users and platforms are unequal, and because platform operations are opaque it is difficult to apportion responsibility between them (Helberger, Pierson, & Poell, 2018). Nevertheless, the perpetrators of bullying and hate speech are still users of the platform. A platform may steer content in whatever direction serves its interests, but holding the platform alone fully responsible would effectively condone the perpetrators themselves. And although the perpetrators are only some users, other users gather around problematic content or contribute to its popularity. Some users, as bystanders or secondary perpetrators, therefore also bear indirect responsibility for stopping its spread.

Government and organizations

Alongside online platforms, governments and their agencies are the other authority in this space, and they clearly cannot escape responsibility. Government organizations may not be able to control platform content in detail, but they are responsible for checking and balancing the unconstrained development of online platforms. The treaties and laws they formulate are also a key line of defense against the emergence and spread of problematic content. Government organizations are therefore indirectly responsible for preventing its dissemination.


How to stop the spread of problematic content?

Before discussing how to stop the distribution of problematic content, it is important to recognize that online platforms will hardly become a perfectly harmonious utopia: the complete elimination of problematic content is an idealistic goal, and platforms cannot fully balance profit against responsibility. Methods and improvements are still necessary, however.

“Reddit AMA” by NASA Johnson is licensed under CC BY-NC 2.0

Online Platforms

As a platform with a very large user base, Reddit has drawn attention because of its platform model. Massanari (2017) argues that Reddit became a hothouse for the development of a toxic technoculture, but during the period Massanari studied, Reddit introduced several improvements:

  • Content that incites harm against individuals or groups, or that constitutes harassment and bullying, is prohibited on the platform (Massanari, 2017).
  • A new category was created to isolate "problem content" from public content, so that it cannot be searched for or listed publicly (Massanari, 2017).

However, these improvements still have problems:

  • Although these sections are isolated, the toxic content itself is not, because users who publish problematic content remain free to participate elsewhere on the platform. Massanari (2017) argues that Reddit administrators tacitly condoned this alternative content.

Online platforms therefore need to set strict rules and a clear direction for management and control, as the simplified sketch below suggests.
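To make this concrete, here is a minimal, hypothetical sketch of the kind of two-tier moderation described above: content matching strict rules is removed outright, while borderline content is quarantined so that it never appears in search results or public listings. The Post class, phrase lists, and function names are illustrative assumptions for this sketch, not Reddit's actual implementation.

```python
# Hypothetical two-tier moderation sketch: remove clearly prohibited content,
# quarantine borderline content so it is excluded from search and public feeds.
from dataclasses import dataclass

# Illustrative rule lists (assumptions, not any real platform's policy terms).
PROHIBITED = ["inciting harm", "harassment"]
QUARANTINE_TRIGGERS = ["hate", "slur"]

@dataclass
class Post:
    author: str
    text: str
    removed: bool = False
    quarantined: bool = False

def moderate(post: Post) -> Post:
    """Apply the two rule tiers: removal for prohibited content, quarantine for borderline content."""
    lowered = post.text.lower()
    if any(phrase in lowered for phrase in PROHIBITED):
        post.removed = True
    elif any(phrase in lowered for phrase in QUARANTINE_TRIGGERS):
        post.quarantined = True
    return post

def public_listing(posts: list[Post]) -> list[Post]:
    """Only posts that are neither removed nor quarantined appear in search or public listings."""
    return [p for p in posts if not (p.removed or p.quarantined)]

if __name__ == "__main__":
    posts = [moderate(Post("a", "Friendly discussion about games")),
             moderate(Post("b", "A post inciting harm against a group"))]
    print([p.text for p in public_listing(posts)])  # only the friendly post is listed
```

Even a toy pipeline like this makes the earlier criticism visible: hiding a post from public listings does nothing to restrict the user who wrote it, which is exactly the gap Massanari (2017) identifies.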

Individuals

  • Although Reddit has made improvements to stop the spread of problematic content, these measures are still flawed. From a user's perspective, Reddit wants to maintain good public health while preserving the platform's identity and style, and it has wavered between the two. One thing individuals can do to stop the spread is to participate in the governance of the online platform: increasing the diversity of identities among posters adds more voices to the group. This is also something the platform itself can encourage.
  • Users who encounter problematic content and feel uncomfortable may become secondary victims. Even so, it is better to report the content to the platform than to rebut it or ignore it: rebuttal can increase the visibility of the problematic content, while ignoring it condones it.


Conclusion

Crime and disaster still exist in the world, and problematic content is likewise a cancer on the Internet that is difficult to eliminate. This article has focused on bullying as one form of problematic content, explored which parties should be responsible for it, and offered ways to prevent its spread. Understanding where the problem originates and who bears responsibility is a first step toward stopping its propagation.


References

Australian Human Rights Commission. (2017). Freedom of information, opinion and expression. Australian Human Rights Commission. https://humanrights.gov.au/our-work/rights-and-freedoms/freedom-information-opinion-and-expression

Boyd, D. (2011). Social network sites as networked publics: Affordances, dynamics and implications. In Z. Papacharissi (Ed.), A networked self: Identity, community and culture on social network sites (pp. 39–58). Taylor & Francis Group. https://doi.org/10.4324/9780203876527-8

Helberger, N., Pierson, J., & Poell, T. (2018). Governing online platforms: From contested to cooperative responsibility. The Information Society, 34(1), 1–14. https://doi.org/10.1080/01972243.2017.1391913

Massanari, A. (2017). Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807

Price, M., & Dalgleish, J. (2010). Cyberbullying: Experiences, Impacts and Coping Strategies as Described by Australian Young People. Youth Studies Australia, 29(2), 51–59. https://doi.org/10.3316/ielapa.213627997089283

Riedl, M. J., Naab, T. K., Masullo, G. M., Jost, P., & Ziegele, M. (2021). Who is responsible for interventions against problematic comments? Comparing user attitudes in Germany and the United States. Policy and Internet, 13(3), 433–451. https://doi.org/10.1002/poi3.257

Vivolo-Kantor, A. M., Martell, B. N., Holland, K. M., & Westby, R. (2014). A systematic review and content analysis of bullying and cyber-bullying measurement strategies. Aggression and Violent Behavior, 19(4), 423–434. https://doi.org/10.1016/j.avb.2014.06.008