
Problematic Content on Digital Platforms
Digital platforms have transformed how people communicate, interact, and behave. Individuals go online not only to exchange images and ideas but also to satisfy the need to belong to a community. Digital platforms can offer safe public spaces where opinions are shared and discussed, yet they are also home to violent ideologies and activities. These platforms accommodate enormous user bases that grow more complex as communication technologies become more evolved and interactive, which compounds the problem of vast amounts of inappropriate content. Who, then, is responsible for stopping the spread of problematic content on digital platforms? This paper answers that question by noting that although digital platforms are developed and sustained by humans, they act as independent entities. Thus, digital platforms should be responsible for stopping the spread of problematic content, while users must also take individual responsibility for the content they upload and distribute.
Why do digital platforms need to be responsible?
Digital platforms do not operate in a vacuum; they exist because individuals use them. Moreover, online spaces are being used by predators, hackers, and other bad actors to accelerate illegal and harmful activity in unprecedented ways. Hence, digital platforms are responsible for ensuring that content uploaded to their sites is not detrimental. A user must first log in to a digital platform to upload, view, and share content, so platforms know who has logged in and the details that pertain to that individual. While companies may claim that accessing such information would violate user privacy, digital platforms are notorious for overtly disregarding user privacy (Gorwa, 2019); they could just as readily set aside their stated corporate values on privacy to provide information about users who upload, share, and store problematic content. Even setting that ethically questionable route aside, digital platforms have claimed that they are not responsible for inappropriate content because they merely host it and hence “have no responsibility themselves in regulating the content of their sites” (Gorwa, 2019, p. 3). Nevertheless, digital platforms have profited immensely from problematic content, and it is unreasonable for them to earn billions of dollars from their sites without bearing responsibility. As the owners of platforms that operate in society, they are liable for ensuring that inappropriate content is prevented.
Digital platforms should also be held responsible for stopping problematic content because they operate within the media industry. The media industry is subject to rules and regulations that prevent media outlets from being the source of problematic content and from sharing such content in public spaces. While digital platforms claim that they are not media companies and hence are not subject to the rules and regulations of the media industry, they play a clear role in curating and distributing original content (Flew et al., 2019). Digital platforms operate similarly to traditional media: both are mass media and disseminate information to massive audiences. Not only do digital platforms mediate the type of content that users upload and share, but, like traditional media, they also hold the responsibility of ensuring that the content is not problematic. In addition, digital platforms already moderate content they deem unsuitable; that moderation activity indicates that they operate like traditional media and should likewise be held responsible for stopping problematic content.

For instance, in 2018 the major digital platforms removed content promoting Alex Jones’s conspiracy theories, and Twitter went further and banned his account permanently. Facebook, however, allowed the live-stream video of the Philando Castile shooting to remain on its site. The removal of Alex Jones’s content and his subsequent ban show that digital platforms can identify inappropriate content, yet they do not remove all problematic content consistently.
How can the spread of harmful and manipulative content be stopped?
At present, there is no standardized governance framework that digital platforms can use to stop problematic content. Digital platforms operate in self-regulatory contexts and rely on generic policy-making to address inappropriate content (Flew et al., 2019). Self-regulation, however, allows platforms to make rules that benefit themselves while removing any responsibility from them. It is therefore necessary to develop and implement a standard governance framework so that digital platforms moderate problematic content consistently. A digital governance framework corresponds to the notion of an overall legal and ethical framework that offers guidelines, as implied by global corporate governance theory (Linke & Zerfass, 2013). Such a framework would provide digital platforms with standardized rules, regulations, and supervision mandates from stakeholders, specifying what counts as problematic content and how it should be addressed under the implemented laws. A digital governance framework also focuses on strategy and foundation, implementation and enforcement, and monitoring.
Moreover, placing digital platforms under traditional media rules and regulations would not stifle the ways platforms operate. A traditional media regulatory framework would complement the other enforcement instruments currently available to digital platforms (Stasi, 2019) and would allow closer partnership between platforms and authorities in addressing and stopping inappropriate content.
What can other co-responsible parties do?
Furthermore, digital platforms are globalized, and their users come from different social and cultural backgrounds. Content that is acceptable in one culture may be unacceptable in another, but genuinely problematic content is inappropriate in all cultures; can digital platforms claim that bullying or pornographic material is accepted in any culture worldwide? Digital platforms cannot operate without rules; they must moderate their content to protect one user from another and to engage in best practices (Gillespie, 2018). Hence it is not only the responsibility of digital platforms to stop the spread of problematic content; they must do so regardless of the fact that one culture accepts what is offensive to another. One way of ensuring this is to moderate content within clear, well-established guidelines that rest on absolute universal truths rather than on debates rooted in social and cultural backgrounds.
In April 2022, an online video showed a man arrested and detained by police in California on allegations of uploading, distributing, and storing child pornography. The man is suspected of portraying himself as an eleven-year-old girl online and speaking with children to groom them (KHOU 11, 2022). He allegedly did not act alone; some family members and friends are said to have engaged in the practice. Problematic content does not appear magically: it is shared and distributed by users who are in contact with others. It is therefore also the responsibility of communities and individuals to stop the spread of harmful content by reporting any suspicious activity that may involve problematic online content. Tang and Chaw (2016) discuss digital literacy, which enables users to develop digital citizenship and use technologies responsibly. Digital literacy also gives users the knowledge and skills to discern harmful online behaviors, patterns, habits, and problematic content. Therefore, by offering public digital literacy courses within communities, governments, digital platform companies, and other stakeholders can equip individuals and communities with digital literacy awareness and the skills needed to recognize and flag inappropriate content.
[Video: Ethical Issues in Digital Literacy by Renee Hobbs, licensed under CC BY-NC-SA 2.0]
Conclusion
Digital platforms are reluctant to take responsibility for problematic content, asserting that they are not media companies and hence cannot regulate content. However, digital platforms operate within the media industry and are therefore responsible for the content published on their sites by users. Digital platforms must adopt a digital governance framework and moderate the content uploaded by their users. More importantly, they should engage in best practices that uphold absolute universal truths. Finally, platform users need to gain the knowledge and skills to become active and moral participants who ensure that problematic content is flagged at the individual and community levels.
References
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50. https://doi.org/10.1386/jdmp.10.1.33_1
Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1-23). Yale University Press.
Gorwa, R. (2019). The platform governance triangle: conceptualizing the informal regulation of online content. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1407
KHOU 11. (2022). Multiple Houston children among 80+ victims of California man in massive child porn investigation [Video]. YouTube. https://www.youtube.com/watch?v=5skO8Qjyg4M [Accessed 5 October 2022].
Linke, A., & Zerfass, A. (2013). Social media governance: Regulatory frameworks for successful online communications. Journal of Communication Management, 17, 270-286. https://doi.org/10.1108/JCOM-09-2011-0050
Stasi, M. L. (2019). Social media platforms and content exposure: How to restore users’ control. Competition and Regulation in Network Industries, 20(1), 86–110. https://doi.org/10.1177/1783591719847545
Tang, C. M., & Chaw, L. Y. (2016). Digital literacy: A prerequisite for effective learning in a blended learning environment? Electronic Journal of E-Learning, 14(1), 54-65. https://files.eric.ed.gov/fulltext/EJ1099109.pdf
Twitter. (2022). Twitter removes accounts linked to Alex Jones & Infowars [Image]. https://blbrd.cm/7hfCsD [Accessed 5 October 2022].
Hobbs, R. (2019, November 22). Ethical issues in digital literacy [Video]. YouTube. https://www.youtube.com/watch?v=4V7qdt0p8TM