Do Children and Teenagers Need “Special Governance” on Social Media Platforms?

Abstract

Social media platforms now reach users across a wide range of age groups, and the information people can access has expanded greatly as a result. To attract more users, platforms typically broaden the range of content they host, and some of that content involves antisocial behaviour or misinformation intended to deceive users (Gruzd et al., 2023). Platforms generally moderate content in accordance with local laws and social norms. However, children and teenagers are a “vulnerable group” of users, and they still need “special governance” in content moderation to reduce the harm that online information can do to them.

In What Ways Do Social Media Platforms Affect Children and Teenagers?

Many children and teenagers are effectively addicted to social media platforms. More than 2.5 million children and teenagers spend an average of 2 hours and 25 minutes a day on social media platforms, and about 3 hours a day on TikTok and Instagram (Armitage, 2021). Excessive use of these platforms can also cause psychological and physical harm to children.

Social media platforms might enhance children’s and teenagers’ herd mentality

Social media platforms increase communication among young people and promote the formation of communities with a shared identity. However, online antisocial behaviour affects many groups of users, and its reported prevalence has risen from 36% to 51%, with teenagers particularly exposed (Gruzd et al., 2023). Social media therefore brings serious risks for children and teenagers. According to Armitage (2021), the majority of children and adolescents suffer sleep deprivation because of excessive social media use, which can harm their physical health and raise their anxiety levels.

“National bullying prevention month-stop cyberbullying” by iPredator is marked with CC0 1.0

The spread of cyberbullying content harms the mental health of children and teenagers as they grow up. The online communities created by social media platforms can also widen the reach of cyberbullying among these groups: some children and teenagers take part in cyberbullying in order to gain a sense of identity within their online community. In addition, teenagers and children who are addicted to social media are more likely to engage in cyberbullying as an outlet for their emotions (Kao, 2021).

Some content on social media platforms carries harmful messages, and a platform’s failure to treat children and adolescents as a “vulnerable group” can have serious consequences. For example, the circulation of graphic content may encourage children and teenagers to normalise offline behaviours such as self-harm and suicide (Lavis & Winter, 2020). Lavis and Winter (2020) note that suicidal people may find consolation in social media communities, but self-harm content can also lead some children to imitate it and injure themselves.

Online pornography circulates on social media platforms

Pornographic content still circulates widely on social media platforms and threatens the mental health of children and teenagers. In a UK survey of 16- to 17-year-olds, 63% reported having been exposed to pornography on social media platforms (Thurman & Obster, 2021). Moreover, some platforms, such as Twitter and Reddit, place no age restrictions on the sharing of such content (Thurman & Obster, 2021). This suggests that the content-sharing services of some social media platforms ignore the psychological vulnerability of children and teenagers, leaving these groups exposed to harm from pornographic content.

Social Media Content Moderation

“Social media” by Sean MacEntee is licensed under CC BY 2.0

Social media platforms distribute user-generated content to the public through the internet; at the same time, the platforms’ major focus is the network and channel of technology business.

(Langlois et al., 2009, as cited in Myers West, 2018, p. 4367)

Content moderation is a governance approach that social media platforms use to check user-generated content, with the aim of creating a secure online community and encouraging communication (Myers West, 2018, p. 4368). In addition to automated screening, platforms engage outsourced workers to review content during the moderation process (Myers West, 2018). However, platforms are commercially driven and need a broad range of content to attract large audiences; because they are concerned about their profits, they may overlook some content during moderation.
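To make this two-stage process concrete, the sketch below pairs a very simple automated screen with a queue for human reviewers. It is a minimal illustration under assumed rules (the categories and keyword lists are invented for the example), not a description of any real platform’s moderation system.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Invented screening categories and keywords; real platforms rely on trained
# classifiers and policy teams rather than simple keyword lists.
FLAGGED_TERMS: Dict[str, List[str]] = {
    "self_harm": ["self-harm", "suicide"],
    "adult": ["explicit", "nsfw"],
}

@dataclass
class Post:
    post_id: int
    text: str
    flags: List[str] = field(default_factory=list)

def automated_screen(post: Post) -> Post:
    """Stage 1: automated screening attaches flags but makes no final decision."""
    lowered = post.text.lower()
    for category, terms in FLAGGED_TERMS.items():
        if any(term in lowered for term in terms):
            post.flags.append(category)
    return post

def build_review_queue(posts: List[Post]) -> List[Post]:
    """Stage 2: only flagged posts are routed to human reviewers."""
    screened = [automated_screen(p) for p in posts]
    return [p for p in screened if p.flags]

queue = build_review_queue([
    Post(1, "Holiday photos from the beach"),
    Post(2, "Explicit clips, click the link"),
])
print([(p.post_id, p.flags) for p in queue])  # [(2, ['adult'])]
```

Even in this toy version, anything the first stage misses never reaches a human reviewer, which is where the omissions discussed above tend to arise.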

TikTok’s Content Moderation

TikTok is a popular platform among young people, with more than 60% of its users belonging to Generation Z (Zeng & Kaye, 2022). The platform hosts a large number of short videos aimed at teenagers and children, but it has also become a channel for distributing violent and pornographic content (Zeng & Kaye, 2022). The spread of such content affects the physical and mental health of children and teenagers, and it also makes the public suspicious of the platform’s content moderation.

Video: “Inside the effects of TikTok and social media on children” by ABC10. Source: https://www.youtube.com/watch?v=LwgSNQEUobU

Moreover, TikTok’s algorithmic approach to content moderation has clear limitations and blind spots. According to Zeng and Kaye (2022), TikTok’s moderation relies heavily on machine learning within its algorithmic governance, a method that can produce errors and uncertainty. For example, the number of Australian teenagers using TikTok increased during the pandemic, yet gaps in the platform’s content moderation allowed a trend of self-harm content to spread (Cunningham, 2023). Because the recommendation algorithm keeps serving similar content once teenagers show interest in it, exposure to suicidal content compounds and can fuel negative psychological states (Cunningham, 2023).
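The feedback loop Cunningham (2023) describes can be pictured with a deliberately simplified model: each time a user engages with a topic, that topic’s weight grows, so similar items dominate the next round of recommendations. The topics, weights and boost factor below are assumptions invented for illustration; this is not TikTok’s actual algorithm.

```python
from collections import Counter
from typing import Dict, List

def recommend(interest: Dict[str, float], topics: List[str], k: int = 3) -> List[str]:
    """Rank topics by the user's current interest weight and return the top k."""
    return sorted(topics, key=lambda t: interest[t], reverse=True)[:k]

def simulate_feedback_loop(steps: int = 5) -> Counter:
    topics = ["sport", "music", "self_harm", "comedy"]   # invented catalogue
    interest = {t: 1.0 for t in topics}                  # equal starting weights
    shown: Counter = Counter()
    for _ in range(steps):
        for topic in recommend(interest, topics):
            shown[topic] += 1
        interest["self_harm"] *= 1.5   # lingering on this content boosts its weight
    return shown

# After a few steps the engaged topic appears in every batch of recommendations.
print(simulate_feedback_loop())
```

Nothing in this ranking step knows the user’s age or the nature of the topic; engagement alone drives exposure, which is why Cunningham (2023) argues that neglected moderation can pull teenagers into “dark rabbit holes”.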

Are the Current Governance Models of Social Media Platforms Effective?

To some extent, the platforms’ algorithmic mechanisms and manual review are effective. However, the weaknesses of the algorithmic process and the priority platforms give to profit lead the public to question how well social media content moderation works. For example, most inappropriate posts are removed only after other users report them, and “removal” often simply hides the post from other users (Myers West, 2018). Content moderation therefore reduces the spread of offensive material, but it has clear limitations.

Existing social media “community rules” regulate how users communicate on platforms, but some moderation approaches still allow misleading information to reach the public. Many governments and members of the public are now concerned about the spread of harmful information to children and teenagers. Social media platforms should give more attention to content moderation and adopt special measures, such as AI tools, to provide “special governance” for children and teenagers. Conducting such governance presents both opportunities and challenges for platforms.
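One way to picture what “special governance” could look like in practice is an age-aware gate layered on top of ordinary moderation: content that passes the general rules may still be withheld from under-18 accounts if it carries sensitive labels. The labels and age threshold below are assumptions made for the example, not a mechanism proposed by any of the cited sources.

```python
from dataclasses import dataclass
from typing import Set

# Hypothetical sensitivity labels; a real system would derive these from
# classifiers and human review rather than hand-written tags.
RESTRICTED_FOR_MINORS = {"adult", "self_harm", "graphic_violence"}

@dataclass
class Content:
    content_id: int
    labels: Set[str]

def visible_to(user_age: int, item: Content) -> bool:
    """General moderation is assumed to have passed; this adds an age gate."""
    if user_age >= 18:
        return True
    return not (item.labels & RESTRICTED_FOR_MINORS)

post = Content(7, {"comedy"})
flagged = Content(8, {"self_harm"})
print(visible_to(15, post))     # True: nothing restricted for minors
print(visible_to(15, flagged))  # False: withheld from the under-18 account
print(visible_to(21, flagged))  # True: adult accounts are not gated
```

Gating by label rather than deleting content outright keeps material available to adult users while reducing minors’ exposure, which is the trade-off that age-based “special governance” implies.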

Conclusion

Social media gives people more channels to communicate. Users follow the “community rules” established by each platform, which to some extent limits the spread of negative information. For children and teenagers, social media platforms are a way to share ideas, build communities and gain a sense of identity from one another.

However, there are several problems with how social media platforms manage content, and these problems affect the physical and mental health of children and teenagers. Children and teenagers are more susceptible than adults to trends in social media content, especially those who are addicted to the internet. Therefore, social media platforms should adopt “special governance” for children and teenagers, such as restricting content based on user age or adopting new AI algorithms to moderate content.

References

ABC10. (2023, March 24). Inside the effects of TikTok and social media on children [Video]. YouTube. https://www.youtube.com/watch?v=LwgSNQEUobU

Armitage, R.C. (2021). Social media usage in children: an urgent public health problem. Public Health, 200, e2–e3. https://doi.org/10.1016/j.puhe.2021.09.011

Ashbridge, Z. (2022, December 21). How the TikTok algorithm works: Everything you need to know. Search Engine Land. https://searchengineland.com/how-tiktok-algorithm-works-390229

Cunningham, M. (2023, July 29). Australia lagging in protecting teens from ‘dark rabbit holes’ on TikTok. The Sydney Morning Herald. https://www.smh.com.au/national/australia-lagging-in-protecting-teens-from-dark-rabbit-holes-on-tiktok-20230728-p5ds1d.html

Darbinyan, R. (2022, June 14). The Growing Role Of AI In Content Moderation. Forbes Technology Council. https://www.forbes.com/sites/forbestechcouncil/2022/06/14/the-growing-role-of-ai-in-content-moderation/?sh=34c883e64a17

Gruzd, A., Soares, F. B., & Mai, P. (2023). Trust and Safety on Social Media: Understanding the Impact of Anti-Social Behavior and Misinformation on Content Moderation and Platform Governance. Social Media + Society, 9(3). https://doi.org/10.1177/20563051231196878

iPredator. (2017). National bullying prevention month-stop cyberbullying [Image]. Flickr. https://www.flickr.com/photos/69420045@N08/37624217491

Jerome, J. (2021, July 1). Opportunities and Challenges in Online Content Management. Common Sense Media. https://www.commonsensemedia.org/kids-action/articles/opportunities-and-challenges-in-online-content-management

Kao, K. (2021, March 30). Social media addiction linked to cyberbullying. UGATODAY. https://news.uga.edu/social-media-addiction-linked-to-cyberbullying/

Lavis, A., & Winter, R. (2020). Online harms or benefits? An ethnographic analysis of the positives and negatives of peer-support around self-harm on social media. The Journal of Child Psychology and Psychiatry, 61(8), 842–854. https://doi.org/10.1111/jcpp.13245

MacEntee, S. (2010). Social media [Image]. Flickr. https://www.flickr.com/photos/18090920@N07/5209796269

Mbevi, K. (2020, January 15). Impact of Social Media on Youth. [Video]. YouTube. https://www.youtube.com/watch?v=soHn6t_jjIw

McClure, P. (2023, August 15). Legislation is effective at moderating harmful social media content. New Atlas. https://newatlas.com/health-wellbeing/government-legislation-effectively-moderates-harmful-social-media-content/

Myers West, S. (2018). Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms. New Media & Society, 20(11), 4366–4383. https://doi.org/10.1177/1461444818773059

Suciu, P. (2022, June 28). Not Exactly News – Younger People More Likely To Trust What They Read On Social Media. Forbes Technology Council. https://www.forbes.com/sites/petersuciu/2022/06/28/not-exactly-news–younger-people-more-likely-to-trust-what-they-read-on-social-media/?sh=7230be6a677d

Surgeon General Advisory: Social Media Poses ‘Profound Risk of Harm’ to Kids. (2023, May 23). Psychiatrist. https://www.psychiatrist.com/news/surgeon-general-advisory-social-media-poses-profound-risk-of-harm-to-kids/

Thurman, N., & Obster, F. (2021). The regulation of internet pornography: What a survey of under-18s tells us about the necessity for and potential efficacy of emerging legislative approaches. Policy & Internet, 13(3), 415–432. https://doi.org/10.1002/poi3.250

Zeng, J., & Kaye, D. B. V. (2022). From content moderation to visibility moderation: A case study of platform governance on TikTok. Policy & Internet, 14(1), 79–95. https://doi.org/10.1002/poi3.287