How to Balance Users’ Freedom of Speech with the Handling of Extreme Speech

Social media platforms have become essential communication channels, and many people now rely on them for daily news instead of mainstream media (Sobaih et al., 2020). In response, these platforms have established rules aimed at curbing misuse. Such rules have raised the question of whether platform owners apply them to favor specific content or viewpoints, which makes it important to strike a balance between freedom and restriction. Although these platforms value freedom of speech, they also recognize that they must regulate content and determine its suitability for users. It is therefore vital to explore how community guidelines, community notes, and content restriction can help balance freedom of speech with the handling of extreme speech.

The first way to balance the two goals is through community guidelines that inform users of what is acceptable and what is not. Platforms such as Twitter, Facebook, and Google have created these rules to protect both their users and themselves against malicious individuals. Maddox and Malson (2020) note that community guidelines are meant to act as a barrier that prevents users from misusing the platforms. YouTube’s child safety policy is one example of such a guideline.

“A Child Watching YouTube” by uncleboatshoes is licensed under CC BY-NC 2.0.

However, Aguerri et al. (2023) note that in some cases these community guidelines are created to shield platform owners from legal repercussions. For instance, Aguerri et al. (2023) discuss the case of Donald Trump, whose accounts were banned by numerous platforms after the Capitol attack on the grounds that they incited violence; the authors question whether private enterprises such as social media platforms should hold this kind of power over freedom of speech. Opponents of community guidelines argue that these private entities sometimes apply such tools in favor of particular political sides. Ng (2020), for example, describes the emergence of cancel culture, which sometimes targets people solely on the basis of their political or ideological leanings even when they have not violated any guidelines. Mueller (2021) shows how community guidelines are weaponized through cancel culture on social media, with users ganging up on an account simply because they disagree with its ideology. In response, platform owners must craft community guidelines that are objective and go beyond political ideology. Moderation decisions should rest on the specifics of a complaint rather than on whether a user or post sits on a particular side of politics. Adopting this approach would create a more balanced environment on social media and minimize the weaponization of community guidelines, allowing users to share their views freely, without fear of cancel culture, while staying within the platform’s rules.

“internet” by x61.sh is licensed under CC BY 2.0.

A second way to create balance on these platforms is the adoption of community notes. Community Notes is a feature of Twitter, now known as X; the platform piloted the idea as Birdwatch in 2021 and expanded it under Elon Musk after his acquisition (X, 2023). The idea behind the notes is to let users collectively vet content and evaluate its authenticity, with the aim of producing a more balanced and objective assessment of content. As the previous paragraph showed, platforms sometimes regulate content that does not actually violate their standards, occasionally simply because it attacks one side in politics. Community Notes responds by giving more power to the users (X, 2023): contributors come together to evaluate the legitimacy of information that could be misleading or harmful, and a note appears alongside a post once contributors from different perspectives rate it helpful. Proponents believe this feature could become the start of a stronger policy for evaluating the content created on these platforms. Although some people oppose the feature in the belief that it could be manipulated, it has already added corrections to numerous posts that could have misled users. In the age of artificial intelligence, it is increasingly hard to authenticate information on social media, so such notes can become good tools for flagging misleading or harmful information (X, 2023). If other platforms adopted similar approaches, it could become easier to regulate extreme speech while offering users more freedom to post without fear that their comments will be flagged even when they have not violated community guidelines.

“Twitter” by petesimon is licensed under CC BY 2.0.

Another way to create balance on social media is to restrict the reach of specific content. As platforms strive to create opportunities for expression among content creators and other users, they must also recognize their power to regulate what is presented to consumers, and restriction is one approach that has been shown to work. YouTube is well known for restricting content. Southerton et al. (2020) share the experiences of LGBTQ content creators who found their YouTube videos being restricted and who argued that the restriction was driven by their sexuality rather than by what they posted. Their experience illustrates how restriction works on YouTube: the platform can limit how many people see a given piece of content. The same technique can be applied to extreme content. Rather than letting recommendation algorithms promote such material, YouTube can restrict it; if it does not clearly violate community guidelines, the platform can simply limit its reach so that it is suggested to far fewer people. Opponents might argue that, as Southerton et al. (2020) show, content can be restricted even when it violates no guidelines. In response, YouTube moderators must analyze content objectively, relying on tools that restrict only genuinely extreme material rather than content that merely opposes a political or social ideology. Adopted this way, restriction helps platforms regulate content while still letting users create without fear of being banned or canceled.

“youtube JPG” by renaissancechambara is licensed under CC BY 2.0.

In conclusion, this essay has explained some of the ways in which social media platforms can balance freedom of expression with the handling of extreme content. The first is reliance on community guidelines: although these guidelines are sometimes misused, implementing them objectively can protect users from extreme content while still enabling people to create, as YouTube’s handling of child safety violations illustrates. The second is the adoption of community notes, a technique used on Twitter that regulates content by turning users into moderators. The third is restricting some content from reaching large audiences, which protects viewers while still allowing people to create content freely.

“The Internet is here” by PJ Davis is licensed under CC BY 2.0.

References

Aguerri, J. C., Miró-Llinares, F., & Gómez-Bellvís, A. B. (2023). Consensus on community guidelines: An experimental study on the legitimacy of content removal in social media. Humanities and Social Sciences Communications, 10(1), 1–11. https://doi.org/10.1057/s41599-023-01917-2

Maddox, J., & Malson, J. (2020). Guidelines Without Lines, Communities Without Borders: The Marketplace of Ideas and Digital Manifest Destiny in Social Media Platform Policies. Social Media + Society, 6(2). https://doi.org/10.1177/2056305120926622

Mueller, T. S. (2021). Blame, then shame? Psychological predictors in cancel culture behavior. The Social Science Journal, 1–14. https://doi.org/10.1080/03623319.2021.1949552

Ng, E. (2020). No Grand Pronouncements Here…: Reflections on Cancel Culture and Digital Media Participation. Television & New Media, 21(6), 621–627. https://doi.org/10.1177/1527476420918828

Sobaih, A. E. E., Hasanein, A. M., & Abu Elnasr, A. E. (2020). Responses to COVID-19 in higher education: Social media usage for sustaining formal academic communication in developing countries. Sustainability, 12(16), 6520. https://doi.org/10.3390/su12166520

Southerton, C., Marshall, D., Aggleton, P., Rasmussen, M. L., & Cover, R. (2020). Restricted modes: Social media, content classification and LGBTQ sexual citizenship. New Media & Society, 23(5). https://doi.org/10.1177/1461444820904362

X. (2023). About Community Notes on X. X Help Center. https://help.twitter.com/en/using-x/community-notes