Can platform auditing guide public opinion?

I. Introduction

Platform vetting has become a crucial, unavoidable part of social media life. The rules of vetting are already laid out in the user guidelines that users accept when registering for a platform account. Yet these vetting tools have sparked a complex debate about freedom of expression, information filtering and transparency. Many users believe that the content they receive is rich and authentic thanks to the platform’s censorship. In reality, however, the power to define the boundaries of a user’s speech is handed to the media, which dampens dissenting voices and places the media in control of a massive cultural force (Roberts, 2019). So, does platform vetting influence social opinion? This paper takes an in-depth look at vetting tools on social media, analysing their positive and negative impacts, and draws on content diversity theory, freedom of speech theory and social constructionism to interpret them.

II. The platform may block some non-mainstream voices


Measures taken by platforms in their audits may reduce users’ freedom of expression. Protecting expression and building healthy communities is the stated purpose of platform vetting; in practice, however, these rules restrict the speech of some users and bar others from the community altogether. Because the rules are applied opaquely, the risk of vetting power being abused increases, and legitimate non-mainstream voices may be blocked for no apparent reason.

Example

In 2021, social media accounts related to the LGBTQ movement operating on Chinese campuses were blocked and deleted. One of the account operators stated that the accounts had built reliable links with a wide range of people, but once barred from public social media discourse they could communicate only through private communities. Ni and Davidson (2021) bluntly state that cutting off the ability to pass on information makes it more difficult for sexual minorities to advocate for their rights, which fuels prejudice and resistance from a broader range of people.

“LGBT” by David Michael Morris is licensed under CC BY-SA 2.0.

“This blockade restricts the mere expression of views and the exercise of the right to freedom of expression,”

——said Ned Price, a spokesman for the U.S. State Department.

For practical reasons, social media platforms may need rules that can be followed and that protect users. In practice, however, many topics are routinely flagged as politically or culturally sensitive and removed, while the platforms’ policy teams publish no reasonably clear guide to why removals are made and offer no satisfactory justification when decisions are disputed under user or public scrutiny. Despite their attempts to appear neutral and to serve the platform environment and the quality of speech, such definitions are necessarily subjective, blurred and incomplete, propped up by logical principles and borrowed value systems to legitimise their imposition (Roberts, 2019). This runs counter to the principles many platforms proclaim: open participation, unfettered interaction and the protection of speech.

As one Facebook manager worried: “If you exist to make the world more ‘open and connected’ and you’re a content-sharing platform, the key question to answer is, why are you deleting anything, right? Because deleting stuff reduces openness and makes the world worse.”

This behaviour clearly ignores the diversity of content. Napoli (2017) argues that social media tries to create a utopia in which information filter bubbles confine users to their own worlds, where they see only the dominant, unified viewpoints while other voices are marginalised. The aim is to reduce political pressure on the platform and the negative impact of controversial topics.

III. The platform’s right to censor may steer public opinion

The censorship function leads users to trust platforms blindly. As more users become aware of a platform’s censorship mechanism, they place greater trust in the content the platform pushes to them. Yet platform censorship can be shaped by the internal political and commercial interests of the platform companies, which want users to believe that the platform aligns with their goals and values.

Example

Trump’s Twitter account was blocked in 2021 for posting inflammatory comments. Twitter also issued a statement: “After close review of recent Tweets from the @realDonaldTrump account and the context around them, we have permanently suspended the account due to the risk of further incitement of violence.” By Twitter’s logic, then, blocking Trump’s account was simply a routine operation carried out in accordance with the platform’s regulations. In a commentary, the Sunday Times stated that “blocking the social media account of a sitting president is, however one looks at it, an attack on ‘free speech’” (Allyn & Keith, 2021).

President Trump’s Twitter account, @realDonaldTrump, has been permanently suspended, the company announced. Twitter/Screenshot by NPR

U.S. cyberspace governance once included net neutrality rules, under which users could choose which services to access according to their own wishes and network providers were not allowed to show favouritism. The Trump administration repealed those rules, after which firms like Twitter and Facebook could direct, restrict, and influence users’ access as they wished, monitoring, screening, and deleting opinion information (Khimm, 2017). This provided legal backing for the blocking.

Rachel Maddow reports on the FCC’s vote to end net neutrality rules that forced internet service providers to treat all content equally.

Facing the incoming Biden administration, technology companies with substantial technological power and resource advantages, such as Twitter and Facebook, inevitably weighed their relationship with the new administration in order to secure a larger space for future development and brighter prospects; steering public opinion to win favour followed naturally and logically. In this light, the blocking of Trump’s account was an “unauthorised” move by Twitter, Facebook and other market entities, a censorious blocking made possible by the power given to media platforms (Allyn & Keith, 2021).


So, at the level of value rationality, Twitter’s censorship blocking raises concerns about the manipulation of public opinion. As Gillespie (2019) states, users want to believe that the platform aligns with their goals and values. Moderation guidelines therefore reveal not only a platform’s stance on a particular issue but also the anxieties and assumptions of the platform providers and the challenges they face as curators of public discourse.

A private company should not change its basic attributes as a market entity, nor should it be able to usurp political influence by becoming an unauthorised adjudicator of “freedom of speech”. The banning of Trump’s tweets reflects an invisible appropriation of speech by the technology oligarchs, and shows that a democracy steered by public opinion faces a grave crisis in the information age.

IV. Platform auditing is an excellent tool for maintaining the social environment


However, some believe that platform auditing does not steer the audience’s thinking. A reasonable auditing mechanism delivers a sound ideology to the public, and content auditing is a realistic need for safeguarding society’s public and long-term interests under new technological conditions.

Example

During a pandemic, people face the unknown with fear, and all kinds of speculation and rumour spread rapidly on the Internet. The Chinese government successfully created a law-governed network environment by controlling the dissemination of information through the auditing mechanisms of network platforms. During China’s lockdowns, microblogging became not only a channel for people to follow the development of the epidemic and express personal opinions but also an essential tool for Internet users seeking help with healthcare and other issues (Han et al., 2020; Luo et al., 2020). The massive volume of social media data generated during the epidemic thus reflected how information spread across social networking platforms. Roberts (2019) argues that these details reflected perceptions of a major adverse public event and became crucial for improving the emergency response. By using platform auditing to manage negative information on the Internet, the Chinese government minimised panic in public sentiment and mitigated the unintended chaos and erosion of public trust during the pandemic.

Admittedly, the control exercised through auditing does play a role in maintaining public ideology and a favourable social environment. However, this paper argues that when news and public opinion threaten the ruling class and its ideological security, the ideological character of media manipulated by power is exposed. Under new technological conditions, media opinion that rests on only a particular class or group of people can hardly gather the strength needed for society’s stable development.

On the one hand, platform administrators promise not to be “Big Brother”, that is, not to censor user content excessively, in order to protect users’ privacy and right to free expression. On the other hand, they emphasise that users should behave carefully on the Internet, implying that users are responsible for what they say and do.

Reference list

Allyn, B., & Keith, T. (2021, January 8). Twitter Permanently Suspends Trump, Citing “Risk Of Further Incitement Of Violence.” NPR.org. https://www.npr.org/2021/01/08/954760928/twitter-bans-president-trump-citing-risk-of-further-incitement-of-violence

Gillespie, T. (2019). Custodians of the Internet. Yale University Press. https://doi.org/10.12987/9780300235029

Khimm, S. (2017, December 22). Trump’s regulatory rollbacks, from net neutrality to campus sex assault. NBC News. https://www.nbcnews.com/politics/white-house/trump-s-regulatory-rollbacks-net-neutrality-campus-sex-assault-n830926

Napoli, P. M. (2017). What If More Speech Is No Longer the Solution? First Amendment Theory Meets Fake News and the Filter Bubble. Federal Communications Law Journal, 70(55). https://www.semanticscholar.org/paper/What-If-More-Speech-Is-No-Longer-the-Solution-First-Napoli/07a069f12fce08bf3c66d5b9866082db7b74bbd1

Ni, V., & Davidson, H. (2021, July 8). Outrage over shutdown of LGBTQ WeChat accounts in China. The Guardian. https://www.theguardian.com/world/2021/jul/08/outrage-over-crackdown-on-lgbtq-wechat-accounts-in-china

Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media. Yale University Press. https://ebookcentral-proquest-com.ezproxy.library.sydney.edu.au/lib/usyd/detail.action?docID=5783696&pq-origsite=primo#

Shi, W., Zeng, F., Zhang, A., Tong, C., Shen, X., Liu, Z., & Shi, Z. (2022). Online public opinion during the first epidemic wave of COVID-19 in China based on Weibo data. Humanities and Social Sciences Communications, 9(1). https://doi.org/10.1057/s41599-022-01181-w
