Problematic content on your network? How to make it better

No violence no hate speech by John S. Quarterman is licensed under CC BY 2.0

Digital platform companies need to take social responsibility

The growth of social media has spawned a new set of institutional practices in recent years, and a growing number of digital platform users are questioning the right of platform companies to control online speech. The responsibility of platforms stems from the fact that social media business models and algorithms decide which content is shown and which is hidden. Internet platforms, including Facebook and Twitter, facilitate largely unfettered freedom of expression for millions of people. At the same time, they restrict speech according to “community standards”, both to provide a safe online environment for users and to gather more user data to sell to third-party service companies. This dual role gives Internet giants absolute power over digital online governance, surpassing even most governments’ control over citizens’ personal information (Lee, 2020). Concerns about platform regulation, perceived social media bias, and discussions of socially significant events, including elections, combine to form a complex nexus of social activity on digital platforms. According to Riedl et al. (2021), approximately seventy percent of Americans believe that social platforms may censor socially damaging public discourse such as terrorist propaganda, child pornography, and hate speech. Both governments and online activists are questioning how effective content review is as a tool of platform regulation.

“Addicted to social media and Internet” by Smriti Rai is licensed under CC0

The community-driven online media space has always been moderated in some way, but the growth in the volume of information has required media platforms to develop more sophisticated moderation techniques. Content moderation refers to governance mechanisms that structure participation in a community to promote collaboration and prevent the abuse of rights, and it can provide greater transparency and specific safeguards. Content moderation on digital platforms comes in two primary forms: automated and manual information processing. Automated content review relies on machine learning techniques that match content against particular words or images drawn from known domains. Because automated review depends on comparison with existing data, the content needs to be canonical, or at least already known to the machine, before the system can act on controversial material (Gerrard & Thornham, 2020). At the macro level, manual review is a concrete negotiation over which content counts as controversial. The third-person effect (TPE) is strongly associated with manual review: when individuals exposed to mass media messages perceive that content to be more influential or persuasive for others than for themselves, they make decisions based on their expectations of that media influence. For instance, people may support limiting or censoring media messages to protect others who might be affected, a stance that can be motivated by emotional considerations such as sympathy. Bringing TPE into content censorship systems shows that, for content such as disinformation or cyberbullying, citizens hold stronger third-person perceptions and judge the impact on others to be greater than the impact on themselves (Riedl et al., 2021). A censorship system led by manual review therefore largely controls what users can do and say on the platform. Balancing the structural aspects of information processing on platforms with the shaping of discourse about specific content is thus the key to content review.
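To make the distinction between the two forms of review concrete, the sketch below is a minimal, hypothetical illustration, not any platform’s actual system and far simpler than the machine learning pipelines Gerrard and Thornham (2020) describe. It shows why automated review only works on content that is already “known”: it can match incoming text against a list of banned terms, or an image against fingerprints of previously removed media, but anything it has not seen before must be escalated to a human moderator.

```python
import hashlib

# Hypothetical, simplified example lists; real platforms maintain far larger,
# continuously updated databases of banned terms and hashes of removed media.
BANNED_TERMS = {"example banned phrase", "another known slur"}
KNOWN_BAD_IMAGE_HASHES = {"5ab2f8a4d1c0e9b7"}


def image_fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an image so it can be compared with previously removed content."""
    return hashlib.sha256(image_bytes).hexdigest()[:16]


def automated_review(text: str, image_bytes: bytes = b"") -> str:
    """Flag content only when it matches something the system already knows about."""
    lowered = text.lower()
    if any(term in lowered for term in BANNED_TERMS):
        return "remove: matches a known banned term"
    if image_bytes and image_fingerprint(image_bytes) in KNOWN_BAD_IMAGE_HASHES:
        return "remove: matches a previously removed image"
    # Anything novel or ambiguous falls outside the known data and needs a human.
    return "escalate to manual review"


if __name__ == "__main__":
    print(automated_review("an ordinary post about today's news"))
    # -> "escalate to manual review": the machine cannot judge content it has never seen
```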

 

Moderating content while recognizing the importance of freedom of expression

Freedom of expression is usually recognized as a human right in Western countries and has a certain priority as a major civil right. Typically, measures used for platform regulation take freedom of expression as their main reference point while assigning a strict regulatory framework to most online actors. In fact, the conflict between freedom of expression and public rights is part of the public sphere in the practice of digital culture (Almeida et al., 2021). Freedom of expression has long been seen as an umbrella by platform companies because, in the United States for example, the First Amendment prevents Congress and government departments from enacting any law that restricts free speech, and many other Western countries have similar protections. The Supreme Court has stated that this provision constrains only government actors, not private businesses or individuals, meaning that the federal government and other branches cannot take effective action when online actors use online media to spread harmful information. In a way, the law limits the government’s interference in social media but does not limit the behavior of private groups, including the media platforms themselves (Targeted News Service, 2022).

Transparency is only a starting point: platform companies reveal few, if any, of the specific procedures they use for content moderation to users. The line between what counts as a violation, whether committed by politicians or by ordinary citizens, always seems to be blurred. Many scholars have criticized the opaqueness of digital media vetting procedures, and platforms have been reluctant to expose their operational weaknesses to public scrutiny. Content moderation is not perfect, and democratic institutions and freedom of expression can sometimes become flashpoints for social events. For example, in the national boycott that followed the murder of George Floyd, large numbers of businesses pulled all of their advertising from Facebook, demanding that the platform agree to stronger content moderation to stop the spread of hate speech and misinformation and to remove racially charged speech such as white supremacy and Holocaust denial. More than three dozen Facebook employees resigned and expressed their disappointment with the actions taken by the company. Moreover, Zuckerberg admitted to making mistakes in content moderation, and militia groups promoted a campaign on Facebook asking for “patriots willing to take up arms and defend our city from the evil thugs”. Seventeen-year-old Kyle Rittenhouse was charged with multiple counts of homicide after he traveled from Illinois to Kenosha and shot and killed two protesters with an AR-15 rifle. While it is unclear whether Rittenhouse followed the militia’s Facebook page, nearly 1,000 people responded that they intended to take part in such events, and another 4,000 said they were “interested” (Lee, 2020). Events like these place Facebook in a social dilemma: media platforms are expected to remain neutral in order to reconcile conflicting interests, yet where those interests intersect with their own, decisions are likely to be influenced by outside factors, and competing content-moderation goals can lead audiences with their own subjective emotions to perceive those decisions as biased.

“Facebook login page” by SAIF is licensed under CC0.

 

What “Open Data” can do

The open data principle is another way to reduce the amount of problematic content entering the public space. Algorithmic network effects rely on the availability of large data sets for machine learning. Access to such data is a competitive advantage, and sharing it can lower barriers to entry and allow new entrants to challenge the entrenched position of the tech giants. Open data is, however, a highly intrusive regime that carries risks for user privacy, so data exchange can only take place in a controlled environment, with pre-approved entities and under regulatory oversight. Since the 1980s, liberalization and lax regulation have fragmented the traditional network industries; mandatory data openness in these fragmented industries is both an effective regulatory measure for network administrators and government agencies and a prerequisite for allowing new entrants to compete against existing network effects. Open data enhances information transparency and can help realize the principle of interoperability across old and new network industries (Montero & Finger, 2021). The interoperability obligation of digital networks serves as a regulatory tool for media platforms and plays a positive role in limiting the publication of negative information and promoting the liberalization of the online industry.

“Backlinks Are Dead” by Sukhjinder is licensed under CC0.

 

Overall, the new ways of interacting with information that digital platforms have created also bring new social problems. Social media has loosely framed its business model as a tool for users to share the information they generate every day, trying to avoid responsibility for the impact of any questionable content on the platform. Content moderation and data openness, as effective digital regulatory tools, can to some extent limit the flow of misinformation. The Internet, as a relatively new industry, still has a long way to go.

References

Almeida, V., Filgueiras, F., & Doneda, D. (2021). The ecosystem of digital content governance. IEEE Internet Computing, 25(3), 13-17. https://doi.org/10.1109/MIC.2021.3057756

Gerrard, Y., & Thornham, H. (2020). Content moderation: Social media’s sexist assemblages. New Media & Society, 22(7), 1266-1286. https://doi.org/10.1177/1461444820912540

Lee, E. (2020). Moderating content moderation: A framework for nonpartisanship in online governance. The American University Law Review, 70(3), 913-1059.

Montero, J., & Finger, M. (2021). Regulating digital platforms as the new network industries. Competition and Regulation in Network Industries, 22(2), 111-126. https://doi.org/10.1177/17835917211028787

Riedl, M. J., Whipple, K. N., & Wallace, R. (2021). Antecedents of support for social media content moderation and platform regulation: The role of presumed effects on self and others. Information, Communication & Society, 25(11), 1632-1649. https://doi.org/10.1080/1369118X.2021.1874040

Targeted News Service. (2022). Congressional Research Service: ‘Online Content Moderation & Government Coercion’. https://www-proquest-com.ezproxy.library.sydney.edu.au/docview/2664063955?pq-origsite=primo