Bullying, harassment, violent content, hate speech, pornography and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

“Social Media Landscape (redux)” by fredcavazza is licensed under CC BY-NC-SA 2.0.

Background

Since the emergence of social media and the continuous growth of the online market, people’s lives have become inseparable from the Internet, which is now fully integrated into modern society. On social media, users are no longer passive recipients of information but also producers of content. Users can therefore be beneficiaries of the Internet and, at the same time, aggressors, and some of the content they produce is becoming a problem for society. Social platforms and governments are trying to regulate this content, because harmful material challenges social rules and moral values. Managing users’ comments is a worldwide problem, since moderation is bound up with audiences’ free-speech ideology (Gillespie, 2020). Moreover, different states and cultures have such diverse social rules and knowledge backgrounds that it is hard to judge whether a given piece of content is harmful.

Why is there harmful content?

On social media, users inhabit a virtual space, and most speak anonymously. They feel no need to follow the norms and constraints of the real world, and because netizens rarely bear any responsibility for their own speech, their sense of responsibility and legal awareness diminishes; they express their thoughts and feelings emotionally, or beyond the bottom line. When browsing certain content, many netizens are in an irrational state, venting their views through reposts and verbal attacks on the poster. As more and more users accumulate negative emotions, online violence and other problematic content follow (Dhungana Sainju et al., 2021).

Who should be responsible for stopping the spread of this content?

Both governments and social platforms should be accountable to citizens.

The government needs to provide laws and regulations to protect users; even in a virtual world, the users are real (Napoli, 2019). Without legal restriction, anonymous users act with impunity, and loopholes in local supervision and enforcement are not enough to deter potential perpetrators. Government departments need to put pressure on social platforms so that they spend more energy and time on user management. At the same time, governments can enact clear laws defining what counts as problematic content and establish specialised network courts, so that users realise the Internet is not a lawless space.

Social platforms also need to take responsibility for problematic content. As the biggest beneficiaries, they should shoulder more social responsibility. A platform should not only authenticate users’ registration information and review the content they post, but also help victims, order offenders to stop publishing problematic content, and exclude illegal or problematic content from algorithmic recommendation. If a platform fails in its supervisory obligations, it should share liability with the publisher, so that the platform does not become a shield for problematic content.

The advantages and disadvantages of government and platform regulation

“Facebook-Changes-Motivational-Poster-Mark-Zuckerberg-Funny-3” by CleveredFool.com is licensed under CC BY 2.0.

Such regulation can introduce prejudice and bias. The Indian legal department advised Facebook moderators to mark all negative emotion and criticism about religion as potentially illegal content, yet according to legal scholars such speech is illegal only if it is aimed at inciting violence (Beirne, 2018). This reflects the inadequacy of government agencies as regulators: in order to control people’s thoughts, a government may censor content, leaving users unable to judge freely the views and positions of their local government.

However, the regulatory mechanisms of social platforms may be inconsistent with the regulatory requirements of the government, and social media companies may set their regulatory strategy according to their own preferences and interests, which can deviate from local policies or the public interest (Popiel, 2018). Platforms consider the company’s benefit before the users’ benefit, and self-supervision may even create more content problems, because platforms need engagement: they may deliberately push problematic content, or push it to more people. The New York Times reported that the Chinese government moved to rectify social media content because, in the enormous fan culture of platforms like Weibo, users were encouraged to take part in money-spending activities and various “friendly” exchanges. Many users suffered online bullying and hate content, as others slandered the stars they disliked and created all kinds of fake news. After the government’s rectification, the platforms’ content improved greatly, which not only eliminated problematic content but also improved the network environment for ordinary users.

If there are no regulators inside or outside social platforms, insiders may use their power for profit in exchange for content exposure, leading to social problems such as political parties bribing their way into social media algorithms to amplify negative information about rival candidates and suppress their own, thereby affecting election results (Deb et al., 2019). Moreover, a platform’s own regulators cannot cope with the differing regulatory requirements of governments around the world, and they lack the professional grasp of national ideologies needed to make independent decisions, as the Indian government’s requirements just mentioned illustrate.

Best solution 

The first step is to clarify which problems need to be managed. The government should set up a network supervision department to manage social platforms jointly with relevant departments, such as public security organs and market supervision offices. Then, with government and platform cooperation, an algorithm suited to national conditions can be built to identify and predict risk: comments sent by users would be intercepted, and the user prompted that the content is suspected of violating the rules. If the user insists on sending it, the content would not appear in front of other users but would remain visible only to the author, which does not affect the user’s freedom of speech.
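The intercept-and-limit flow described above might look like the following minimal sketch. The `Post` class, the risk threshold, and the toy word list are illustrative assumptions, not any real platform's API; in practice the scoring function would be a trained risk-identification model.

```python
# Hypothetical sketch of "intercept, warn, and publish self-only" moderation.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Visibility(Enum):
    PUBLIC = "public"
    SELF_ONLY = "self_only"


@dataclass
class Post:
    author: str
    text: str
    visibility: Visibility = Visibility.PUBLIC


RISK_THRESHOLD = 0.8  # assumed cut-off for "suspected violation"


def risk_score(text: str) -> float:
    """Stand-in for a trained risk-identification model (toy word list)."""
    flagged_terms = {"slur", "threat"}
    words = text.lower().split()
    hits = sum(1 for w in words if w in flagged_terms)
    return min(1.0, 5 * hits / max(len(words), 1))


def submit_post(author: str, text: str, insists: bool = False) -> Optional[Post]:
    """Intercept risky comments; if the user insists, publish self-only."""
    if risk_score(text) < RISK_THRESHOLD:
        return Post(author, text)       # published normally
    if not insists:
        return None                     # user is warned and withdraws the post
    # The user insists: the post is stored but visible only to its author.
    return Post(author, text, Visibility.SELF_ONLY)
```

The key design point is that a flagged post is never rejected outright: it is either withdrawn by the user after the warning or published with its audience reduced to the author alone.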

On the other hand, visibility can be divided by age, much like film classification. Platforms can share user data with government agencies so that platform and government supervise each other, reducing data leakage and strengthening the management of user groups. Reducing problematic content also reduces, from the user’s point of view, the problems it causes. A real-name system would reduce users’ sense of impunity and make it possible to trace responsibility and severely punish platforms that tolerate cyber violence. The last point is the selection and training of platform personnel: as supervisors of content, they must have a solid knowledge background and ideological awareness so as to greatly reduce bias in their work, and they must hold no preference for any political party. The government also needs to assign professionals commensurate with the scale of the service, and strengthen training and assessment to improve inspectors’ ability and quality.
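The age-rated visibility idea could be sketched as a simple feed filter. The tier names and minimum ages below are assumptions borrowed from film-classification conventions, not an existing platform scheme.

```python
# Hypothetical age-rated visibility filter, modelled on film classification.
from typing import List, Tuple

MIN_AGE = {"G": 0, "PG-13": 13, "R-18": 18}  # assumed rating tiers


def visible_to(user_age: int, rating: str) -> bool:
    """True if content with this rating may be shown to the user."""
    return user_age >= MIN_AGE[rating]


def filter_feed(user_age: int, posts: List[Tuple[str, str]]) -> List[str]:
    """Keep only (text, rating) posts whose rating the user's age permits."""
    return [text for text, rating in posts if visible_to(user_age, rating)]
```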

Even though cooperation between government departments and social media can improve the network environment, realising such supervision remains difficult. A government may unite with, or force, enterprises to monitor users’ lives, building a world with no private space in which the government controls citizens’ thoughts (Chung & Wihbey, 2022). And because social media platforms operate across the whole world, each state’s different ideology or laws mean that regulation may cause conflict.

Nevertheless, if the Internet is to have fair regulation today, social media must cooperate with government regulators.

References 

Beirne, A. (2018). 5 takeaways from Facebook’s leaked moderation documents. The New York Times. https://www.nytimes.com/2018/12/27/world/facebook-moderators-takeaways.html

Stevenson, A., Chang Chien, A., & Li, C. (2021). China’s celebrity culture is raucous. The authorities want to change that. The New York Times. https://www.nytimes.com/2021/08/27/business/media/china-celebrity-culture.html

Chung, M., & Wihbey, J. (2022). Social media regulation, third-person effect, and public views: A comparative study of the United States, the United Kingdom, South Korea, and Mexico. New Media & Society. https://doi.org/10.1177/14614448221122996

Deb, A., Luceri, L., Badawy, A., & Ferrara, E. (2019). Perils and challenges of social media and election manipulation analysis: The 2018 US midterms. arXiv.

Dhungana Sainju, K., Mishra, N., Kuffour, A., & Young, L. (2021). Bullying discourse on Twitter: An examination of bully-related tweets using supervised machine learning. Computers in Human Behavior, 120, 106735. https://doi.org/10.1016/j.chb.2021.106735

Gillespie, T. (2020). Content moderation, AI, and the question of scale. Big Data & Society, 7(2), 1–5. https://doi.org/10.1177/2053951720943234

Napoli, P. M. (2019). User Data as Public Resource: Implications for Social Media Regulation. Policy and Internet, 11(4), 439–459. https://doi.org/10.1002/poi3.216

Popiel, P. (2018). The Tech Lobby: Tracing the Contours of New Media Elite Lobbying Power. Communication, Culture & Critique, 11(4), 566–585. https://doi.org/10.1093/ccc/tcy027