Bullying, harassment, violent content, hate speech, pornography and other problematic content circulate on digital platforms. Who should be responsible for stopping the spread of this content, and how?

Instagram and other Social Media Apps By Jason A. Howie Under CC BY-NC 2.0

Today, digital platforms permeate and connect people’s lives; typical examples are Google, TikTok, Facebook and Airbnb. Digital platforms have changed the way people communicate, access information, consume and entertain themselves; they have brought huge economic benefits, and they are important tools for shaping public opinion in politics. But digital platforms also carry a great deal of “problematic content”.

Heavy lifting By JonathanCohen Under CC BY-NC 2.0


There are several reasons for the large amount of problematic information on digital platforms. First, digital platforms set no qualification requirements for users and offer a high degree of anonymity, which gives more people the right to speak but also provides more room for the spread of “problematic information”. High anonymity, low barriers to entry and technical accessibility make digital platforms an outlet for Internet users to express themselves; some users believe they are not responsible for their actions on the Internet and behave recklessly, for example by bullying others or publishing problematic information, while some organisations and political groups spread violent or socially divisive material, driving the spread of “problematic content” on digital platforms.

Secondly, problematic content spreads and escalates quickly on digital platforms. Digital platforms offer technological affordances such as sound and images, which make information more inflammatory (Majchrzak et al., 2013). In addition, comments and retweets facilitate secondary transmission and make users feel more involved. The spread of problematic content is not only the users’ fault: platform management is profit-driven, and platforms sometimes allow and even facilitate the dissemination of such information. In their early stages of development, some platforms actively publish “problematic content” in order to attract more users. Meanwhile, platform review mechanisms remain imperfect, and at the national level, because the Internet has developed so rapidly, regulatory and punishment mechanisms lag behind.


Cyberbullying, would you do it? By kid-josh Under CC BY-NC-SA 2.0 

Benadryl challenge on TikTok leads to teen’s death By 11 Alive Retrieved from https://www.youtube.com/watch?v=KZ3QqLgA_U4

The “problematic content” on digital platforms has negative effects in various respects. The distorted values found online can mislead young people who cannot yet distinguish right from wrong. For example, many good-looking, well-built young people on TikTok gain large followings and substantial income simply by dancing suggestively, which may lead teenagers to conclude that learning matters less than good looks. Moreover, extremely harmful “dangerous challenge” videos often go viral on TikTok, such as the Benadryl challenge, in which people film themselves hallucinating or collapsing after ingesting large amounts of Benadryl; the challenge has even led to the deaths of several teenagers.

Data show that most teens have experienced some form of cyberbullying, and young people who spend more time on social media each day are more likely to suffer from mental illness. For the platform itself, problematic content damages its reputation and drives away users. For society, content such as hate speech can deepen social division, which is detrimental to long-term development and peace; studies have shown that increases in online hate speech against minorities are positively correlated with offline violence against them. Regulating online problematic content is therefore essential.


Social Media Mix 3D Icons – Mix #2 By Visual Content Under CC BY 2.0

Platforms and ISPs should be responsible for “problematic content” because the platform is the most direct carrier of such information in the dissemination process. Beyond protecting their own reputation and long-term operation, platforms also bear a social responsibility (Gillespie, 2017).

Firstly, on the technical side, digital platforms should improve the effectiveness of their review mechanisms. They should set up and continuously improve sample databases of sensitive words, images and sounds, while taking care not to deprive users of their legitimate right to speak. Platforms could also set up a “rating system”: if automated detection finds that content is mildly sensitive, the platform reduces how widely it is pushed to users, then restores normal distribution once manual review passes it. Content containing sensitive words or a high proportion of sensitive images can be temporarily taken down and placed in a dedicated manual review queue, which should be processed as quickly as possible, and users whose content is downgraded should be given a reasonable explanation.
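The tiered review flow described above could be sketched as follows. This is a minimal illustration only: the thresholds, the `sensitivity_score` input (assumed to come from some automated classifier) and the action names are all hypothetical, not any real platform’s policy.

```python
from enum import Enum

class Action(Enum):
    PUBLISH = "publish"                  # no sensitive content detected: distribute normally
    REDUCE_PUSH = "reduce_push"          # mildly sensitive: limit distribution until manual review
    HOLD_FOR_REVIEW = "hold_for_review"  # highly sensitive: take down pending manual review

# Illustrative thresholds on a 0.0-1.0 sensitivity score (assumed values).
MILD_THRESHOLD = 0.3
SEVERE_THRESHOLD = 0.7

def triage(sensitivity_score: float) -> Action:
    """Map an automated sensitivity score to a moderation action."""
    if sensitivity_score >= SEVERE_THRESHOLD:
        return Action.HOLD_FOR_REVIEW
    if sensitivity_score >= MILD_THRESHOLD:
        return Action.REDUCE_PUSH
    return Action.PUBLISH
```

The point of the middle tier is that borderline content keeps circulating at reduced reach rather than being censored outright, which preserves users’ right to speak while the manual queue catches up.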

Second, platforms should increase punishments, for example by banning incorrigible users and those who exert a strongly negative social influence. A platform cannot condone the bad behaviour of certain internet celebrities just because they attract many users.

Third, in the era of Internet globalisation, online public opinion has become a new arena of international political contest, which generates much problematic content. Some foreign actors exploit the anonymity of the Internet to publish inflammatory material. China now discloses users’ IP locations on many mainstream digital platforms. Although a VPN can change one’s apparent IP address, the policy does reduce problematic content to a certain extent: users think more critically about content posted from foreign IPs, and the policy at least raises the cost of hiring an Internet “water army” of paid posters.

Fourth, platforms should add a detailed user-education step: clearly define “problematic content” in the platform’s terms of use, and make reading this definition an unskippable part of registration, which also prevents offenders from defending themselves by claiming ignorance. Similarly, during registration the platform should thoroughly implement a real-name system and rate each user’s level of identity verification, giving users with low verification levels fewer exposure opportunities so that they cannot post socially harmful “problematic content” under relative anonymity.
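One simple way to tie exposure to identity verification, as suggested above, is to weight how widely a user’s posts are distributed by their verification level. The levels and weights below are illustrative assumptions for the sake of the sketch, not any platform’s actual scheme.

```python
# Hypothetical mapping from identity-verification level to an exposure
# multiplier applied to a user's posts (assumed values, for illustration).
EXPOSURE_BY_VERIFICATION = {
    "anonymous": 0.2,  # no verification: minimal reach
    "phone": 0.6,      # phone number verified: partial reach
    "id": 1.0,         # government ID verified: full reach
}

def exposure_weight(verification_level: str) -> float:
    """Return the distribution multiplier for a user's verification level."""
    # Unknown levels fall back to the most restrictive weight.
    return EXPOSURE_BY_VERIFICATION.get(verification_level, 0.2)
```

Falling back to the most restrictive weight for unrecognised levels keeps the default behaviour cautious rather than permissive.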


Handcuff and Locked With Smart Phone
By Jangra Works Under CC BY 2.0 

The government cannot indulge platforms’ behaviour just because platforms bring significant economic benefits.

Firstly, the government should strengthen its supervision of digital platforms and ISPs. Digital platforms should not be allowed to publish “problematic information” in order to attract users. In addition, the government should monitor internet celebrities and media agencies with large audiences, as the information they publish usually reaches huge numbers of viewers.

Secondly, the government can cooperate with technology agencies, authorising them to develop security plug-ins that identify, intercept and report problematic information promptly, and promote them to a wide range of users on a voluntary basis. China already has a similar app, the National Anti-Fraud Center app, but the government needs strong authority to prevent the information such a plug-in collects from being appropriated by technology companies for commercial use (Liu, 2021). Although China’s strict online content regulation has been criticised by some humanitarians, there are two sides to everything; it is undeniable that the regulation has been very effective in combating telecom fraud and has largely curbed the spread of “problematic content”.

Thirdly, the government should improve the law, as the scope of “problematic information” is sometimes vague, and provide more detailed and complete guidelines and requirements for digital platforms. Mature laws prevent criminals from exploiting legal loopholes. The government should also strengthen enforcement, for example by training more specialised cyber-technology personnel, such as cyber “police”, to serve the government.

International multistakeholder organisations 

The Importance of Multi-Stakeholder Internet Governance By USAandEurope under CC BY 2.0 Retrieved from: https://www.youtube.com/watch?v=qPLooC-Srzk

The Internet connects people all over the world, so criminals can also operate from anywhere in the world. But countries differ in their awareness and models of Internet governance, which makes law enforcement far more difficult, and relying on platforms and individual governments alone is not enough. It is essential to practise a multi-stakeholder Internet governance framework and to strengthen cooperation among Internet stakeholders, both to help clear the obstacles to Internet monitoring within countries and to prevent individual countries from using regional hegemony to interfere with public opinion in other countries (Who Makes the Internet Work: The Internet Ecosystem, 2014). International multistakeholder organisations can also help resolve the difficulties the splinternet poses for enforcing the law across borders (Lemley, 2021).


Digital platform providers should implement a real-name system, strengthen user education on platform guidelines, use technical measures to make problematic-content monitoring more effective, and increase punishments for users who publish problematic content. The government should strengthen cooperation with digital platforms and ISPs, but must also monitor and punish them, even though they bring economic benefits. International multistakeholder organisations also matter under the splinternet, coordinating the problems caused by differing awareness and governance models across international boundaries.

Reference list

11Alive. (2020). Benadryl challenge on TikTok leads to teen’s death [Video]. YouTube. https://www.youtube.com/watch?v=KZ3QqLgA_U4

Gillespie, T. (2017). Governance of and by platforms. In The SAGE Handbook of Social Media (pp. 254–278).

Lemley, M. A. (2021). The splinternet. Duke Law Journal, 70(6), 1397+. https://link.gale.com/apps/doc/A666103930/AONE?u=usyd&sid=bookmark-AONE&xid=86275a95

Liu, L. (2021). The Rise of Data Politics: Digital China and the World. Studies in Comparative International Development, 56(1), 45–67. https://doi.org/10.1007/s12116-021-09319-8

Majchrzak, A., Faraj, S., Kane, G. C., & Azad, B. (2013). The Contradictory Influence of Social Media Affordances on Online Communal Knowledge Sharing. Journal of Computer-Mediated Communication, 19(1), 38–55. https://doi.org/10.1111/jcc4.12030

USAandEurope. (2014, May 6). The Importance of Multi-Stakeholder Internet Governance [Video]. YouTube. https://www.youtube.com/watch?v=qPLooC-Srzk

Who Makes the Internet Work: The Internet Ecosystem. (2014). Internet Society. https://www.internetsociety.org/internet/who-makes-it-work/