Group 3
Xi Yu 500336166
Sirui Li 510166294
Jocelyn Tong 500401972
Example 1:
Governments take the lead in preventing the dissemination of problematic content. In general, a government protects the online environment by expressly prohibiting such dissemination, enacting laws and policies, and combating violent extremism and terrorism online. For example, to stop people from misusing the Internet to spread propaganda and incite terrorist acts, Canada joined other government and business leaders from around the world in adopting a global commitment to remove terrorist and violent extremist content online, supporting initiatives such as Moonshot, Mediasmart, YWCA, and GIFCT that analyze and counter terrorist content, and hosting an online summit to combat violent extremism. Spontaneous order and community rules alone cannot effectively prevent the spread of such content in open public spaces such as Internet platforms and websites. Most platforms cooperate with the government, and their censorship and banning of content align with the government-led direction of anti-extremism and anti-violence. The government therefore assumes primary responsibility for addressing the dissemination of problematic content on the Internet and gives platforms and online service providers a direction in which to establish a censorship order.
Figure 1: GIFCT, created by Facebook, Google, Twitter, and Microsoft, brings together professionals from business, government, civil society, and academia to develop and carry out joint solutions to violent extremist and terrorist use of the Internet.
Example 2:
Here is an example of content governance at Facebook in 2018. Facebook pays close attention to the safety of its services and the community environment of its digital platforms, whether the problem is terrorism, bullying, or threatening speech, and it has proposed a series of solutions and a blueprint for implementing them. First, Facebook sets community standards to regulate user behaviour, and moderators review content and images in strict accordance with these guidelines.
Second, Facebook introduced artificial intelligence to proactively identify potentially problematic posts in the online community and remove them as quickly as possible. This technology allows harmful content to be flagged before users even encounter it.
Third, Facebook argues for an independent governance model. By delegating decisions about speech control to independent organizations, Facebook aims to ensure accountability and prevent over-centralization of decision-making.
Example 3:
Internet users who perpetrate online violence should be held directly liable, and Internet service providers should bear indirect liability for online violence when they fail to take the necessary measures in a timely manner.
Reference list:
Government of Canada. (2022, March 16). Addressing violent extremist and terrorist use of the internet. https://www.publicsafety.gc.ca/cnt/bt/cc/vti-en.aspx
Zuckerberg, M. (n.d.). A Blueprint for Content Governance and Enforcement. Facebook. https://m.facebook.com/nt/screen/?params=%7B%22note_id%22%3A751449002072082%7D&path=%2Fnotes%2Fnote%2F&refsrc=deprecated&_rdr