In an era where digital platforms hold significant influence over public discourse, the way these platforms are regulated has never been more crucial. Social media platforms emerged from the vast complexity of the internet. These platforms offer unique opportunities for worldwide interaction but come with their own set of challenges (Gillespie, 2018). Flew, Martin, and Suzor (2019) propose that external regulation, in the form of new laws and regulations, may be necessary to effectively govern Facebook and other digital communication platforms. This paper explores the creation of the Facebook Oversight Board, its role, its real-world implications, and the ongoing debate surrounding self-regulation versus external regulation of giant tech platforms such as Facebook.
Section 1: The Emergence of Facebook’s Oversight Board
The massive scope and influence of Facebook, with over 2.5 billion monthly active users and around 275,000 pieces of content processed daily, make thorough content monitoring and regulation a challenging task (Douek, 2019). From 2004 to early 2010, Facebook lacked both a dedicated professional content moderation team and a concrete content moderation policy. As the platform’s user base grew and diversified globally, and the range of content expanded, the initial approach of relying on a handful of non-expert moderators and internal staff to curate content policies proved insufficient.
A stark illustration of the challenges faced was the tragic assassination of Jo Cox, a British Labour MP, in 2016. Subsequent investigations by the UK House of Commons Home Affairs Committee revealed the prevalence of online hate, abuse, and extremist content targeting public figures on major platforms, including YouTube, Facebook, and Google.
This content disproportionately targeted women and racial or ethnic minorities, underscoring the dire need for more robust content regulation (Jones, 2019). Roberts (2019) emphasized the pivotal role of moderation, stating, “Moderation and screening are critical processes that safeguard the brand integrity of platforms. They ensure user compliance with site rules, adherence to legal mandates, and sustain a user base enthusiastic about engaging with content.” In the wake of such complexities, and in pursuit of safeguarding online free speech, Facebook instituted the Oversight Board (Klonick, 2020). This entity guides Facebook through intricate decisions about content removal, retention, and rationale, with the overarching aim of making the platform safer, more credible, and more trustworthy. The Oversight Board embodies a commitment to upholding the right to free expression through autonomous judgment.
Section 2: Real-World Implications
Globally, social media content censorship strategies are the focus of widespread public concern, especially when it comes to sensitive and controversial content. Breast cancer awareness photography is a case in point of a topic that has attracted widespread attention and controversy. In 2020, a Brazilian woman attempted to raise awareness of the disease by posting related photos on Instagram (Binder, 2021). Facebook’s automated system initially removed the post for allegedly violating community standards on adult nudity and sexual content. The Oversight Board determined that the post was aimed at “raising awareness about breast cancer,” which falls under an exception permitted by the policy, and reversed Facebook’s decision to take down the Instagram post. In this case, Facebook’s automated review raised significant concerns about the protection of human rights.
The Oversight Board took a proactive stance: it not only reviewed the decision but also made a series of recommendations to improve Facebook’s content review mechanisms. These recommendations covered key areas such as increased transparency, refinements to automated and manual content review, strategies to safeguard users’ due process rights, and consideration of cultural and social context in content review. In particular, the Oversight Board emphasized the need for greater care and coordination when dealing with content involving women and members of racial and ethnic minorities. This is not only because these groups are commonly targeted by online attacks, but also because their rights and voices are often ignored or marginalized in society (Leibowitz, 2003).
The actions and guidance of the Oversight Board highlight its positive role in content censorship strategy. It not only provides a fair and impartial adjudication mechanism, but also offers valuable guidance on how to better protect user rights, respect cultural diversity, and improve the quality of content review.
Section 3: Regulatory Challenges and Considerations
(i) Doubts about the independence of the oversight board
The Facebook Oversight Board, like any relatively new initiative with significant implications for public discourse, has faced its fair share of criticisms and concerns, for example over the independence of the Oversight Board and its process of case selection. In discussing the public acceptance and criticism of the Oversight Board, we can examine Facebook’s Content Governance and Enforcement Blueprint and the Oversight Board Trust in more detail.
The blueprint requires review teams to conduct reviews following the published Facebook Internal Enforcement Guidelines, to minimize biased decisions and guarantee uniform handling of analogous situations (Zuckerberg, 2018). However, while this approach helps improve transparency and consistency, it raises concerns about whether the review team relies too heavily on the guidelines and lacks sufficient freedom of judgment. Some detractors might also question whether such guidance limits flexibility in dealing with complex and gray-area issues. Chapter 2, Article 2 of the Oversight Board Trust provides Facebook with a waiver of all fiduciary rights, subject to some key provisions and special exceptions (Zuckerberg, 2018). This raises questions about the true independence of the Oversight Board, as the document does not clearly define which provisions would implicate fiduciary rights. It could lead to public concern that, in certain circumstances, the Oversight Board might become an extension of Facebook rather than a fully independent entity.
These considerations highlight the challenges Facebook faces in implementing content governance and the public’s skepticism about the platform’s management. The Oversight Board’s independence and freedom of decision-making are key to realizing its mission, and ongoing review and discussion of these issues are therefore critical.
(ii) Lack of transparency in the selection of cases by the Oversight Board
Openness and transparency about the Oversight Board’s review criteria, the composition of its review panels, and other elements are prerequisites for guaranteeing due process.
In 2018, Facebook published its Community Standards and content policy so that users would have some expectation of the consequences of their actions (Bickert, 2018). In terms of transparency, the Charter moved away from the previous practice of concealing the content policy and its implementation: it clarified the structure of the Oversight Board and its handling procedures, and requires the Board to explain its decisions and publish a timely annual report showing the number and types of cases, its policy recommendations to Facebook, and the status of their implementation. Nonetheless, there are still aspects of the Oversight Board that are not transparent. For instance, although the identities of case review members are generally known to the public, the specific experts within a panel operate anonymously to ensure their safety and impartial judgment. The Oversight Board is expected to detail in its annual report the quantity and nature of the cases it reviews, as well as the selection criteria; however, in the first quarterly report of 2023 (Rosen, 2023), the selection process and criteria remain undisclosed.
The primary role of the Oversight Board is to safeguard free speech on the platform while assessing Facebook’s contentious choices, aiming to decrease incorrect deletions and validate the platform’s content moderation methods. The Facebook Oversight Board represents an experiment in self-regulation amid growing concerns about digital platforms’ influence and responsibility. Although the Oversight Board’s setup has not been perfect so far, it has played a positive role in exploring the path of platform governance.
References
Bickert, M. (2018, April 24). Publishing our internal enforcement guidelines and expanding our appeals process. Meta. https://about.fb.com/news/2018/04/comprehensive-community-standards/
Binder, M. (2021, October 29). Facebook’s oversight board makes bizarre ruling in its first group of decisions. Mashable. https://mashable.com/article/facebook-oversight-board-first-descisions
Douek, E. (2019). Facebook’s “Oversight Board”: Move fast with stable infrastructure and humility. North Carolina Journal of Law & Technology, 21(1), 1–33. https://scholarship.law.unc.edu/ncjolt/vol21/iss1/2
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33–50. https://doi.org/10.1386/jdmp.10.1.33_1
Gillespie, T. (2018). All platforms moderate. In Custodians of the internet (pp. 1–23). Yale University Press. https://doi.org/10.12987/9780300235029-001
Jones, H. (2019). More in common: the domestication of misogynist white supremacy and the assassination of Jo Cox. Ethnic and Racial Studies, 42(14), 2431–2449. https://doi.org/10.1080/01419870.2019.1577474
Klonick, K. (2020). The Facebook Oversight Board: Creating an independent institution to adjudicate online free expression. The Yale Law Journal, 129(8), 2418–2499. https://sydney.primo.exlibrisgroup.com/permalink/61USYD_INST/2rsddf/cdi_rmit_agispt_search_informit_org_doi_10_3316_agispt_20200704032853
Leibowitz, F. (2003). “Images” of the Female and of the Self: Two Recent Interpretations by Women Authors. Hypatia, 18(4), 283–291. https://doi.org/10.1111/j.1527-2001.2003.tb01423.x
Roberts, S. T. (2019). Understanding commercial content moderation. In Behind the screen (pp. 33–72). Yale University Press. https://doi.org/10.12987/9780300245318-003
Rosen, G. (2023, June 13). Integrity and transparency reports, first quarter 2023. Meta. https://about.fb.com/news/2023/05/integrity-and-transparency-reports-q1-2023/
Zuckerberg, M. (2018, November 15). A blueprint for content governance and enforcement. Facebook. https://perma.cc/ZK5C-ZTSX