
Oversight Board (Meta)

The Oversight Board is a body that makes consequential precedent-setting content moderation decisions on the social media platforms Facebook and Instagram, in a form of "platform self-governance".

History
Founding

In November 2018, after meeting with Harvard Law School professor Noah Feldman, who had proposed creating a quasi-judiciary to oversee content moderation on Facebook, CEO Mark Zuckerberg approved the creation of the board. Among the board's goals were to improve the fairness of the appeals process, provide oversight and accountability from an outside source, and increase transparency. Between late 2017 and early 2018, Facebook had hired Brent C. Harris, who had previously worked on the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling and as an advisor to non-profits, as the company's Director of Global Affairs. Harris led the effort to create the board, reporting to Nick Clegg, who reported directly to Zuckerberg. Harris credited Clegg's involvement, saying that efforts to establish the board "wouldn't have moved absent Nick's sponsorship" and that the project was "stalled within the company until Nick really took it on".

In January 2019, Facebook received a draft charter for the board and began a period of public consultations and workshops with experts, institutions, and people around the world. In June 2019, Facebook released a 250-page report summarizing its findings and announced that it was looking for people to serve on a 40-person board (the board ultimately launched with 20 members). In January 2020, it appointed British human rights expert and former Article 19 executive director Thomas Hughes as Director of Oversight Board Administration, and said that board members would be named "in the coming months". In February 2025, it was announced that Daniel P. Eriksson was taking over as Executive Director, moving from his role as CEO of Transparency International.

Activity

On May 6, 2020, Facebook announced the 20 members that would make up the Oversight Board.
Facebook's VP of Global Affairs and Communications Nick Clegg described the group as having a "wide range of views and experiences", collectively living in "over 27 countries" and speaking "at least 29 languages". However, a quarter of the group and two of the four co-chairs are from the United States, which some free speech and internet governance experts viewed with concern. The board started accepting cases on October 22, 2020, and officially began to cover cases related to Threads in May 2024.

Earliest decisions and actions

On January 28, 2021, the board ruled on five moderation decisions made by Facebook, overturning four of them and upholding one. All but one of the rulings were unanimous. Each ruling was decided by a majority vote of a panel of five board members, including at least one member from the region where the moderated post originated. One of the cases concerned a post that contrasted terrorist attacks in France in response to depictions of Muhammad with an asserted relative silence by Muslims in response to the persecution of Uyghurs in China; in reviewing Facebook's decision to remove the post, the board sought a re-translation of it.

"Zwarte Piet" Blackface Decision

On April 13, 2021, the board upheld the removal of a Facebook post by a Dutch user containing a 17-second video of a child and three adults wearing traditional Dutch "Sinterklaas" costumes, including two white adults dressed as Zwarte Piet (Black Pete), with faces painted black and wearing Afro wigs. The board found that although the cultural tradition is not intentionally racist, the use of blackface is a common racist trope.

Ban of Donald Trump

Facebook's deplatforming of U.S. President Donald Trump was not among the board's initial decisions, as the board was still collecting comments from the public.
On January 6, 2021, amid the attack on the U.S. Capitol while Congress was counting the electoral votes, Trump posted a short video to social media in which he praised the rioters, even while urging them to end the violence, and reiterated his baseless claim that the 2020 presidential election was fraudulent. Several platforms, including Facebook, removed the video, with Facebook's vice president of integrity, Guy Rosen, explaining that it "contributes to rather than diminishes the risk of ongoing violence". That day, Facebook also blocked Trump's ability to post new content; the next day, Facebook said the block would remain in place at least until the end of Trump's term on January 20.

On April 16, 2021, the board announced that it was delaying its decision on whether to overturn Trump's suspensions from Facebook and Instagram until sometime "in the coming weeks" in order to review the more than 9,000 public comments it had received. Notably, incoming board member Suzanne Nossel had published an op-ed in the Los Angeles Times on January 27, 2021, titled "Banning Trump from Facebook may feel good. Here's why it might be wrong", but a spokesperson announced that she would not participate in the deliberations over Trump's case and would spend the upcoming weeks in training.

In its ruling, the board criticized the open-ended nature of the suspension, specifying that Facebook's standard procedures involve either a timed ban or a complete removal of the offending account, and stating that Facebook must follow a "clear, published procedure" in the matter. On June 4, 2021, Facebook announced that it had changed the indefinite ban to a two-year suspension, ending on January 7, 2023. Trump's Facebook account was reinstated in March 2023, with Meta saying the public should be allowed to hear from politicians, but that Trump would be subject to "heightened penalties" for repeated violations of its rules.
XCheck Program

In September 2021, the board announced that it would review Facebook's internal XCheck system, which fully exempted some high-profile users from certain platform rules and subjected the posts of other, less prominent enrolled users only to a separate content review. The program operated as a separate system and queue intended for around 5.8 million users. The board's quarterly report, issued on October 21, 2021, stated that the company had not been transparent about the XCheck program and had not provided the board with complete information upon which to conduct a review. The board also noted that the company's lack of transparency with users about the reasons for content deletion was unfair. In response, the company stated that it would aim for greater clarity in the future.
Enabling documents
As the Oversight Board is not a tribunal, court of law, or quasi-judicial body, it is not guided by enabling legislation created by any government. Instead, a corporate charter, bylaws, and a series of governing documents set out the scope and powers of the board.
Governance
In order to ensure the board's independence, Facebook established an irrevocable trust with $130 million in initial funding, expected to cover operational costs for over half a decade. The board is able to hear appeals submitted by both Facebook and its users, and Facebook "will be required to respond publicly to any recommendations". The entire Oversight Board is overseen by the Oversight Board Trust, which has the power to confirm or remove board appointees and to ensure that the board is operating in accordance with its stated purpose.

Members

The charter provides for future candidates to be nominated for board membership through a recommendations portal operated by the U.S. law firm Baker McKenzie. The 20 members of the Oversight Board were announced on May 6, 2020. The co-chairs, who selected the other members jointly with Facebook, are former U.S. federal circuit judge and religious freedom expert Michael McConnell, constitutional law expert Jamal Greene, Colombian attorney Catalina Botero-Marino, and former Danish Prime Minister Helle Thorning-Schmidt. On April 20, 2021, PEN America CEO Suzanne Nossel was appointed as the board's newest member, replacing Pamela S. Karlan, who had resigned in February 2021 to join the Biden administration. The United States has the most substantial representation, with five members, including two of the four co-chairs of the board. Two board members come from South American countries, six come from countries across Asia, three come from Africa (including one with both African and European ties, who also counts toward the three coming from Europe), and one comes from Australia.
Responses
Facebook's introduction of the Oversight Board elicited a variety of responses, with St. John's University law professor Kate Klonick describing its creation as a historic endeavor and the technology news website The Verge deeming it "a wild new experiment in platform governance". Politico described it as "an unapologetically globalist mix of academic experts, journalists and political figures". Other critics expressed doubts that the board would be effective, leading to the creation of an unrelated and unaffiliated group of "vocal Facebook critics" calling itself the "Real Facebook Oversight Board". Facebook issued no official comment on the effort, while Slate described it as "a citizen campaign against the board". Legal affairs blogger Evelyn Douek noted that the board's initial decisions "strike at matters fundamental to the way Facebook designs its content moderation system and clearly signal that the FOB does not intend to play mere occasional pitstop on Facebook's journey to connect the world".