== Founding ==
In November 2018, after meeting with
Harvard Law School professor
Noah Feldman, who had proposed the creation of a quasi-judiciary on Facebook to oversee
content moderation, CEO Mark Zuckerberg approved the creation of the board. Among the board's goals were improving the fairness of the appeals process, providing oversight and accountability from an outside source, and increasing transparency. Between late 2017 and early 2018, Facebook had hired
Brent C. Harris, who had previously worked on the
National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling, and as an advisor to non-profits, to become the company's Director of Global Affairs. Harris led the effort to create the board, reporting to
Nick Clegg, who reported directly to Zuckerberg. Harris also credited Clegg's involvement, saying that efforts to establish the board "wouldn't have moved absent Nick's sponsorship", and that it was "stalled within the company until Nick really took it on". In January 2019, Facebook received a draft charter for the board and began a period of public consultations and workshops with experts, institutions, and people around the world. In June 2019, Facebook released a 250-page report summarizing its findings and announced that it was in the process of looking for people to serve on a 40-person board (the board ultimately had 20 members). In January 2020, it appointed British
human rights expert and former
Article 19 Executive Director Thomas Hughes as Director of Oversight Board Administration. It also said that board members would be named "in the coming months". In February 2025 it was announced that
Daniel P. Eriksson was taking over as Executive Director, moving from his role as CEO of
Transparency International.
== Activity ==
On May 6, 2020, Facebook announced the 20 members that would make up the Oversight Board. Facebook's VP of Global Affairs and Communications
Nick Clegg described the group as having a "wide range of views and experiences", collectively living in "over 27 countries" and speaking "at least 29 languages". However, a quarter of the group and two of the four co-chairs are from the United States, which some free speech and internet governance experts expressed concerns about. The board started accepting cases on October 22, 2020. It officially began to cover cases related to
Threads in May 2024.
== Earliest decisions and actions ==
On January 28, 2021, the board ruled on five moderation decisions made by Facebook, overturning four of them and upholding one. All but one of the rulings were unanimous. Each ruling was decided by a majority vote of a panel of five board members, including at least one member from the region where the moderated post originated. One of the posts contrasted terrorist attacks in France in response to depictions of Muhammad with an asserted relative silence by Muslims in response to the persecution of Uyghurs in China. In reviewing Facebook's decision to remove the post, the board sought a re-translation of it.
== "Zwarte Piet" Blackface Decision ==
On April 13, 2021, the board upheld the removal of a post by a Dutch Facebook user containing a 17-second video of a child and three adults wearing traditional Dutch "
Sinterklaas" costumes, including two white adults dressed as
Zwarte Piet (Black Pete), with faces painted black and wearing Afro wigs. The board found that although the cultural tradition is not intentionally racist, use of blackface is a common racist trope.
== Ban of Donald Trump ==
Facebook's
deplatforming of U.S. President Donald Trump was not among the initial decisions, as the board was still collecting comments from the public. On January 6, 2021, amid
an attack on the Capitol while Congress was counting the electoral votes, Trump posted a short video to social media in which he praised the rioters even as he urged them to end the violence, and reiterated his baseless claim that the
2020 presidential election was fraudulent. Several platforms, including Facebook, removed it, with Facebook's vice president of integrity, Guy Rosen, explaining that the video "contributes to rather than diminishes the risk of ongoing violence". That day, Facebook also blocked Trump's ability to post new content; the next day, Facebook said the block would remain at least until the
end of Trump's term on January 20. On April 16, 2021, the board announced that it was delaying the decision on whether to overturn Trump's suspensions on Facebook and Instagram to sometime "in the coming weeks" in order to review the more than 9,000 public comments it had received. Notably, on January 27, 2021, incoming board member Suzanne Nossel had published an
op-ed in the
Los Angeles Times titled "Banning Trump from Facebook may feel good. Here's why it might be wrong", but a spokesperson announced that she would not participate in the deliberations over Trump's case and would spend the upcoming weeks in training. The board specified that Facebook's standard procedures involve either a timed ban or a complete removal of the offending account, stating that Facebook must follow a "clear, published procedure" in the matter. On June 4, 2021, Facebook announced that it had changed the indefinite ban to a two-year suspension, ending on January 7, 2023. Trump's Facebook account was later reinstated in March 2023, with Meta saying the public should be allowed to hear from politicians, but that Trump would be subject to "heightened penalties" for repeated violations of its rules.
== XCheck Program ==
In September 2021, the board announced it would review Facebook's internal XCheck system, which fully exempted high-profile users from some of the platform's rules and regulations, and partially exempted less high-profile users, whose posts were subject only to Facebook's content review. The program was a separate system and queue, intended for only around 5.8 million users. The board's quarterly report, issued on October 21, 2021, stated that the company had not been transparent about the XCheck program and had not provided the board with complete information on which to base a review. The board also noted that the company's lack of transparency with users about the reasons for content deletion was unfair. In response, the company stated that it would aim for greater clarity in the future.

== Enabling documents ==