The decision will instead be made by the Facebook Oversight Board, an independent body often described as a kind of Supreme Court for Facebook. The board’s decision will be announced Wednesday.
If the idea of a Supreme Court for a social network leaves you with a lot of questions, well, you’re not the only one. Below, some frequent questions and answers about the board to help you get up to speed.
What is the Oversight Board?
The board is an independent, court-like entity for appealing content decisions on Facebook-owned platforms. It’s made up of 20 experts in areas like free expression, human rights, and journalism.
Content moderation decisions — for instance, removing or not removing a particular post — made by Facebook and Instagram can be appealed to the board once users have gone all the way through the company’s internal review process. Facebook says that decisions made by the board are final.
Facebook first announced its intention to form an independent entity to vet content decisions in November 2018. After some delay, the company announced in October 2020 that the board would begin to hear cases.
Who is on the board?
Included among the 20 current members of the board are notable individuals from around the world, including Helle Thorning-Schmidt, former prime minister of Denmark; Alan Rusbridger, former editor-in-chief of The Guardian; and Tawakkol Karman, a Nobel Peace Prize laureate who promoted non-violent change in Yemen during the Arab Spring, a movement in which social media played an important role.
But the board is just going to do whatever Facebook wants, right?
Nope. The board is designed to be independent of Facebook, according to its charter. Facebook funds a trust that, in turn, funds the board. The trustees are “responsible for safeguarding the independence” of the board. Critics of the company argue the board is not truly independent and is a “Facebook-paid, Facebook-appointed body created by Facebook to use to launder its most politically sensitive decisions.”
Suzanne Nossel, a Facebook Oversight Board member and CEO of the free expression organization PEN America, told CNN Business last week, “Obviously, Facebook has its own motives in this. Let’s be clear. They’re a profit-making enterprise. They wouldn’t have done this if they didn’t think it was good for business. They have taken some steps in putting money in a trust and creating an independent set of trustees that oversee the board itself. And so there are some efforts to make it genuinely independent.”
“Whether those go far enough, whether circumstances arise that test or challenge those parameters, we’ll have to see, but I think it’s crucial, if the board is going to play any kind of useful role, that that independence be absolutely respected,” she added.
Some — perhaps many — decisions the board makes may ultimately not be what Facebook would want, or might put the company in some uncomfortable positions. But however the board rules, Facebook does get the benefit of some cover on the most difficult content questions.
Does Facebook have to do what the board says?
A decision made by the board “will be binding and Facebook will implement it promptly, unless implementation of a resolution could violate the law,” according to the board’s charter.
Remind me, what happened to Trump’s Facebook account?
Trump had access to his Facebook and Facebook-owned Instagram accounts cut off on January 7, a day after the deadly insurrection in Washington DC. Zuckerberg wrote at the time, “We believe the risks of allowing the President to continue to use our service during this period are simply too great.”
What happens after the board makes a decision on Trump’s Facebook account?
The board’s decision will be announced at 9 a.m. ET on Wednesday. Its ruling on whether Trump should be allowed back on the platform is supposed to be binding, meaning Facebook is expected to carry out whatever the board decides.
The board will publish its decision on its website along with an explanation of how it reached the decision. It will not, however, make public who on the board voted which way; the board members are not supposed to reveal that information themselves either.
The board says it received more than 9,000 public responses on what to do about Trump’s account, and it is expected to publish a sample of those responses as well.
What cases has the board taken on before this?
In its first set of rulings in January, the board overturned some decisions Facebook had made.
In one case, Facebook had removed a post from a user in Myanmar who had shared two photos of a Syrian toddler of Kurdish ethnicity who drowned attempting to reach Europe in 2015. The text accompanying the photo, according to the board’s description, said there was “something wrong with Muslims (or Muslim men) psychologically or with their mindset.” (Rohingya Muslims have been persecuted in Myanmar.)
Facebook removed the post due to its hate speech policies. The board overturned that decision.
In an explanation of the decision posted to its website, the board said, “[W]hile the post might be considered pejorative or offensive towards Muslims, it did not advocate hatred or intentionally incite any form of imminent harm. As such, the Board does not consider its removal to be necessary to protect the rights of others.”
You can read the full decisions here.
— Brian Fung and Kaya Yurieff contributed reporting.