The first decisions of Facebook’s independent Oversight Board make Facebook’s judgment look good by comparison. Who saw that coming?
The Board has in essence said that it is OK to insult Muslim men as a group — yet not Azerbaijanis — and that freedom of expression justifies spreading medical misinformation. By what logic does that make for a better Facebook, a better internet, and a better world?
The problem is that the Oversight Board is interpreting Facebook’s community standards, which are intended to guide moderators and algorithms in their decisions on what posts to take down. The rules are not — as my friend Jasper Jackson put it — fit for purpose as the basis of interpretation and enforcement by a court of ultimate authority, the Board.
I have said again and again (and again and again and again) that Facebook — and other technology companies (and journalistic enterprises) — need to establish and be held accountable to Constitutions, Bills of Rights, North Stars (call them what you will) to act as a covenant of mutual obligation with users, customers, and the public, answering the fundamental question, “Why are we here?”
Because Facebook does not have that higher-level expression of principles, the Oversight Board is left to judge its moderation decisions against the company’s nitty-gritty statutes on one end, or on the other, overly broad concepts like “hate speech” and “human rights,” with nothing in between. The Board acted like Supreme Court strict constructionists without a Constitution to call upon, so it depended on the exact wording of statutes to set bad precedents that will make bad policy.
The Board said that criticizing Muslim men did not rise to the standard of “hate speech.” If only Facebook had a principle — an article in a Bill of Rights — that said it expected users to respect each other as individuals and as members of groups of many identities, then it would have been impossible, in my view, for Facebook, the Board, or the community of users to condone a post that says there is “something wrong with Muslims psychologically.” As the organization Muslim Advocates said: “Facebook’s Oversight Board bent over backwards to excuse hate in Myanmar — a country where Facebook has been complicit in a genocide against Muslims.”
As for the medical disinformation: The Board said that a post endorsing hydroxychloroquine as a COVID-19 treatment did not rise to Facebook’s standard of “imminent physical harm,” because one needs a prescription to get it. Good Lord. We saw in the United States how Donald Trump inspired people to get the drug — and ignore other precautions — risking the health of themselves and others. The Board properly criticizes Facebook for some of its guidelines being too broad. But in this case, the guideline is too specific and created a loophole that allowed the Board to require — require! — Facebook to post medical misinformation. The Board suggested Facebook could have taken other steps, like adding context — but unfortunately, experience and data have shown that fact-checks of misinformation tend to amplify the misinformation. This is not about free expression and debate; there are no two sides to this — medicine has spoken. This decision ill informs, ill serves, and endangers the public.
I am glad — relieved — that after the Board’s decisions, Monika Bickert, Facebook’s head of content policy, said the company would still stick by science: “We do believe, however, that it is critical for everyone to have access to accurate information, and our current approach in removing misinformation is based on extensive consultation with leading scientists, including from the CDC and WHO. During a global pandemic this approach will not change.”
The problem with much of this discussion about bad shit online is that the bad shit then monopolizes our attention. Look at the news: The Q conspirators are still getting much more attention on cable news, their messages amplified every day, while the Black women of Georgia who did so much to save our election and our democracy go unheard (exactly what they feared and foretold: that they would be exploited for this victory and then their circumstances and issues would be ignored). This is what comes of a journalism that focuses on the bad and a debate — I say a moral panic — about the net that obsesses on the awful. Every intervention we see is to find something more to forbid, until one day we’ll be done. Not. Thus Facebook’s community standards are expressed in the negative, as statutes, as commandments: Thou shalt not. What about: Thou shalt?
How could we express our expectations in the positive? If I could get a bunch of Facebook executives in a room with a whiteboard, I would start by asking them why Facebook exists. What is it here to do? How do you want its presence to be a positive influence in the world? How would you like people to treat each other? What might you expect them to accomplish? “A connected world is a better world” is fine and I agree (not everyone would), but that’s a bumper sticker, not a Constitution. I would thus press them to express Facebook’s raison d’être. At a less high-falutin’ level, I’d ask who Facebook wants at its garden party and how those guests should be expected to behave. Out of this discussion might come principles such as users being expected to treat each other with respect. And then I’d ask what the company warrants to foster and support such an atmosphere. Perhaps out of that comes Facebook’s promise to follow science. Statutes — the community standards — should be based on these principles. Oversight Board decisions should call on these principles. Regulators should expect data from the company to hold it accountable to these principles (this is the basis of the regulatory framework proposed by a high-level working group of which I was a part and which I endorse).
But such a covenant does not exist. So users, moderators, engineers writing algorithms, the Board, regulators, and media are left to interpret and enforce a set of rules posted on the playground.
I now dread the Oversight Board’s upcoming decision on whether Facebook should reinstate Donald Trump. I fear they will call upon freedom of expression — even of a white-supremacist authoritarian ruler inciting violence and rebellion to tear down the sacred institutions of a democracy — and have little more to go on than Facebook’s vague description of what it may do in cases of incitement and violence. I further fear how other heads of state will use this decision, even if Facebook does not, as a precedent. As I said in an earlier post, I am concerned that Germany, the EU, the UK, and most worryingly Poland are contemplating forcing platforms to carry their speech. I will repeat: Compelled speech is not free speech.
I wish to stop a cycle of reaction: user does something new and bad; Facebook reacts by creating a rule against it; the next time a user does something similar a moderator reacts by taking it down; the user reacts by appealing to Facebook and the Board; the Board reacts by ruling according to the statute, and so on. Jane, stop this crazy thing.
Facebook has already reinstated the posts the Board ordered it to reinstate (including one in a case about a Hermann Goering quote and another involving naked breasts in the context of cancer, which Facebook had already put back up). Facebook must now react in turn, working out how to enforce the Board’s enforcement of its statutes.
I hope instead that Facebook will use this opportunity to see the weakness of its community standards as the basis for governing the behavior of communities and users online and in society. I hope they will not just sit with someone like me in a room with a whiteboard but will call upon the communities to help draw up their own standards and will work with academics and civil society to imagine a better Facebook in a better world and the principles that would undergird that. I further hope that the Oversight Board will stand back and ask whether by ruling according to the letter of inadequate law it is making Facebook and the world better or worse. I hope for a lot.
Disclosure: Facebook has funded activities at my school regarding journalism and disinformation.