Trump v. Facebook

Facebook has decided to ask its new, independent Oversight Board to rule on its decision to suspend Donald Trump indefinitely. The Board will be able to make a binding determination regarding Trump, telling Facebook it was right or wrong, and Facebook and Instagram will obey. Trump will be free to submit a statement to the Board within two weeks.

Though the question is specific to Trump, it will undoubtedly have larger impact as other government officials — in Germany, the EU, the UK, and most worryingly Poland — are complaining about platforms being able to take down the accounts of heads of state. I am equally — no, more — worried about governments thinking they can or should compel anyone, platforms or publishers, to carry their speech.

With this move, Facebook has certainly upped the ante with its Oversight Board. The first cases selected by the Board from users and sent to it by Facebook were, well, obscure. That’s not surprising. All sides of this polygon wanted to test this new institution and see how it would work. But this — the matter of Trump v. Facebook — is the case of cases. Before the Board was fully in operation, back in June, I urged Mark Zuckerberg to call them in on the question of Trump. I’m glad they’re doing it now.

When Facebook folk told me about this move, they said the company believed it did the right thing by taking down Trump. I agree. Then why appeal to the Board? Because, they said, they recognize this is a momentous decision being made inside a private enterprise and they understand the need for more perspective and accountability. Said Facebook’s VP for policy and communication (and former deputy prime minister of the UK) Nick Clegg:

Our decision to suspend then-President Trump’s access was taken in extraordinary circumstances: a U.S. President actively fomenting a violent insurrection designed to thwart the peaceful transition of power; five people killed; legislators fleeing the seat of democracy. This has never happened before – and we hope it will never happen again. It was an unprecedented set of events which called for unprecedented action.

In making our decision, our first priority was to assist in the peaceful transfer of power. This is why, when announcing the suspension on January 7th, we said it would be indefinite and for at least two weeks. We are referring it to the Oversight Board now that the inauguration has taken place.

The risks are many. Ubiquitous Facebook skeptics across media will likely accuse the company of wimping out, even though it already made the tough call. Governments will use whatever is said to fuel their fears.

Let me for a moment fuel my own fears: I do not want a society in which a government can outlaw the ability of platforms to choose what they do and do not carry (precisely what Poland is planning). Compelled speech is not free speech! I do not believe that platforms are media — an argument for another day — but if we stipulate for the moment that they are similar, then can you imagine a government in a free and enlightened nation walking into the office of an editor (of The Washington Post, The Guardian, the BBC, Die Zeit, El Pais, Le Monde, Gazeta Wyborcza) and ordering that the publication must carry the words of an official (or, as in Italy, a fascist)? I pray Europeans especially would understand why this idea, given its precedents in history, is dangerous.

I also worry that in seeking others — the Oversight Board, legislators, or regulators — to make its decisions, Facebook is engaging in regulatory capture. Clegg concedes: “Whether you believe the decision was justified or not, many people are understandably uncomfortable with the idea that tech companies have the power to ban elected leaders. Many argue private companies like Facebook shouldn’t be making these big decisions on their own. We agree.” Facebook can afford to deal with the legal medicine balls thrown its way by governments; new, small entrants into the net cannot. I want to see Facebook defend freedom of expression on the net for all.

In this process, I hope that Facebook decides to be as open and transparent as possible. I want to hear how they made the decision to take down Trump in the first place. I want to see data about the impact Trump’s incendiary and insurrectionist words had on users. I want to hear that they understood and debated key issues. I would like to think they listened to experts and perspectives — especially those of academics who research these matters — outside the company. I want them to be held accountable to do just that. It is not sufficient for Facebook to hand the Oversight Board a binary hot potato: Trump online or Trump offline? This is a nuanced and difficult discussion. I hope the Oversight Board sees it that way and returns a decision that looks at the many questions the case raises.

Again, Facebook is obligating itself to follow the decision of the Board only in the matter of Trump; the case is limited. Fine. What I find more valuable than the decision is the discussion. What precedents are set here for other situations in other countries? Last week, a journalist called me to discuss whether the Trump decision sets a precedent for taking down Ayatollah Khamenei based on human rights violations in Iran. Certainly this is a discussion that should be had in the Philippines — ask my friend Maria Ressa — in Myanmar, in Turkey, and elsewhere. Platforms must not become the outlets of governments, especially not autocrats and tyrants.

Twitter has been transparent with media about the process that led it to take down Trump; see stories in The Washington Post and The New York Times. I have met the company’s head of policy, Vijaya Gadde, as well as Jack Dorsey and the company’s staff working in safety, and I am impressed with their good will and judgment. I have more faith the more I hear of their decision-making. The same goes for every technology company. I have argued that Facebook, Twitter, Google — and, indeed, every journalistic enterprise — should establish covenants, North Stars, Constitutions (call them what you will) with the public and be held accountable for following them through transparency (I was part of the working group that recommended a regulatory and legal framework to do just that).

The internet is, at long last, the outlet for citizens, especially those too long not heard in mass media. This is our press. When we abuse it — whether as citizens or as heads of state — the platforms have the right and the responsibility to moderate us (this is why I am a staunch believer in Section 230) but governments should not control our speech (this is why I am a First Amendment absolutist).

These are indeed big questions as we decide together what standards the net — Facebook, Twitter, Google in the specific but the internet and society on the whole — should set in relation to speech and to power. The more discussion we have about these difficult issues, the better. For we are a society relearning how to hold a conversation with ourselves after half a millennium in Gutenberg’s thrall (that is the book I’m writing). This won’t be quick.

The Board will have 90 days to decide.

Disclosure: Facebook has funded activities at my school regarding journalism and disinformation. I receive nothing personally from any platform.