The Facebook Oversight Board is now open for cases and I look forward to seeing the results. But I have the same question I’ve had since the planning for its creation began, and I asked that question in a web call today with board leadership:
What higher principles will the Board call upon in making its decisions? It will be ruling on Facebook’s content decisions based on the company’s own statutes — that is, the “community standards” Facebook sets for the community.
The Board says it will also decide cases on the basis of international human rights standards. This could mean the board might find that Facebook correctly enforced its statute but that the statute violates a principle of human rights, which would result in a policy recommendation to Facebook. Good.
But there remains a huge gap between community statutes and international human rights law. What is missing, I have argued, is a Constitution for Facebook: a statement of why it exists, what kind of community it wants to serve, what it expects of its community, in short: a north star. That doesn’t exist.
But the Oversight Board might — whether it and Facebook know it or not — end up writing that Constitution, one in the English model, set by precedent, rather than the American model, set down in a document. That will be primarily in Facebook’s control. Though the Oversight Board can pose policy questions and make recommendations, it is limited by what cases come its way — from users and Facebook — and it does not set policy for the company; it only decides appeals and makes policy recommendations.
It’s up to Facebook to decide how it treats the larger policy questions raised by the Oversight Board and the cases. In reacting to recommendations, Facebook can begin to build a set of principles that in turn begin to define Facebook’s raison d’être, its higher goals, its north star, its Constitution. That’s what I’ve told people at Facebook I want to see happen.
The problem is, that’s not how Facebook or any of the technology companies think. Since, as Larry Lessig famously decreed, code is law, what the technologists want is rules — laws — to feed their code — their algorithms — to make consistent decisions at scale.
The core problem of the technology companies and their relationship with society today is that they do not test that code and the laws behind it against higher principles other than posters on the wall: “Don’t be evil.” “Move fast and break things.” Those do not make for a good Constitution.
But now is their chance to create one. And now, perhaps, is our chance. I didn’t realize that every Oversight Board case will begin with a public comment period. So we can raise issues with the Board. Indeed, community standards should come from the community, damnit, or they’re not community standards; they’re company standards. So we should speak up.
And the Board will consult experts. They can raise issues with the Board. And the Board can, in turn, raise issues not just for Facebook but, by example, for all the technology companies. That discussion could be useful.
Imagine if — as I so wish had been the case — the Board had been in operation when Twitter and Facebook decided what to do about blocking the blatant attempt at election interference by the New York Post and Rupert Murdoch in cahoots with Rudy Giuliani. The Board could have raised, addressed, and proposed policy recommendations based on principles useful to many internet companies and to the media that love to poke them.
Regulators could also get involved productively more than punitively. I was a member of a Transatlantic Working Group on Content Moderation and Freedom of Expression, which recommended a flexible framework for regulation that would have government hold companies accountable for their own assurances, requiring the companies to share data on usage and impact so researchers and regulators can monitor their performance. This, in my view, would be far better than government trying to tell companies how to operate, especially when it comes to interference in free speech. But government can’t hold companies accountable to keeping promises if there are no promises to keep. A Constitution is a promise, a covenant with users and the public. Every company should have one. Every company should be held accountable for meeting its requirements. And the public discussion should revolve around those principles, not around whether Johnny is allowed to use a bad word.
I make no predictions here. The Board could end up answering a handful of picayune complaints among tens of thousands of possible cases a week and produce the script of an online soap opera. Facebook could follow the letter of the law set down by the Board and miss the opportunity to set higher goals. Media, experts, and the public could be ignored or, worse, could just continue to snipe instead of contributing constructively.
But I can hope. The net is young. We — all of us — are still designing it by how we use it.
Be careful what you clamor for. You demand that platforms deal with harmful speech. Then he whose speech is thus affected unleashes the dogs of Trump. They harass the platform and its employees for exercising their freedom of speech. They threaten to limit freedom of expression for everyone on that platform and the net — including you.
Thus efforts to control noxious, right-wing speech have backfired as the right wing exploits every tool used against it. The weapons Trump brandishes — regulating social platforms, limiting or repealing Section 230, redirecting government advertising, blaming algorithmic “bias,” demanding “neutrality,” defining the net as media and platforms as publishers — are measures proposed by those who want to limit harmful speech online. In his so-called executive order, the Troll in Chief is using them all for his ends. Have we learned nothing from bad actors online — that every function, every lever, every precedent that can be gamed and exploited by them will be? Now Section 230, our best protection of freedom of expression on the internet, is in peril.
The more I study net regulation, the more of a free-speech absolutist I become. To think that speech is harmful is almost inevitably a third-person effect: believing that everyone else — but not you — is vulnerable to bad words and ideas, and that shielding them will cure their ignorance. There is but one cure for ignorance: education. The goal of education is to prepare the mind to wrestle with lies and hatred and idiocy … and win.
It is worthwhile to remind ourselves of the very arguments made long ago by Franklin, Milton, and Wilkes. Sherman, set the Wayback Machine.
In 1731 Benjamin Franklin was fed up with people complaining about what came off his press — not just in his newspaper, but even in advertisements — and so he wrote an Apology for Printers, which was nothing of the sort. I’m going to take the heart of that essay and substitute modern words like platform and social media for old-fashioned words like printer to make my point: that Franklin’s point still stands. Let me be clear: I do not believe the internet is a medium. It is a platform, a platform for facts and opinions and conversation about them. That is how Franklin viewed his press, as a platform. He wrote:
I request all who are angry with me on the Account of serving things they don’t like, calmly to consider these following Particulars
1. That the Opinions of Men are almost as various as their Faces; an Observation general enough to become a common Proverb, So many Men so many Minds.
2. That the Business of Social Media has chiefly to do with Mens Opinions; most things that are posted tending to promote some, or oppose others….
4. That it is as unreasonable in any one Man or Set of Men to expect to be pleas’d with every thing that is posted, as to think that nobody ought to be pleas’d but themselves.
5. Technologists are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter: Hence they chearfully serve all contending Twitter or Facebook users, without regarding on which side they are of the Question in Dispute.
6. Being thus continually employ’d in serving all Parties, Platforms naturally acquire a vast Unconcernedness as to the right or wrong Opinions contain’d in what they serve; regarding it only as the Matter of their daily labour: They serve things full of Spleen and Animosity, with the utmost Calmness and Indifference, and without the least Ill-will to the Persons reflected on; who nevertheless unjustly think the Platform as much their Enemy as the Tweeter, and join both together in their Resentment.
7. That it is unreasonable to imagine Platforms approve of every thing they serve, and to censure them on any particular thing accordingly; since in the way of their Business they serve such great variety of things opposite and contradictory. It is likewise as unreasonable what some assert, That Platforms ought not to serve any Thing but what they approve; since if all of that Business should make such a Resolution, and abide by it, an End would thereby be put to Free Tweeting and Facebooking and Instagramming and TikToking and YouTubing, and the World would afterwards have nothing to read but what happen’d to be the Opinions of the Technologists.
8. That if all Platforms were determin’d not to serve any thing till they were sure it would offend no body, there would be very little posted.
9. That if they sometimes serve vicious or silly things not worth reading, it may not be because they approve such things themselves, but because the People are so viciously and corruptly educated that good things are not encouraged….
“Give me the liberty to know, to utter, and to argue freely according to conscience, above all liberties.” — John Milton, the Areopagitica
In 1638 Milton visited Galileo, who was under house arrest for what authorities decreed were his dangerous ideas and harmful speech. Milton paid tribute to Galileo, including him in Paradise Lost, and the visit helped inspire the Areopagitica, Milton’s 1644 polemic against the licensing of books in England and in defense of freedom of expression.
The abolition of the Star Chamber in 1641 had led to the effective end of censorship and a flowering of publishing — too much publishing for the taste of authorities. In 1643, Parliament passed a Licensing Order “for suppressing the great late abuses and frequent disorders in Printing many false, forged, scandalous, seditious, libellous, and unlicensed Papers, Pamphlets, and Books to the great defamation of Religion and Government.” Might as well add tweets and Facebook comments to the list. Parliament argued, as unfortunately some do today, that there was too much speech. Bad actors, they said, “have taken upon them to set up sundry private Printing Presses in corners, and to print, vend, publish, and disperse books, pamphlets and papers, in such multitudes, that no industry could be sufficient to discover or bring to punishment all the several abounding Delinquents.”
Speech scaled and control did not. In England, the Stationers Company — a private, industry organization for printers — had been deputized to regulate this speech, just as Twitter and Facebook are expected to do today. The Order decreed no publication could be printed unless it was first licensed.
In the Areopagitica Milton rose up in righteous, eloquent anger in defense of speech, of debate, of learning, and of this less-than-200-year-old art of printing.
“For books are not absolutely dead things, but do contain a potency of life … of that living intellect that bred them.” Thus, Milton said, one might as well “kill a man as kill a good book…. he who destroys a good book, kills reason itself, kills the image of God.”
But what of bad books? Well, who is to decide the difference? A Star Chamber? The Stationers Company? Twitter? Facebook’s Oversight Board? The White House? Courts? Or readers? “Read any books whatever come to thy hands, for thou art sufficient both to judge aright and to examine each matter.” That is God speaking to Pope Dionysius of Alexandria in 240 A.D., according to Milton.
We learn by testing ourselves, Milton argues. “That which purifies us is trial and trial is by what is contrary…. Our faith and knowledge thrives by exercise.” He acknowledges the authorities’ fear that bad speech is “the infection that may spread” — just the fear we hold today about internet disinformation. But he contends that “evil manners are as perfectly learned without books” and so eliminating bad books will not staunch the infection. So: “A fool will be a fool with the best book, yea or without a book; there is no reason that we should deprive a wise man of any advantage to his wisdom, while we seek to restrain from a fool, that which being restrained will be no hindrance to his folly.”
This is Milton’s article of faith: “See the ingenuity of Truth, who, when she gets a free and willing hand, opens herself faster than the pace of … discourse can overtake her.” And: “And though the winds of doctrine were let loose to play upon the earth, so Truth be in the field, we do injuriously, by licensing and prohibiting, to misdoubt her strength. Let her and Falsehood grapple.”
Yet he adds a caution: “Truth and understanding are not such wares as to be monopolized and traded in by tickets and statutes and standards. We must not think to make a staple commodity of all the knowledge in the land, to mark and license it like our broadcloth and our woolpacks.” Truth is not a product to be packaged. It is a choice.
He makes two key arguments: that citizens need to learn by facing and rejecting sin (“When God gave him reason,” Milton says of Adam, “he gave him freedom to choose, for reason is but choosing”) and that no small group of men is capable of making decisions to protect citizens from those choices: “Who shall regulate all the mixed conversation of our youth, male and female together, as is the fashion of this country? Who shall still appoint what shall be discoursed, what presumed, and no further? Lastly, who shall forbid and separate all idle resort, all evil company?”
Milton warned of the precedents licensing would set. If we license printing, must we not then license dancing and lutes and lyrics and visitors who bring ideas? And what does Adam teach us about forbidden fruit? “The punishing of wits enhances their authority… This Order, therefore, may provide a nursing-mother to sects.” To forbid it is to spread it; that is another lesson of disinformation on the net.
Milton, like Franklin, recognizes the value of the public conversation: “Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions; for opinion in good men is but knowledge in the making.” I cannot help but also call on James Carey, who said: “Republics require conversation, often cacophonous conversation, for they should be noisy places.” In the development of the net I have come to see that what we are witnessing is a society relearning how to have a conversation with itself.
But what of nasty, hateful conversations with trolls? Should we not be protected from them?
I give you John Wilkes, the ur-troll, who is also, in the title of Arthur H. Cash’s biography, The Scandalous Father of Civil Liberty. Wilkes was, by every description, unattractive, a cur, a libertine, a smartass. He feuded with the prime minister, Lord Bute, and published anonymously a newspaper that mocked him, which “proceeded with an acrimony, a spirit, and a licentiousness unheared [sic] of before even in this country,” said Horace Walpole.
In the first issue of the North Briton, Wilkes called a free press “the firmest bulwark of the liberties of this country … the terror of all bad ministers.” Says Cash: “Wilkes was in constant danger of having his ironies taken literally by humorless or stupid men.” Indeed, Wilkes and his printers were arrested and his papers seized and there were attempts to rob him of his seat in Parliament.
But he persevered and in the process, according to Cash, set many legal precedents: the end of general warrants, the establishment of a right to privacy, an enhanced right to sue the government for false arrest, in addition to a right to transparency of Parliament and freedom of the press. Wilkes did it by nastily trolling, because that was the power he had at hand. Wilkes is a hero of mine, not as a troll, of course, but as a defender of liberty.
Larry Kramer, who died this week, was also a hero of mine. He was also a troll, a power he used when it was all he had to save lives at the start of the AIDS epidemic. Hear Dr. Anthony Fauci about their relationship:
“How did I meet Larry? He called me a murderer and an incompetent idiot on the front page of the San Francisco Examiner magazine.” …
Addressing Dr. Fauci in the letter, Mr. Kramer wrote: “Your refusal to hear the screams of AIDS activists early in the crisis resulted in the deaths of thousands of Queers. Your present inaction is causing today’s increase in HIV infection outside of the Queer community.”
“I thought, ‘This guy, I need to reach out to him,’” Dr. Fauci recalled. “So I did, and we started talking. We realized we had things in common.”
How better to tell the story of the power of listening?
So what speech is it you want to control? Hate? I hate our president and say so. Lies? Who wants an official truth but the officials who set it? Trolling? We risk losing the righteous power of Wilkes and Kramer and the opportunity to learn from them.
Donald Trump is a hateful, lying troll. So what should Twitter do with him? Whatever it wants to. That is the point. That is its right as a private entity in the United States. That is its freedom of expression. It has the freedom to do nothing, to delete his tweets, to add fact checks and warnings to them, to not promote them. I think it is now doing the right thing.
Above all, what Facebook and Twitter and every technology company should be doing is deciding why they exist. I have complained that in establishing its Oversight Board, Facebook has not set a North Star, a raison d’être for the platform. Why does it exist? What behavior on it is beneficial and welcome and what is not, for what reason? They are asking the 20 wise members of the Oversight Board — its Stationers Company — to enforce a set of statutes without a Constitution. Twitter, by its actions, is beginning to write its Constitution, to decide what is acceptable and not and why. Those are their decisions to make.
So what of Trump’s people, those whom he eggs on? Well, what are the characteristics we know of his so-called base: they are uneducated, white males. White, male entitlement matters. But uneducated, that is the key. To update Milton as I updated Franklin: “A fool will be a fool with the best Twitter, yea or without Twitter.”
If we try to use official power to restrain speech on social media, we give fools the power to restrain wisdom there. That is what Trump is trying to do. We must recognize it for what it is: not a legal but a political ploy, unconstitutional and un-American. We must fight to protect freedom of expression, even for fools, so we protect our own. We must fight for the net.
In 1968, Lyndon Johnson appointed a National Commission on Obscenity and Pornography to investigate the supposed sexual scourge corrupting America’s youth and culture. Two years later — with Richard Nixon now president — the commission delivered its report, finding no proof of pornography’s harm and recommending repeal of laws forbidding its sale to adults, following Denmark’s example. Nixon was apoplectic. He and both parties in the Senate rejected the recommendations. “So long as I am in the White House,” he vowed, “there will be no relaxation of the national effort to control and eliminate smut from our national life.” That didn’t turn out to be terribly long.
A week ago, as part of my research on the Gutenberg age, I made a pilgrimage to Oak Knoll Books in New Castle, a hidden delight that offers thousands of used books on books. On the shelves, I found the 1970 title, Censorship: For and Against, which brought together a dozen critics and lawyers to react to the fuss about the so-called smut commission. I’ve been devouring it.
For the parallels between the fight against harmful and hateful speech online today and the crusade against sexual speech 50 years ago are stunning: the paternalistic belief that the powerless masses (but never the powerful) are vulnerable to corruption and evil with mere exposure to content; the presumption of harm without evidence and data; cries calling for government to stamp out the threat; confusion about the definitions of what’s to be forbidden; arguments about who should be responsible; the belief that by censoring content other worries can also be erased.
One of the essays comes from Charles Keating, Jr., a conservative whom Nixon added to the body after having created a vacancy by dispatching another commissioner to be ambassador to India. Keating was founder of Citizens for Decent Literature and a frequent filer of amicus curiae briefs to the Supreme Court in the Ginzburg, Mishkin, and Fanny Hill obscenity cases. Later, Keating was at the center of the 1989 savings and loan scandal — a foretelling of the 2008 financial crisis — which landed him in prison. Funny how our supposed moral guardians — Nixon or Keating, Pence or Graham — end up disgracing themselves; but I digress.
Keating blames rising venereal disease, illegitimacy, and divorce on “a promiscuous attitude toward sex” fueled by “the deluge of pornography which screams at young people today.” He escalates: “At a time when the spread of pornography has reached epidemic proportions in our country and when the moral fiber of our nation seems to be rapidly unravelling, the desperate need is for enlightened and intelligent control of the poisons which threaten us.” He has found the cause of all our ills: a textbook demonstration of moral panic.
There are clear differences between his crusade and those attacking online behavior today. The boogeyman then was Hollywood but also back-alley pornographers; today, it is big, American tech companies and Russian trolls. The source of corruption then was a limited number of producers; today, it is perceived to be some vast number of anonymous, malign conspirators and commenters online. The fear then was the corruption of the masses; the fear now is microtargeting drilling directly into the heads of a strategic few. The arena then was moral; now it is more political. But there are clear similarities, too: Both are wars over speech.
“Who determines who is to speak and write, since not everyone can speak?” asks former presidential peace candidate Gene McCarthy in his chapter of the book. But now, everyone can speak.
McCarthy next asks: “Who selects what is to be recorded or transmitted to others, since not everything can be recorded?” But now, everything can be recorded and transmitted. That is the new fear: too much speech.
A defense of speech
Many of the book’s essayists defend freedom of expression over freedom from obscenity. Says Rabbi Arthur Lelyveld (father of Joseph, who would become executive editor of The New York Times): “Freedom of expression, if it is to be meaningful at all, must include freedom for ‘that which we loathe,’ for it is obvious that it is no great virtue and presents no great difficulty for one to accord freedom to what we approve or to that to which we are indifferent.” I hear too few voices today defending speech of which they disapprove.
Lelyveld then addresses directly the great bone of contention of today: truth. “That which I hold to be true has no protection if I permit that which I hold to be false to be suppressed — for you may with equal logic turn about tomorrow and label my truth as falsehood. The same test applies to what I consider lovely or unlovely, moral or immoral, edifying or unedifying.” I am stupefied at the number of smart people I know who contend that truth should be the standard to which social-media platforms are held. Truths exist in their contexts; they are not relative, but neither are they simple or absolute. Truth is hard.
“We often hear freedom recommended on the theory that if all expression is permitted, the truth is bound to win. I disagree,” writes another contributor, Charles Rembar, an attorney who championed the cases of Lady Chatterley, Tropic of Cancer, and Fanny Hill. “In the short term, falsehood seems to do about as well. Even for longer periods, there can be no assurance of truth’s victory; but over the long term, the likelihood is high. And certainly truth’s chances are better with freedom than with repression.”
A problem of definition
So what is to be banned? That is a core problem today. The UK’s — in my view, potentially harmful — Online Harms White Paper cops out from defining what is harmful but still proposes holding online companies liable for harm. Worse, its plan is to order companies to take down legal but harmful content — which, of course, makes that content de facto illegal. Similarly, Germany’s NetzDG hate speech law tells the platforms they must take down anything that is “manifestly unlawful” within 24 hours, meaning a company — not a court — is required to decide what is unlawful and whether it’s manifestly so. It is bad law that has so far been copied by 13 countries, including authoritarian regimes.
As an American and a staunch defender of the First Amendment, I’m allergic to the notion of forbidden speech. But if government is going to forbid it, it damned well better clearly define what is forbidden or else the penumbra of prohibition will cast a shadow and chill on much more speech.
In the porn battle, there was similar and endless debate about the fuzzy definitions of obscenity. Author Max Lerner writes of the courts: “The lines they draw and the tests they use keep shifting, as indeed they must: What is ‘prurient’ or ‘patently offensive’ enough to offend the ‘ordinary reader’ and the going moral code? What will hurt or not hurt children and innocents? What is the offending passage like in its context…?” Even Keating questions the standard that emerged from Fanny Hill: to be censored, content must be utterly without redeeming social importance. “There are those who will say that if you can burn a book and warm your hands from the fire,” Keating says, “the book has some redeeming social value.” Keating also argues that pornography “is actually a form of prostitution because it advertises ‘sex for sale,’ offers pleasure for a price” — and since prostitution is illegal, so must pornography be. Justice Potter Stewart’s famed standard for obscenity — “I know it when I see it” — is the worst standard of all, for just like the Harms White Paper and NetzDG it requires distributors and citizens to guess. It is chilling.
Who is being protected?
“Literary censorship is an elitist notion: obscenity is something from which the masses should be shielded. We never hear a prosecutor, or a condemning judge (and rarely a commentator) declare his moral fiber has been injured by the book in question. It is always someone else’s moral fiber for which anxiety is felt. It is ‘they’ who will be damaged. In the seventeenth century, ‘they’ began to read; literacy was no longer confined to the clergy and the upper classes. And it is in the seventeenth century when we first begin to hear about censorship for obscenity.” So writes Rembar.
In the twentieth century ‘they’ began to write and communicate as never before in history — nearly all of ‘them.’ That has frightened those who had held the power to speak and broadcast. Underrepresented voices are now represented and the powerful want to silence them to protect their power. It is in the twenty-first century that we hear about control of harmful speech and hate.
“I am opposed to censorship in all forms, without any exception,” writes Carey McWilliams, who was then editor of The Nation, arguing that censorship is a form of social control. “I do not like the idea of some people trying to protect the minds and morals of other people. In practice, this means that a majority seeks to impose its standards on a minority; hence, an element of coercion is inherent in the idea of censorship.” It is also inherent in the idea of civility, an imposition of standards, expectations, and behavior from top down.
McWilliams then quotes Donald Thompson on literary censorship in England: “Political censorship is necessarily based on fear of what will happen if those whose work is censored get their way…. The nature of political censorship at any given time depends on the censor’s answer to the simple question, ‘What are you afraid of?’” Or whom are you afraid of? Rembar’s “they”? McWilliams concludes:
But in a time of turmoil and rapid social change, fears of this sort can become fused with other kinds of fears; and their censorship becomes merely one aspect of a general repression. The extent of the demands for censorship may be taken, therefore, as an indicator of the social health of a society. It is not the presence — nor the prevalence — of obscene materials that needs to be feared so much as it is the growing demand for censorship or repression. Censorship — not obscenity nor pornography — is the real problem.
Lerner puts this another way, examining a shift in the “norm-setting classes” over time. In the past, the aristocracy set norms for dress, taste, and morals. Then the middle classes did. Now, I will argue, the internet blows that apart as many communities are in a position to compete to set or protect norms.
Rembar notes that “reading a book is a private affair” (as it has been since silent reading replaced reading aloud starting in about the seventh century A.D.). He addresses the Constitution’s implicit — not explicit — right to be let alone, the basis of much precedent in privacy. “Privacy in law means various things,” he writes; “and one of the things it means is protection from intrusion.” He argues that in advertising, open performance, and public-address systems, “these may validly be regulated” to prevent porn from being thrust upon the unsuspecting and unwilling. It is an extension of broadcast regulation.
And that is something we grapple with still: What is shown to us, whether we want it shown to us, and how it gets there: by way of algorithm or editor or bot. What is our right not to see?
Max Lerner is sympathetic with courts having to judge obscenity. “I view the Court’s efforts not so much with approval or disapproval as with compassion. Its effort is herculean and almost hopeless, for given the revolution of erotic freedom, it is like trying to push back the onrushing flood.” The courts took on the task of defining obscenity though, as Lerner points out above, they never really did draw a clear line.
Today politicians are shying away from deciding what is hateful and harmful, and these questions aren’t even getting to the courts because the responsibility for deciding what to ban is being put squarely on the technology platforms. Because: scale. Also because: tech companies are being portrayed as the boogeymen — indeed, the pornographers — of the age; they’re being blamed for what people do on their platforms and they’re expected to just fix it, damnit. Of course, it’s even more absurd to expect Facebook or Twitter or YouTube to know and act on every word or image on their services than it was to expect bookseller Eleazer Smith to know the naughty bits in every book on his shelves. Nonetheless, the platforms are being blamed for what users do on those platforms. Section 230 was designed to address that by shielding companies — including, by the way, news publishers — from liability for what others do in their space while also giving them the freedom (but not the requirement) to police what people put there. The idea was to encourage the convening of productive conversation for the good of democracy. But now the right and the left are both attacking 230 and with it the internet and with that freedom of expression. One bite has been taken out of 230 thanks to — what else? — sex, and many are trying to dilute it further or just kill it.
Today, the courts are being deprived of the opportunity to decide cases about hateful and harmful speech because platforms are making decisions about content takedowns first under their community standards and only rarely over matters of legality. This is one reason why a member of the Transatlantic High-Level Working Group on Content Moderation and Freedom of Expression — of which I am also a member — proposes the creation of national internet courts, so that these matters can be adjudicated in public, with due process. In matters of obscenity, our legal norms were negotiated in the courts; matters of hateful and harmful speech are by and large bypassing the courts and thus the public loses an opportunity to negotiate them.
What harm, exactly?
The presidential commission looked at extensive research and found little evidence of harm:
Extensive empirical investigation, both by the Commission and by others, provides no evidence that exposure to or use of explicit sexual materials plays a significant role in the causation of social or individual harms such as crime, delinquency, sexual or nonsexual deviancy or severe emotional disturbances…. Studies show that a number of factors, such as disorganized family relationships and unfavorable peer influences, are intimately related to harmful sexual behavior or adverse character development. Exposure to sexually explicit materials, however, cannot be counted as among those determinative factors. Despite the existence of widespread legal prohibitions upon the dissemination of such materials, exposure to them appears to be a usual and harmless part of the process of growing up in our society and a frequent and nondamaging occurrence among adults.
Over this, Keating and Nixon went ballistic. Said Nixon: “The commission contends that the proliferation of filthy books and plays has no lasting harmful effect on a man’s character. If that were true, it must also be true that great books, great paintings and great plays have no ennobling effect on a man’s conduct. Centuries of civilization and 10 minutes of common sense tell us otherwise.” To hell with evidence, says Nixon; I know better.
Keating, likewise, trusts his gut: “That obscenity corrupts lies within the common sense, the reason, and the logic of every man. If man is affected by his environment, by circumstances of his life, by reading, by instruction, by anything, he is certainly affected by pornography.” In the book, Keating ally Joseph Howard, a priest and a leader of the National Office of Decent Literature, doesn’t need facts when he has J. Edgar Hoover to quote: “Police officials,” said the FBI director, “unequivocally state that lewd and obscene material plays a motivating role in sexual violence…. Such filth in the hands of young people and curious adolescents does untold damage and leads to disastrous consequences.” Damn the data; full speed ahead.
Today, we see a similar habit of skipping over research, data, and evidence to get right to condemnation. We do not actually know the full impact of Facebook, Cambridge Analytica, Twitter, and social media on the election. As the commission says of the causes of deviancy, there must be other factors that got us Trump. Legislation and regulation are being proposed based on a candy bowl of tropes — the filter bubble, the echo chamber, hate speech, digital harms — without sufficient research to back up the claims. Thank goodness we are starting to see research into these questions; see, for example, Axel Bruns’ dismantling of the filter bubble.
Here I will lay some blame at the feet of the platforms, for we cannot have adequate research to test these claims until the platforms provide data about what people see and how they behave.
How bad is the bad of the internet? That depends on evidence of impact. It also depends on relative judgment. Rembar’s view of what he calls the “seductio ad absurdum” of sex and titillation in media: “There is an acne on our culture.” It is “an unattractive aspect of our cultural adolescence.” And: “acne is hardly fatal.”
Is today’s online yelling and shouting, insulting and lying by some people — just some, remember — an “all-pervasive poison” that imperils the nation, as Keating viewed porn? Or is it an unsightly blemish we’ll likely grow out of, as Rembar might advise?
I am not saying we leave the zits alone. I have argued again and again that Facebook and Twitter should set their own north stars and collaborate with users, the public, and government on covenants they offer to which they will be held accountable. I think standards of behavior should apply to any user, including politicians and presidents and advertisers. I strongly argue that platforms should take down threatening and harassing behavior against their users. I embrace Section 230 precisely because it gives the platforms as well as publishers the freedom to decide and enforce their own limits. But I also believe that we must respect the public and not patronize and infantilize them by believing we should protect them from themselves, saving their souls.
Permission is not endorsement
To be clear, the anti-censorship authors in the book and other allies of the commission (including The New York Times editorial page) are not defending pornography. “To affirm freedom is not to applaud that which is done under its sign,” Lelyveld writes. Van den Haag, the psychoanalyst, abstracts pornography to a disturbing end: “Pornography reduces the world to orifices and organs, human action to their combinations. Sex rages in an empty world; people use each other as its anonymous bearers and vessels, bereaved of individual love and hate, thought and feeling reduced to bare sensations of pain and pleasure existing only in and for incessant copulations, without apprehension, conflict, or relationship — without human bonds.”
Likewise, by opposing censorship conducted or imposed by government, I am not defending hateful or noxious speech. When I oppose reflexive regulation, I am not defending its apparent objects — the tech companies — but instead I defend the internet and with it the free expression it enables. The question is not whether I like the vile, lying, bigoted rantings of the likes of a Donald Trump or Donald Trump Jr. or a video faked to make Nancy Pelosi look drunk — of course, I do not — but whether by banning them a precedent is set that next will affect you or me.
Hollis Alpert, film critic then for Saturday Review, warns in his essay: “The unscrupulous politician can take advantage of the emotional, hysterical, and neurotic attitudes toward pornography to incite the multitude towards approval of repressive measures that go far beyond the control of the printed word and the photographed image.”
Of freedom of expression
Richard Nixon was quite willing to sacrifice freedom of expression to obliterate smut: “Pornography can corrupt a society and a civilization,” he wrote in his response to the commission. “The pollution of our culture, the pollution of our civilization with smut and filth is as serious a situation for the American people as the pollution of our once pure air and water…. I am well aware of the importance of protecting freedom of expression. But pornography is to freedom of expression what anarchy is to liberty; as free men willingly restrain a measure of their freedom to prevent anarchy, so must we draw the line against pornography to protect freedom of expression.”
Where will lines be drawn on online speech? Against what? For what reasons? Out of what evidence? At what cost? These questions are too rarely being asked, yet answers are being offered in legislation that is having a deleterious effect on freedom of expression.
Society survived and figured out how to grapple with porn, all in all, just as it survived and figured out how to adapt to printing, the telegraph, the dime novel, the comic book, and other supposed scourges on public morality — once society learned each time to trust itself. Censorship inevitably springs from a lack of trust in our fellow citizens.
Gene McCarthy writes:
There is nothing in the historical record to show that censorship of religious or political ideas has had any lasting effect. Christianity flourished despite the efforts of the Roman Emperors to suppress it. Heresies and new religions developed and flourished in the Christian era at the height of religious suppression. The theories of democracy did not die out even though kings opposed them. And the efforts in recent times to suppress the Communist ideology and to keep it from people has not had a measurable or determinable success. Insofar as the record goes, the indications are that heresy and political ideas either flourished or died because of their own strength or weakness even though books were suppressed or burned and authors imprisoned, exiled, or executed.
And he concludes: “The real basis of freedom of speech and of expression is not, however, the right of a person to say what he thinks or what he wishes to say but the right and need of all persons to learn the truth. The only practical approach to this end is freedom of expression.”
An amusing sidebar to this tale: When the commission released its report, an enterprising publisher printed and sold an illustrated version of it, adding examples of what was being debated therein: that is, 546 dirty pictures. William Hamling, the publisher, and Earl Kemp, the editor, were arrested on charges of pandering to prurient interests for mailing ads for the illustrated report, and sentenced to four and three years in prison, respectively. According to Robert Brenner’s Huffpost account, someone received the mailing and took it to Keating, who took it to Nixon, who told Attorney General John Mitchell to nab them. (And some wonder why we worry about Trump attacking the press as the enemy of the people and having a willing handmaid in William Barr, who will do his every bidding!) The Supreme Court upheld their convictions but the men served only 90 days, though the owner was forced to sell his publishing house and was not permitted to write about the case: censorship upon censorship.