A post in three parts: First, I dissect a specimen of the current elitist media attack on Facebook and its users as a guidepost on the path to moral panic. Second, as a counterpoint, I admire a report about how the leaders of our tomorrow — the youth of Parkland — are using social media to change the world. Third, I will tell Facebook it is not doing nearly enough to fix itself and if it does not act more decisively, honestly, and quickly, it will invite short-sighted regulation that could ruin the net for us all.
[First, my disclosure: I raised money from Facebook, Craig Newmark, the Ford Foundation, AppNexus, and others to start the News Integrity Initiative. We are independent of Facebook and I receive no payment from any platform.]
I have respected Matthew Yglesias as a political commentator since he was a blogger as a student at Harvard (he graduated a year before Mark Zuckerberg started Facebook there). I don’t agree with him this time. At Vox, Yglesias wrote an evisceration of Facebook, going so far as to tell Zuckerberg to shut it down. As I see it, his screed is:
- Elitist. Once ensconced in media, Yglesias pulls the ladder up behind him, proposing to cut off the tool that gives so many others — two billion others — the means to speak and connect. Siva Vaidhyanathan, no great ally of Facebook (whose own book-length scrutiny of its impact is coming out this fall), makes this point well in The New York Times. To those who would #DeleteFacebook, Vaidhyanathan warns: “Please realize, though, that you might be offloading problems onto those who may have less opportunity to protect privacy and dignity and are more vulnerable to threats to democracy.”
- Paternalistic. The argument we hear these days that Facebook, Twitter, et al are designed to addict the people — Yglesias portrays Facebook as a cigarette company — is a variation on the mass-society and mass-culture worldviews I’ve been researching, which portray the unwashed public as lemmings who are readily hypnotized into idiocy and depression, acting against their own interests. Mike Godwin — he of Godwin’s Law — says it best on Facebook: “My big question for those who believe Facebook has overcome the free will of 2 billion people: how did all of you escape?” In a conversation (on Facebook) he adds: “Tristan Harris’s argument, which is weird to me, is that, when you encounter targeted ads or messages in social media, you will have thoughts put into your brain that you may not have wanted to have, and that you may not wish to have. My response is, dude, have you ever had a conversation before?”
- Un-self-aware. I am profoundly disappointed that my fellow journalists and media people refuse to examine their own responsibility for the polarization of our society; for the clickbait media economy that the platforms adapted; for failing to effectively inform the public conversation; for leading a long decline in trust in institutions; and for the rise of Trump himself. Let he who has not helped screw up society throw the first snark. As to the complaint about Facebook and Google hurting journalism’s business model I say what I have always said: It was up to us to innovate and adapt. We didn’t. They competed. We lost. They’re trying to help us anyway.
- Self-righteous. Yglesias quite properly criticizes Facebook’s role — no matter how unwitting — in tragedies unfolding in Myanmar and I would add the Philippines. But then he adds: “I also lose sleep over a work screw-up sometimes, but I’m confident that I’ve never accidentally contributed to unleashing a genocide.” This isn’t easy but I have to say it as someone who shares Yglesias’ history: In our blogs, he and I each supported the war in Iraq. We have each recanted. We, like The New York Times, didn’t intend to lead to disaster. But it would be wise of us to judge how online tools are being manipulated and misused with this perspective. We live in an age of change and so necessarily in a time of unintended consequences.
- Alarmist. Good Lord, shut down Facebook? Does Yglesias really think it’s that awful? He does. “Facebook is bad,” he says. “And it probably can’t be fixed.” I challenge anyone in sight of this to go look at the Facebook or Twitter feed of your friends and those you follow and come back and tell me how overrun it is with Nazis, bigots, and conspiracy theorists. I’ll bet you won’t find many or any because you’re smart and your friends are, too. Does society have Nazis, bigots, and conspiracy theorists? Of course it does. It always has. Now, online, you can just see them better. But I wouldn’t kill the messenger. And I wouldn’t declare all society broken, not yet.
To blame a single actor for larger ills in society is a sign of moral panic, which Ashley Crossman defines as “a widespread fear, most often an irrational one, that someone or something is a threat to the values, safety, and interests of a community or society at large. Typically, a moral panic is perpetuated by news media, fueled by politicians, and often results in the passage of new laws or policies that target the source of the panic. In this way, moral panic can foster increased social control.” Sound familiar?
Before deciding that Facebook is the root of all society’s ills today, I urge you to read Dave Cullen’s inspiring Vanity Fair report inside the “secret meme lab” run by the students, survivors, leaders, and heroes of Marjory Stoneman Douglas High School.
To echo Margaret Sullivan, these young people are “amazing communicators.” That is to say, they are smart, informed, and articulate. Now if you try to argue that they come off so well because they come from privilege — and they do — listen to all the many young people from many different schools and communities who spoke and were interviewed at the March for Our Lives. This is an articulate generation. The collection of Facebook, Twitter, Instagram, YouTube, and Snap did not ruin them. It empowered them. It connected them. It taught them how to speak to a public. In these dark, divided, Trumpian times when even an optimist such as myself could start to lose hope, I have regained my optimism watching, listening to, and following these young people.
Says Cullen: “This response would not have been possible for the Columbine generation. Today, every high-school kid in America is a content creator, churning out daily posts on Instagram and Snapchat without a thought — or, actually, with a tremendous amount of thought…. For the two dozen kids that came together in Cameron Kasky’s living room, content creation isn’t just a social diversion; it’s a way of life.”
I would call them more than “content creators.” I would call them leaders. Cullen’s report shows how they have honed their social-media savvy and use it responsibly. They listen. They collaborate. They understand and govern their impact. Journalists, too, should learn from them and how they use these tools to inform, to educate, and to engage people not in “content” but in conversation and action. These are lessons I will share with our Social Journalism students at CUNY.
Without their social tools — if they were still dependent on the gatekeepers of big, old, elite media — this campaign, like #BlackLivesMatter, could not have grown. For that alone, it is well worth understanding, protecting — and, yes, fixing — these platforms and our net.
Having defended Facebook and then praised it, now I will demand more of it, much more. If Facebook does not take quick and decisive action, I worry that we will find not just Facebook but our internet regulated in perilous ways. I do not believe that government — especially our government in the U.S. today — is competent to regulate the platforms and thus our speech. We can all see the future in the form of Europe’s regulation. Germany’s NetzDG hate-speech law and Europe’s “right to be forgotten” court ruling are monuments to unintended consequences for freedom of expression. I fear that the upcoming EU GDPR privacy regulations will also have serious consequences for the future of post-mass media. And at the extreme, I dread China’s internet. The Trump administration’s regulation? I shudder to imagine.
Zuckerberg vowed to fix Facebook and I must say I am disappointed in his actions so far. His response to the Cambridge Analytica story was slow and tactical, defensive about the details and silent on the deeper issues to which he and his company’s leadership must pay immediate attention.
I try to be both critical of and helpful to the media and technology industries: in my day job, I press news organizations to innovate and help them explore new business models, and I build bridges with technology companies while pressing them to face their greater responsibility to society. In that spirit, I expect Facebook to be:
- Respectful. Facebook needs to respect its users’ rights. I recall — but cannot find — an effort by Facebook in its early days to formulate a crowdsourced constitution for the community. Though I want to see Facebook listen to its users — and there will be plenty of good ideas — the company’s leaders need to propose their own principles to follow for product and business decisions regarding privacy of data above all. It is facile to say simply that we should “own” our data when the issues are more complex, with information about users coming from what they openly share, from their actions and transactions with others, and from inference and extrapolation. But it is possible for Facebook to assure users that they should know what Facebook knows about them; they should know how that information is being used; and they should have the right and means to delete and correct that data. Start there.
- Honest. Brutally honest, that is, about itself. I believe Facebook needs to bring in outsiders to undertake a thorough ethical audit of its actions and its culture. I shake my head at each new revelation — collecting call history is the latest. In what sane universe could a company with a righteous culture enable even one employee to survey users about pedophiles asking children for sexual pictures? Facebook needs to unearth and reveal that which does not comport with its new principles and statement of users’ rights. Going forward, I would recommend that it institute the kind of review panels hospitals and universities have to examine the implications and impact of experiments before they occur. Move fast but think first should be their new motto.
- Transparent. I’ve been asked often lately whether this is an existential crisis for Facebook. I don’t think so; from what I see, there’s more furor in the media than in the public. But if anything will kill Facebook, I say it will be its culture — and the Valley’s culture — of secrecy. Exactly what competition is Facebook worried about? It owns social, so far. I say it is far more important for Facebook to be open with its users and trusted by the public than it is to hold onto secrets. Transparency must become a key ethic for the company.
- Responsible. I have argued and will continue to argue that Facebook and the other platforms must recognize the roles and responsibilities they have taken on in informing the public conversation and assuring its civility and with it the health of the public sphere. Instead of walking away from decisions about quality, authority, credibility, and civility, the platforms are forced — by dint of the manipulation of them — to set standards and make decisions about what does and does not meet those standards. I believe they cannot do this alone and that is why I am headed to San Francisco as I write this to work with many others to set up networks to share signals of repute and disrepute about content and warnings of campaigns of disinformation. (More on that later.)
- Collaborative. In all those areas of responsibility — informing society, encouraging civility, monitoring impact, sharing signals and alerts — it is critical that Facebook expand its efforts to collaborate with news organizations, other technology companies, government, academics, and others who can be helpful.
- A leader. Facebook says it is implementing features of a proposed political advertising bill but then Quartz says it is trying to kill the legislation. I say that Facebook should leap so far ahead of these proposals that it makes the legislation moot. Facebook could and should make every political ad — and why not every ad? — transparent as to buyer and targeting, beating every other media outlet and legislator to the punch. Given the trouble Facebook — and now apparently Politico — are in for helping certain candidates with content and strategy, maybe it would be better for the platform — and media companies — to swear off acting as political aides. Let’s at least have that discussion.
- Moral. Google’s famous injunction to not be evil was, as has often been reported, the founders’ license given to employees to keep the company from doing wrong, for that would be bad business. Google changed the line to “do the right thing.” Whichever. The leadership of a company has to set an example and create a culture of ethical and moral expectation.
- Innovative. Mind you, it would be a huge mistake for Facebook to retreat into its shell in the belief that its product is finished and what it has to do now is defend what it has and screw up no more. No, Facebook is not finished. I believe it still has much work to do to imagine a fuller definition of community and how the platform can help bring people — friends and strangers — together into civil, informed, and productive conversation. I want to see the company continue to invent and take risks. I just want it to be more open, responsible, principled, ethical, and collaborative about it.
Unlike Matthew Yglesias, I do not believe we can — or should want to — back-button our way to a society before and without Facebook or social platforms or the net or for that matter trolls and Russian bots. We must recognize the reality of the world we live in today. We would be wise to take account of the many benefits these advances have brought. And we need to take responsibility together for using these new powers wisely. That includes all the platforms and technology companies and media companies and government — and every one of us.