In the long ago, when I was the TV critic for TV Guide, I liked Roseanne. Above, see my credentials.
Now, not so much. I had to force myself to watch Roseanne’s reboot just to see what is being foisted on America by ABC — especially because this network’s parent company, Disney, will soon have as its largest individual shareholder the man who, more than any other single person, ruined our democracy: Rupert Murdoch. You like conspiracy theories? Never mind Roseanne’s crackpot paranoia about left-wing pedophile rings. Try looking at how Fox and now ABC will conspire as propaganda outlets for Trump.
What’s most disturbing about the new Roseanne is how the network takes a populist movement that at its roots and its head is racist and tries to cleanse it. In the most blatant incident of racial tokenism on TV in memory, an innocent, young, African-American actor is hired to do nothing so much as be black and in the room with Roseanne to demonstrate that the old lady’s not so bad; she doesn’t spit insults at the child — even if, in real life, Roseanne Barr is not above attacking the children of Parkland.
And then, of course, there is the child of fluid gender definition who is also tolerated by the Conner family, never mind that their hero, Trump, keeps trying to kick transgender patriots out of the U.S. military. I await the goofball and lovable Mexican and Muslim neighbors, whom Roseanne and Dan will also not report to ICE or lynch just to prove that these Trumpists are actually OK. Roseanne says her fondness for Trump is explained solely by his talk about jobs. It has nothing to do with white nationalism and populism, at least not in the fantasy world of the Disney network sitcom.
On the other side, there is the pathetic portrayal of Roseanne’s liberal sister by Laurie Metcalf, an actress I used to respect. Now, in her pink pussy hat and nasty-woman T-shirt, she is meant to be nothing but the butt of jokes. Odd how Roseanne is allowed to make a joke at Jackie’s expense about taking a knee during dinner-table grace, but ABC pulled an episode of Black-ish about African-American athletes taking a knee to draw attention to racism in this country.
Some critics have tried to compare Roseanne’s character to Archie Bunker but they have it exactly wrong. In this show, sister Jackie is Archie, the buffoon, and Roseanne is Meathead, the sensible one. And the show doesn’t have an ounce of the intelligence and moral value of All in the Family.
If I were still a TV critic, I’d complain about the obvious gags and the lack of higher social awareness the once-upon-a-time Roseanne had when she made a show about class and feminism. The new version is just a collection of stereotypes exploited one way or another to support the stereotyper-in-chief, Trump. The show isn’t worth the dissection. I won’t be watching anymore. I also don’t see much on ABC that interests me anymore.
But I’m proud to say that Roseanne won’t be sending me thank-you notes. I’m dead to her. She blocked me on Twitter.
A post in three parts: First, I dissect a specimen of the current elitist media attack on Facebook and its users as a guidepost on the path to moral panic. Second, as a counterpoint, I admire a report about how the leaders of our tomorrow — the youth of Parkland — are using social media to change the world. Third, I will tell Facebook it is not doing nearly enough to fix itself and that if it does not act more decisively, honestly, and quickly, it will invite short-sighted regulation that could ruin the net for us all.
[First, my disclosure: I raised money from Facebook, Craig Newmark, the Ford Foundation, AppNexus, and others to start the News Integrity Initiative. We are independent of Facebook and I receive no payment from any platform.]
I have respected Matthew Yglesias as a political commentator since he was a blogger as a student at Harvard (he graduated a year before Mark Zuckerberg started Facebook there). I don’t agree with him this time. At Vox, Yglesias wrote an evisceration of Facebook, going so far as to tell Zuckerberg to shut it down. As I see it, his screed is:
Elitist. Once ensconced in media, Yglesias pulls the ladder up behind him, proposing to cut off the tool that gives so many others — two billion others — the means to speak and connect. Siva Vaidhyanathan, no great ally of Facebook (his own book-length scrutiny of its impact is coming out this fall), makes this point well in The New York Times. To those who would #DeleteFacebook, Vaidhyanathan warns: “Please realize, though, that you might be offloading problems onto those who may have less opportunity to protect privacy and dignity and are more vulnerable to threats to democracy.”
Paternalistic. The argument we hear these days that Facebook, Twitter, et al are designed to addict the people — Yglesias portrays Facebook as a cigarette company — is a variation on the mass-society and mass-culture worldviews I’ve been researching, which portray the unwashed public as lemmings who are readily hypnotized into idiocy and depression, acting against their own interests. Mike Godwin — he of Godwin’s Law — says it best on Facebook: “My big question for those who believe Facebook has overcome the free will of 2 billion people: how did all of you escape?” In a conversation (on Facebook) he adds: “Tristan Harris’s argument, which is weird to me, is that, when you encounter targeted ads or messages in social media, you will have thoughts put into your brain that you may not have wanted to have, and that you may not wish to have. My response is, dude, have you ever had a conversation before?”
Un-self-aware. I am profoundly disappointed that my fellow journalists and media people refuse to examine their own responsibility for the polarization of our society; for the clickbait media economy that the platforms adapted; for failing to effectively inform the public conversation; for leading a long decline in trust in institutions; and for the rise of Trump himself. Let he who has not helped screw up society throw the first snark. As to the complaint about Facebook and Google hurting journalism’s business model, I say what I have always said: It was up to us to innovate and adapt. We didn’t. They competed. We lost. They’re trying to help us anyway.
Self-righteous. Yglesias quite properly criticizes Facebook’s role — no matter how unwitting — in tragedies unfolding in Myanmar and I would add the Philippines. But then he adds: “I also lose sleep over a work screw-up sometimes, but I’m confident that I’ve never accidentally contributed to unleashing a genocide.” This isn’t easy but I have to say it as someone who shares Yglesias’ history: In our blogs, he and I each supported the war in Iraq. We have each recanted. We, like The New York Times, didn’t intend to lead to disaster. But it would be wise of us to judge how online tools are being manipulated and misused with this perspective. We live in an age of change and so necessarily in a time of unintended consequences.
Alarmist. Good Lord, shut down Facebook? Does Yglesias really think it’s that awful? He does. “Facebook is bad,” he says. “And it probably can’t be fixed.” I challenge anyone in sight of this to go look at your Facebook or Twitter feed of your friends and those you follow and come back and tell me how overrun it is with Nazis, bigots, and conspiracy theorists. I’ll bet you won’t find many or any because you’re smart and your friends are, too. Does society have Nazis, bigots, and conspiracy theorists? Of course, it does. It always has. Now, online, you can just see them better. But I wouldn’t kill the messenger. And I wouldn’t declare all society broken, not yet.
To blame a single actor for larger ills in society is a sign of moral panic, which Ashley Crossman defines as “a widespread fear, most often an irrational one, that someone or something is a threat to the values, safety, and interests of a community or society at large. Typically, a moral panic is perpetuated by news media, fueled by politicians, and often results in the passage of new laws or policies that target the source of the panic. In this way, moral panic can foster increased social control.” Sound familiar?
Before deciding that Facebook is the root of all society’s ills today, I urge you to read Dave Cullen’s inspiring Vanity Fair report inside the “secret meme lab” run by the students, survivors, leaders, and heroes of Marjory Stoneman Douglas High School.
To echo Margaret Sullivan, these young people are “amazing communicators.” That is to say, they are smart, informed, and articulate. Now if you try to argue that they come off so well because they come from privilege — and they do — listen to all the many young people from many different schools and communities who spoke and were interviewed at the March for Our Lives. This is an articulate generation. The collection of Facebook, Twitter, Instagram, YouTube, and Snap did not ruin them. It empowered them. It connected them. It taught them how to speak to a public. In these dark, divided, Trumpian times when even an optimist such as myself could start to lose hope, I have regained my optimism watching, listening to, and following these young people.
Says Cullen: “This response would not have been possible for the Columbine generation. Today, every high-school kid in America is a content creator, churning out daily posts on Instagram and Snapchat without a thought — or, actually, with a tremendous amount of thought…. For the two dozen kids that came together in Cameron Kasky’s living room, content creation isn’t just a social diversion; it’s a way of life.”
I would call them more than “content creators.” I would call them leaders. Cullen’s report shows how they have learned to use their social-media savvy and use it responsibly. They listen. They collaborate. They understand and govern their impact. Journalists, too, should learn from them and how they use these tools to inform, to educate, and to engage people not in “content” but in conversation and action. These are lessons I will share with our Social Journalism students at CUNY.
Without their social tools — if they were still dependent on the gatekeepers of big, old, elite media — this campaign, like #BlackLivesMatter, could not have grown. For that alone, it is well worth understanding, protecting — and, yes, fixing — these platforms and our net.
Having defended Facebook and then praised it, now I will demand more of it, much more. If Facebook does not take quick and decisive action, I worry that we will find not just Facebook but our internet regulated in perilous ways. I do not believe that government — especially our government in the U.S. today — is competent to regulate the platforms and thus our speech. We can all see the future in the form of Europe’s regulation. Germany’s NetzDG hate-speech law and Europe’s “right to be forgotten” court ruling are monuments to unintended consequences for freedom of expression. I fear that the upcoming EU GDPR privacy regulations will also have serious consequences for the future of post-mass media. And at the extreme, I dread China’s internet. The Trump administration’s regulation? I shudder to imagine.
Zuckerberg vowed to fix Facebook and I must say I am disappointed in his actions so far. His response to the Cambridge Analytica story was slow and tactical, defensive about the details and silent on the deeper issues to which he and his company’s leadership must pay immediate attention.
I try to be both critical of and helpful to the media and technology industries, pressing news organizations to innovate and seeking to help them explore new business models in my day job, while I try to build bridges with technology companies while also pressing them to face their greater responsibility to society. In that spirit, I expect Facebook to be:
Respectful. Facebook needs to respect its users’ rights. I recall — but cannot find — an effort by Facebook in its early days to formulate a crowdsourced constitution for the community. Though I want to see Facebook listen to its users — and there will be plenty of good ideas — the company’s leaders need to propose their own principles to follow for product and business decisions regarding privacy of data above all. It is facile to say simply that we should “own” our data when the issues are more complex, with information about users coming from what they openly share, from their actions and transactions with others, and from inference and extrapolation. But it is possible for Facebook to assure users that they should know what Facebook knows about them; they should know how that information is being used; and they should have the right and means to delete and correct that data. Start there.
Honest. Brutally honest, that is, about itself. I believe Facebook needs to bring in outsiders to undertake a thorough ethical audit of its actions and its culture. I shake my head at each new revelation — collecting call history is the latest. In what sane universe could a company with a righteous culture enable even one employee to run a survey asking users whether adults should be allowed to ask children for sexual pictures? Facebook needs to unearth and reveal that which does not comport with its new principles and statement of users’ rights. Going forward, I would recommend that it institute the kind of review panels hospitals and universities have to examine the implications and impact of experiments before they occur. Move fast but think first should be its new motto.
Transparent. I’ve been asked often lately whether this is an existential crisis for Facebook. I don’t think so; from what I see, there’s more furor in the media than in the public. But if anything will kill Facebook, I say it will be its culture — and the Valley’s culture — of secrecy. Exactly what competition is Facebook worried about? It owns social, so far. I say it is far more important for Facebook to be open with its users and trusted by the public than it is to hold onto secrets. Transparency must become a key ethic for the company.
Responsible. I have argued and will continue to argue that Facebook and the other platforms must recognize the roles and responsibilities they have taken on in informing the public conversation and assuring its civility and with it the health of the public sphere. Instead of walking away from decisions about quality, authority, credibility, and civility, the platforms are forced — by dint of the manipulation of them — to set standards and make decisions about what does and does not meet those standards. I believe they cannot do this alone and that is why I am headed to San Francisco as I write this to work with many others to set up networks to share signals of repute and disrepute about content and warnings of campaigns of disinformation. (More on that later.)
Collaborative. In all those areas of responsibility — informing society, encouraging civility, monitoring impact, sharing signals and alerts — it is critical that Facebook expand its efforts to collaborate with news organizations, other technology companies, government, academics, and others who can be helpful.
A leader. Facebook says it is implementing features of a proposed political advertising bill but then Quartz says it is trying to kill the legislation. I say that Facebook should leap so far ahead of these proposals that it makes the legislation moot. Facebook could and should make every political ad — and why not every ad? — transparent as to buyer and targeting, beating every other media outlet and legislator to the punch. Given the trouble Facebook — and now apparently Politico — are in for helping certain candidates with content and strategy, maybe it would be better for the platform — and media companies — to swear off acting as political aids. Let’s at least have that discussion.
Moral. Google’s famous injunction to not be evil was, as has often been reported, the founders’ license given to employees to keep the company from doing wrong, for that would be bad business. Google changed the line to “do the right thing.” Whichever. The leadership of a company has to set an example and create a culture of ethical and moral expectation.
Innovative. Mind you, it would be a huge mistake for Facebook to retreat into its shell in the belief that its product is finished and what it has to do now is defend what it has and screw up no more. No, Facebook is not finished. I believe it still has much work to do to imagine a fuller definition of community and how the platform can help bring people — friends and strangers — together into civil, informed, and productive conversation. I want to see the company continue to invent and take risks. I just want it to be more open, responsible, principled, ethical, and collaborative about it.
Unlike Matthew Yglesias, I do not believe we can — or should want to — back-button our way to a society before and without Facebook or social platforms or the net or for that matter trolls and Russian bots. We must recognize the reality of the world we live in today. We would be wise to take account of the many benefits these advances have brought. And we need to take responsibility together for using these new powers wisely. That includes all the platforms and technology companies and media companies and government — and every one of us.
I’m going to straddle a sword by on the one hand criticizing the platforms for not taking their public responsibility seriously enough, and on the other hand pleading for some perspective before we descend into a moral panic with unintended consequences for the net and the future.
[Disclosure: I raised $14 million for the News Integrity Initiative at CUNY from Facebook, Craig Newmark, the Ford Foundation, AppNexus, and others. We are independent of Facebook and I personally receive no money from any platform.]
The Observer’s reporting on Cambridge Analytica’s exploitation of Facebook data on behalf of Donald Trump has raised what the Germans call a shitstorm. There are nuances to this story I’ll get to below. But to begin, suffice it to say that Facebook is in a mess. As much as the other platforms would like to hide behind their schadenfreude, they can’t. Google has plenty of problems with YouTube (I write this the night before Google is set to announce new mitzvahs to the news industry). And Twitter is wisely begging for help in counteracting the ill effects it now concedes it has had on the health of the public conversation.
The platforms need to realize that they are not trusted. (And before media wrap themselves up in their own blanket of schadenfreude, I will remind them that they are not trusted either.) The internet industry’s cockiness cannot stand. They must listen to and respect concerns about them. They must learn humility and admit how hard that will be for them. They need to perform harsh and honest self-examinations of their cultures and moral foundations. Underlying all this, I believe they must adopt an ethic of radical transparency.
For a few years, I’ve been arguing that Facebook and its fellows should hire journalists not just to build relationships with media companies but more importantly to embrace a sense of public responsibility in decisions about their products, ranking, experiments, and impact. Now they would do well to also hire ethicists, psychologists, philosophers, auditors, prosecutors, and the Pope himself to help them understand not how to present themselves to the world — that’s PR — but instead to fully comprehend the responsibility they hold for the internet, society, and the future.
I still believe that most people in these companies themselves believe that they are creating and harnessing technology for the good. What they have not grokked is the greater responsibility that has fallen on them based on how their technologies are used. In the early days of the internet, the citizens of the net — myself included — and the platforms that served them valued openness über alles. And it was good. What we all failed to recognize was — on the good side — how much people would come to depend on these services for information and social interaction and — on the bad side — how much they would be manipulated at scale. “When we built Twitter,” Ev Williams said at South by Southwest, “we weren’t thinking about these things. We laid down fundamental architectures that had assumptions that didn’t account for bad behavior. And now we’re catching on to that.”
This means that the platforms must be more aware of that bad behavior and take surer steps to counteract it. They must make the judgments they feared making when they defended openness as a creed. I will contend again that this does not make them media companies; we do not want them to clean and polish our internet as if the platforms were magazines and the world were China. We also must recognize the difficulty that scale brings to the task. But they now have little choice but to define and defend quality on their platforms and in the wider circles of impact they have on society in at least these areas:
Civility of the public conversation. Technology companies need to set and enforce standards for basic, civilized behavior. I still want to err on the side of openness but I see no reason to condone harassment and threats, bigotry and hate speech, and lies as incitement. (By these considerations, Infowars, for example, should be toast.)
An informed public conversation. Whether they wanted it or not, Facebook and Twitter particularly — and Google, YouTube, Snap and others as well — became the key mechanisms by which the public informs itself. Here, too, I’ll err on the side of openness but the platforms need to set standards for quality and credibility and build paths that lead users to both. They cannot walk away from the news because it is messy and inconvenient for we depend upon them now.
A healthy public sphere. One could argue that Facebook, Twitter, et al are the victims of manipulation by Russia, Cambridge Analytica, trolls, the alt-right, and conspiracy theorists. Except that they are not the bad guys’ real targets. We are. The platforms have an obligation to detect, measure, reveal, and counteract this manipulation. For a definition of manipulation, I give you C. Wright Mills in The Power Elite: “Authority is power that is explicit and more or less ‘voluntarily’ obeyed; manipulation is the ‘secret’ exercise of power, unknown to those who are influenced.”
Those are broad categories regarding the platforms’ external responsibilities. Internally they need to examine the ethical and moral bases for their decisions about what they do with user data, about what kinds of behaviors they reward and exploit, about the impact of their (and mass media’s) volume-based business model in fostering clickbait, and so on.
If the internet companies do not get their ethical and public acts together and quickly — making it clear that they are capable of governing their behavior for the greater good — I fear that the growing moral panic overtaking discussion of technology will lead to harmful legislation and legal precedent, hampering the internet’s potential for us all. In the rush to regulation, I worry that we will end up with more bad law (like Germany’s NetzDG hate-speech law and Europe’s right-to-be-forgotten court ruling — each of which, paradoxically, fights the platforms’ power by giving them more power to censor speech). My greater fear is that the regulatory mechanisms installed for good governments will be used by bad ones — and these days, what country does not worry about bad government? — leading to a lowest common denominator of freedom on the net.
So now let me pose a few challenges to the platforms’ critics.
On the current Cambridge Analytica story, I’ll agree that Facebook is foolish to split hairs about the use of the word “breach” even if Facebook is right that it wasn’t one. But it behooves us all to get the story right. Please read the complete threads (by opening each tweet) from Jay Pinho and Patrick Ruffini:
Note well that Facebook created mechanisms to benefit all campaigns, including Barack Obama’s. At the time, this was generally thought to be a good: using a social platform to enable civic participation. What went wrong in the meantime was (1) a researcher broke Facebook’s rules and shared data intended for research with his own company and then with Cambridge Analytica and (2) Donald Trump.
So do you think that Facebook should be forbidden from helping political campaigns? If we want television and the unlimited money behind it to lose influence in our elections, shouldn’t we desire more mechanisms to directly, efficiently, and relevantly reach voters by candidates and movements? If you agree, then what should be the limits of that? Should Facebook choose good and bad candidates as we expect them to choose good and bad news? I could argue in favor of banning or not aiding, say, a racist, admitted sexual abuser who incites hatred with conspiracy theories and lies. But what if such a person becomes the candidate of one of two major parties and ultimately the victor? Was helping candidates good before Trump and bad afterwards?
Before arguing that Facebook should never share data with anyone, know that there are many researchers who are dying to get their hands on this data to better understand how information and disinformation spread and how society is changing. I was among many such researchers some weeks ago at a valuable event on disinformation at the University of Pennsylvania (where, by the way, most of the academics in attendance scoffed at the idea that Cambridge Analytica actually had a secret sauce and any great power to influence elections … but now’s not the time for that argument). So what are the standards you expect from Facebook et al when it comes to sharing data? To whom? For what purposes? With what protections and restrictions?
I worry that if we reach a strict data crackdown — no data ever shared or used without explicit permission for the exact purpose — we will cut off the key to the only sustainable future for journalism and media that I see: one built on a foundation of delivering relevant and valuable services to people as individuals and members of communities, no longer as an anonymous mass. So please be careful about the laws, precedents, and unintended consequences you set.
When criticizing the platforms — and yes, they deserve criticism — I would ask you to examine whether their sins are unique. The advertising model we now blame for all the bad behavior we see on the net originated with and is still in use by mass media. We in news invented clickbait; we just called it headlines. We in media also set in motion the polarization that plagues society today with our chronic desire to pit simplistic stereotypes of red v. blue in news stories and cable-news arguments. Mass media is to blame for the idea of the mass and its results.
When demanding more of the platforms — as we should — I also would urge us to ask more of ourselves, to recognize our responsibility as citizens in encouraging a civil and informed conversation. The platforms should define bad behavior and enable us to report it. Then we need to report it. Then they need to act on what we report. And given the scale of the task, we need to be realistic in our expectations: On any reasonably open platform, someone will game the system and shit will rise — we know that. The question is how quickly and effectively the platforms respond.
I’ll repeat what I said in a recent post: No one — not platforms, not ad agencies and networks, not brands, not media companies, not government, not users — can stand back and say that disinformation, hate, and incivility are someone else’s problem to solve. We all bear responsibility. We all must help by bringing pressure and demanding quality; by collaborating to define what quality is; by fixing systems that enable manipulation and exploitation; and by contributing whatever resources we have (ad dollars to links to reporting bad actors).
Finally, let’s please base our actions and our pressure on platforms and government on research, facts, and data. Is Facebook polarizing or depolarizing society? We do not know enough about how Facebook and Twitter affected our election and we would be wise to know more before we think we can prescribe treatments that could be worse than the disease. That’s not to say there isn’t plenty we know that Facebook, Google, Twitter, media, and society need to fix now. But treating technology companies as the agents of ill intent that maliciously ruin our elections and split us apart and addict us to our devices is simplistic and ultimately won’t get us to the real problems we all must address.
Today I talked about this with my friend and mentor Jay Rosen — who four years ago wrote this wise piece about the kind of legitimacy platforms rely upon. Jay said we really don’t have the terms and concepts we need for this discussion. I agree.
I’ve been doing a lot of reading lately about the idea of the mass and its reputed manipulation at the hands of powerful and bad actors at other key moments in history: the French and American revolutions; the Industrial Revolution; the advent of mass media. At each wendepunkt, scholars and commentators worried about the impact of the change and struggled to find the language to describe and understand it. Now, in the midst of the digital revolution, we worry and struggle again. Facebook, Google, Twitter, and many of the people who created the internet we use today have no way to fully understand what their machines really do. Neither do we. I, for example, preached the openness that became the architecture and religion of the platforms without understanding the inevitability of that openness breeding trolls. We cannot use our analogs of the past to explain this future. That can be frightening. But I will continue to argue — optimist to a fault — that we can figure this out together.
Sometimes, things need to get bad before they can get good. Such is the case, I fear, with content, conversation, and advertising on the net. But I see signs of progress.
First let’s be clear: No one — not platforms, not ad agencies and networks, not brands, not media companies, not government, not users — can stand back and say that disinformation, hate, and incivility are someone else’s problem to solve. We all bear responsibility. We all must help by bringing pressure and demanding quality; by collaborating to define what quality is; by fixing systems that enable manipulation and exploitation; and by contributing whatever resources we have (ad dollars to links to reporting bad actors).
Last May, I wrote about fueling a flight to quality. Coming up on a year later, here’s what I see happening:
Twitter CEO Jack Dorsey recently posted a thread acknowledging his company’s responsibility for the health and civility of the public conversation and asking for help with a bunch of very knotty issues: balancing openness with civility, and free speech with identifying and stopping harassment and disinformation. It is an important step.
Facebook made what I now come to understand was an unintended but critical mistake at the start of News Feed when it threw all “public content” into one big tub with few means for identifying differences in quality. Like Twitter and like the net culture itself, Facebook valued openness and equality. But when some of that content — especially Russian disinformation and manipulation campaigns — got them in trouble, they threw out the entire tub of bathwater. Now they’re trying to bring some of the better babies back by defining and promoting quality news. This, too, involves many difficult questions about the definitions of quality and diversity. But when it is done, I hope that good content can stand out.
In that post last May, I wrote about how Google Search would thenceforth account for the reliability, authority, and quality of sources in ranking. Bravo. I believe we will see that as a critical moment in the development of the net. But as we see in the news about Logan Paul and Alex Jones on YouTube, there is still work to be done on the ad side of the company. A system that enables platforms to give audience and major brands to give financial support to the likes of Jones is broken. Can we start there?
Through the News Integrity Initiative,* we helped start an effort called Open Brand Safety to identify the worst, low-hanging, rotten fruit of disinformation sites to help advertisers shun them. It’s still just a bare beginning. But through it, we have seen that not-insignificant amounts of ad dollars still go to known crap sites.
That is why I’ve joined an effort to organize a meeting later this month, bringing together the many organizations trying to identify signals of quality v. crap with representatives from platforms, ad networks, ad agencies, brands, NGOs, and others. I do not believe the solution is one-size-fits-all black lists and white lists, for it is impossible to define trust and quality for everyone — Gerber and Red Bull have different standards for advertising, as they should. What I’ve been arguing for is a network made up of all these constituencies to share signals of quality or a lack of it so each company can use that information to inform decisions about ranking, promotion, ad buys, and so on. I’ll report more when this happens.
I’ve spoken with the people in these companies and I believe their sincerity in trying to tackle this problem. I also see the complexity of the issues involved. We all want to preserve the openness of our internet but we also have to acknowledge that that openness makes the net vulnerable to manipulation by bad actors. So, to start, we need to recognize, reveal, and counteract that manipulation while also identifying and supporting good content.
It is because I believe in the need for openness that I will continue to argue that the internet is not a medium and the platforms are not publishers. When the net is viewed as a next-generation medium like a newspaper or TV network, that brings perilous presumptions — namely that the net should be edited and packaged like a media property. I don’t want that. I treasure the openness and freedom that allow me to blog and say whatever I want and to find and hear voices I never was able to hear through the eye of media’s needle.
I also think it’s important to recognize that scale is a double-edged sword: It is the scale of the net and the platforms that enables anyone anywhere to speak to anyone else without need of capital, technical expertise, or permission. But it is also scale that makes the problems being addressed here so difficult to attack. No, the platforms should not — I do not want them to — pass judgment on everything that is posted on the net through them. I do not want the platforms to be my or your editor, to act like media or to build a Chinese internet.
But the platforms — and media companies like them — can no longer sit back and argue that they are just mirrors to society. Society warped and cracked itself to exploit their weaknesses. Facebook is not blameless in enabling Russian disinformation campaigns; YouTube is not blameless in creating a mechanism that allows and pays for Alex Jones to spew his bile; Twitter is not blameless in helping to foster incivility. Add to that: news organizations are not blameless in helping to spread disinformation and give it attention, and in fueling polarization and incivility. The ad industry is not blameless in helping to support the manipulators, spammers, trolls, and haters. Law enforcement is not blameless when it does not alert platforms and media companies to intelligence about bad actors. And you — yes, you and I — are not blameless when we share, click on, laugh at, encourage, and fail to report the kind of behavior that threatens our net.
Every effort I mention here is just a beginning. Every one of them is entangled with knotty questions. We need to help each other tackle this problem and protect our net. We need to discuss our mutual, moral responsibility to society and to an informed, civil, and productive public conversation.
There is much more to be done: Journalists and news organizations need to help the platforms define quality (wisely but generously, to continue to encourage new and diverse voices). Journalists should also get smarter about not being exploited by manipulators. And news organizations need to do much more to build bridges between communities in conflict to foster understanding and empathy. The platforms, researchers, law enforcement, and NGOs should share alerts about manipulation they see to cut the bad guys off at the pass. Ad networks and platforms have to make it possible for advertisers to support the quality, not the crap (and not claim ignorance of where their dollars go). Consumers — now banded together by campaigns like Sleeping Giants and Grab Your Wallet — need to continue to put pressure on platforms, brands, agencies, and networks and thereby give them cover so they are empowered to do what’s right.
Above all, let’s please remember that the internet is not ruined just because there are a few assholes on it. This, too, is why I insist on not seeing the net as a medium. It is Times Square. On Times Square, you can find pickpockets and bad Elmos and idiots, to be sure. But you also find many more nice tourists from Missoula and Mexico City and New Yorkers trying to dodge them on their way to work. Let’s bring some perspective to the media narrative about the net today. Please go take a look at your Facebook or Twitter or Instagram feeds or any Google search. I bet you will not find them infested with nazis and Russians and trolls, oh, my. I bet you will still find, on the whole, decent people like you and me. I fear that if we get carried away by moral panic we will end up not with a bustling Times Square of an internet but with China or Singapore or Iran as the model for a controlled digital future.
The net is good. We can and should make it better. We must protect it. That’s what these efforts are about.
*Disclosure: NII is funded by Facebook, Craig Newmark, the Ford Foundation, AppNexus, and others.
Facebook wants to build community. Ditto media. Me, too.
But I fear we are all defining and measuring community too shallowly and transiently. Community is not conversation — though that is a key metric Facebook will use to measure its success. Neither is community built on content: gathering around it, paying attention to it, linking to it, or talking about it — that is how media brands are measuring engagement. Conversation and content are tools or byproducts of real community.
Community means connecting people intimately and over time to share interests, worldviews, concerns, needs, values, empathy, and action. Facebook now says it wants to “prioritize posts that spark conversations and meaningful interactions between people.” I think that should be meaningful, lasting, and trusting interactions among people, plural. Think of community not as a cocktail party (or drunken online brawl) where friends and strangers idly chat. Instead, think of community as a club one chooses to join, the sorts of clubs that society has been losing since my parents’ generation grew old. Meetup has been trying to rebuild them. So should we all.
What if, instead of just enabling people to share and talk about something — content — Facebook created the means for people to organize a modern, digital Rotary Club of concerned citizens who want to improve their circumstances for neighbors, geographic or virtual? Or provided pews and pulpits where people can flock as congregations of shared belief. Or opened the basement in that house of worship where addicts come to share their stories and needs. Or created the tools for a community of mutual support to reach out and lift each other up. Or made a classroom where people come to share knowledge and skills. Or created the means to build a craft union or guild for professionals to share and negotiate standards for quality. Or built the tools for citizens to join together in a positive social movement…. And what if journalism served these communities by informing their conversations and actions, by reflecting their desires, by answering their information needs, by convening them into dialogue, by helping to resolve instead of inflame conflict?
That is community. That is belonging. That is what Facebook and media should be enabling. I’ll reprise my definition of journalism from the other day as the imperative Facebook and news share:
Convening communities into civil, informed, and productive conversation, reducing polarization and building trust through helping citizens find common ground in facts and understanding.
How can we convene communities if we don’t really know what they are, if we are satisfied with mere conversation — yada, yada, yada — as a weak proxy for community?
While doing research for another project on the state of the mass, I recently read the 1959 book by sociologist William Kornhauser, The Politics of Mass Society, and reread Raymond Williams’ 1958 book, Culture & Society. I found lessons for both Facebook and media in their definitions of connected community vs. anonymous mass.
Kornhauser worries that “there is a paucity of independent groups” [read: communities] to protect people “from manipulation and mobilization.” In a proper pluralist and diverse society, he argues, “the population is unavailable [for such manipulation] in that people possess multiple commitments to diverse and autonomous groups.” To communities. “When people are divorced from their communities and work, they are free to reunite in new ways.” They are fodder for trolls and totalitarians.
Thus we find ourselves working under a false definition of community — accepting any connection, any conversation, any link as qualification — and we end up with something that looks like a mob or a mass: singular, thin, and gross. “The mass man substitutes an undifferentiated image of himself for an individualized one,” Kornhauser says; “he answers the perennial question of ‘Who am I?’ with the formula ‘I am like everyone else.’” He continues:
The autonomous man respects himself as an individual, experiencing himself as the bearer of his own power and having the capacity to determine his life and to affect the lives of his fellows…. Non-pluralist society lacks the diversity of social worlds to nurture and sustain independent persons…. [I]n pluralist society there are alternative loyalties (sanctuaries) which do not place the nonconformist outside the social pale.
In other words, when you cannot find a community to identify with, you are anonymously lumped in with — or lump yourself in with — the mob or the mass. But when you find and join with other people with whom you share affinity, you have the opportunity to express your individuality. That is the lovely paradox of community: real community supports the individual through joining, while the mass robs us of our individuality by default. The internet, I still believe, is built so we can both express our individuality and join with other individuals in communities. That is why I value sharing and connection.
And that is why I have urged Facebook — and media — to find the means to introduce us to each other, to make strangers less strange, to rob the trolls and totalitarians of the power of the Other. How? By creating safe spaces where people can reveal themselves and find fellows; by creating homes for true communities; and by connecting them.
That is what might get us out of this mess of Trumpian, Putinistic, fascistic, racist, misogynistic, exclusionary hate and fear and rule by the mob. There’s nothing easy in that task for platforms or for journalists. But for God’s sake, we must try.
Now you might say that what is good for the goose is good for the nazi: that the same tools that are used to build my hip, digital Rotary Club can be used by white supremacists to organize their riot in Charlottesville or advertise their noxious views to the vulnerable. Technology is neutral, eh? Perhaps, but society is not. Society judges by negotiating and setting standards and norms. A healthy society or platform or media or brand should never tolerate, distribute, or pay for the nazi and his hate. This means that Facebook — like Google and like the media — will need to give up the pretense of neutrality in the face of manipulation and hate. They must work to bring communities together and respect the diverse individuals in them.
“An atomized society invites the totalitarian movement,” Kornhauser warns. In mass society, the individual who does not conform to the group is the cuck; in totalitarian society, he is a criminal. In pluralist, open, and tolerant society, the individual who does not conform to someone else’s definition of the whole is free to find his or her community and self. That is the connected net society we must help build. Or as Kornhauser puts it, in terms we can understand today: “A pluralist society supports a liberal democracy, whereas a mass society supports a populist democracy.” Trump and his one-third base are built on populism, while the two-thirds majority (not “the mass”) of the nation disapproves. But our platforms and our media are not built to support that majority. They pay attention to Trump’s base because mass media is built for the mass and conflict and platforms are built as if all connections are the same.
In the end, Kornhauser is optimistic, as am I. “[T]hese conditions of modern life carry with them both the heightened possibility of social alienation and enhanced opportunities for the creation of new forms of association.” We can use Facebook, Twitter, et al. to snap and snark at each other or to find ourselves in others and join together. The platforms and media can and should help us — but the choice, once offered, is ours to take.
I’ll end with these words of sociologist Raymond Williams:
If our purpose is art, education, the giving of information or opinion, our interpretation will be in terms of the rational and interested being. If, on the other hand, our purpose is manipulation — the persuasion of a large number of people to act, feel, think, know, in certain ways — the convenient formula will be that of the masses….
To rid oneself of the illusion of the objective existence of ‘the masses’, and to move towards a more actual and more active conception of human beings and relationships, is in fact to realize a new freedom.