
The Flight to Quality is on the Runway

Sometimes, things need to get bad before they can get good. Such is the case, I fear, with content, conversation, and advertising on the net. But I see signs of progress.

First let’s be clear: No one — not platforms, not ad agencies and networks, not brands, not media companies, not government, not users — can stand back and say that disinformation, hate, and incivility are someone else’s problem to solve. We all bear responsibility. We all must help by bringing pressure and demanding quality; by collaborating to define what quality is; by fixing systems that enable manipulation and exploitation; and by contributing whatever resources we have (ad dollars to links to reporting bad actors).

Last May, I wrote about fueling a flight to quality. Coming up on a year later, here’s what I see happening:

Platforms:

  • Twitter CEO Jack Dorsey recently posted a thread acknowledging his company’s responsibility for the health and civility of the public conversation and asking for help with a bunch of very knotty issues: balancing openness with civility, and free speech with identifying and stopping harassment and disinformation. It is an important step.
  • Facebook made what I have come to understand was an unintended but critical mistake at the start of News Feed when it threw all “public content” into one big tub with few means for identifying differences in quality. Like Twitter and like the net culture itself, Facebook valued openness and equality. But when some of that content — especially Russian disinformation and manipulation campaigns — got them in trouble, they threw out the entire tub of bathwater. Now they’re trying to bring some of the better babies back by defining and promoting quality news. This, too, involves many difficult questions about the definitions of quality and diversity. But when it is done, I hope that good content can stand out.
  • In that post last May, I wrote about how Google Search would thenceforth account for the reliability, authority, and quality of sources in ranking. Bravo. I believe we will see that as a critical moment in the development of the net. But as we see in the news about Logan Paul and Alex Jones on YouTube, there is still work to be done on the ad side of the company. A system that enables platforms to give an audience, and major brands to give financial support, to the likes of Jones is broken. Can we start there?

Advertising:

  • Through the News Integrity Initiative,* we helped start an effort called Open Brand Safety to identify the worst, low-hanging, rotten fruit of disinformation sites to help advertisers shun them. It’s still just a bare beginning. But through it, we have seen that not-insignificant amounts of ad dollars still go to known crap sites.
  • That is why I’ve joined an effort to organize a meeting later this month, bringing together the many organizations trying to identify signals of quality v. crap with representatives from platforms, ad networks, ad agencies, brands, NGOs, and others. I do not believe the solution is one-size-fits-all blacklists and whitelists, for it is impossible to define trust and quality for everyone — Gerber and Red Bull have different standards for advertising, as they should. What I’ve been arguing for is a network made up of all these constituencies to share signals of quality or a lack of it so each company can use that information to inform decisions about ranking, promotion, ad buys, and so on. I’ll report more when this happens.

I’ve spoken with the people in these companies and I believe their sincerity in trying to tackle this problem. I also see the complexity of the issues involved. We all want to preserve the openness of our internet but we also have to acknowledge that that openness makes the net vulnerable to manipulation by bad actors. So, to start, we need to recognize, reveal, and counteract that manipulation while also identifying and supporting good content.

It is because I believe in the need for openness that I will continue to argue that the internet is not a medium and the platforms are not publishers. When the net is viewed as a next-generation medium like a newspaper or TV network, that brings perilous presumptions — namely that the net should be edited and packaged like a media property. I don’t want that. I treasure the openness and freedom that allow me to blog and say whatever I want and to find and hear voices I never was able to hear through the eye of media’s needle.

I also think it’s important to recognize that scale is a double-edged sword: It is the scale of the net and the platforms that enables anyone anywhere to speak to anyone else without need of capital, technical expertise, or permission. But it is also scale that makes the problems being addressed here so difficult to attack. No, the platforms should not — I do not want them to — pass judgment on everything that is posted on the net through them. I do not want the platforms to be my or your editor, to act like media or to build a Chinese internet.

But the platforms — and media companies like them — can no longer sit back and argue that they are just mirrors to society. Society warped and cracked itself to exploit their weaknesses. Facebook is not blameless in enabling Russian disinformation campaigns; YouTube is not blameless in creating a mechanism that allows and pays for Alex Jones to spew his bile; Twitter is not blameless in helping to foster incivility. Add to that: news organizations are not blameless in helping to spread disinformation and give it attention, and in fueling polarization and incivility. The ad industry is not blameless in helping to support the manipulators, spammers, trolls, and haters. Law enforcement is not blameless when it does not alert platforms and media companies to intelligence about bad actors. And you — yes, you and I — are not blameless when we share, click on, laugh at, encourage, and fail to report the kind of behavior that threatens our net.

Every effort I mention here is just a beginning. Every one of them is entangled with knotty questions. We need to help each other tackle this problem and protect our net. We need to discuss our mutual, moral responsibility to society and to an informed, civil, and productive public conversation.

There is much more to be done: Journalists and news organizations need to help the platforms define quality (wisely but generously to continue to encourage new and diverse voices). Journalists should also get smarter about not being exploited by manipulators. And news organizations need to do much more to build bridges between communities in conflict to foster understanding and empathy. The platforms, researchers, law enforcement, and NGOs should share alerts about manipulation they see to cut the bad guys off at the pass. Ad networks and platforms have to make it possible for advertisers to support the quality not the crap (and not claim ignorance of where their dollars go). Consumers — now banded together by campaigns like Sleeping Giants and Grab Your Wallet — need to continue to put pressure on platforms, brands, agencies, and networks and thereby give them cover so they are empowered to do what’s right.

Above all, let’s please remember that the internet is not ruined just because there are a few assholes on it. This, too, is why I insist on not seeing the net as a medium. It is Times Square. On Times Square, you can find pickpockets and bad Elmos and idiots, to be sure. But you also find many more nice tourists from Missoula and Mexico City and New Yorkers trying to dodge them on their way to work. Let’s bring some perspective to the media narrative about the net today. Please go take a look at your Facebook or Twitter or Instagram feeds or any Google search. I bet you will not find them infested with nazis and Russians and trolls, oh, my. I bet you will still find, on the whole, decent people like you and me. I fear that if we get carried away by moral panic we will end up not with a bustling Times Square of an internet but with China or Singapore or Iran as the model for a controlled digital future.

The net is good. We can and should make it better. We must protect it. That’s what these efforts are about.

*Disclosure: NII is funded by Facebook, Craig Newmark, the Ford Foundation, AppNexus, and others.

What Makes a Community?

Facebook wants to build community. Ditto media. Me, too.

But I fear we are all defining and measuring community too shallowly and transiently. Community is not conversation — though that is a key metric Facebook will use to measure its success. Neither is community built on content: gathering around it, paying attention to it, linking to it, or talking about it — that is how media brands are measuring engagement. Conversation and content are tools or byproducts of real community.

Community means connecting people intimately and over time to share interests, worldviews, concerns, needs, values, empathy, and action. Facebook now says it wants to “prioritize posts that spark conversations and meaningful interactions between people.” I think that should be meaningful, lasting, and trusting interactions among people, plural. Think of community not as a cocktail party (or drunken online brawl) where friends and strangers idly chat. Instead, think of community as a club one chooses to join, the sorts of clubs that society has been losing since my parents’ generation grew old. Meetup has been trying to rebuild them. So should we all.

What if instead of just enabling people to share and talk about something — content — Facebook created the means for people to organize a modern, digital Rotary Club of concerned citizens who want to improve their circumstances for neighbors, geographic or virtual? Or it provided pews and pulpits where people can flock as congregations of shared belief. Or it opened the basement in that house of worship where addicts come to share their stories and needs. Or it created the tools for a community of mutual support to reach out and lift each other up. Or it made a classroom where people come to share knowledge and skills. Or it created the means to build a craft union or guild for professionals to share and negotiate standards for quality. Or it built the tools for citizens to join together in a positive social movement…. And what if journalism served these communities by informing their conversations and actions, by reflecting their desires, by answering their information needs, by convening them into dialogue, by helping to resolve instead of inflame conflict?

That is community. That is belonging. That is what Facebook and media should be enabling. I’ll reprise my definition of journalism from the other day as the imperative Facebook and news share:

Convening communities into civil, informed, and productive conversation, reducing polarization and building trust through helping citizens find common ground in facts and understanding.

How can we convene communities if we don’t really know what they are, if we are satisfied with mere conversation — yada, yada, yada — as a weak proxy for community?

While doing research for another project on the state of the mass, I recently read the 1959 book by sociologist William Kornhauser, The Politics of Mass Society, and reread Raymond Williams’ 1958 book, Culture & Society. I found lessons for both Facebook and media in their definitions of connected community vs. anonymous mass.

Kornhauser worries that “there is a paucity of independent groups” [read: communities] to protect people “from manipulation and mobilization.” In a proper pluralist and diverse society, he argues, “the population is unavailable [for such manipulation] in that people possess multiple commitments to diverse and autonomous groups.” To communities. “When people are divorced from their communities and work, they are free to reunite in new ways.” They are feed for trolls and totalitarians.

Thus we find ourselves working under a false definition of community — accepting any connection, any conversation, any link as qualification — and we end up with something that looks like a mob or a mass: singular, thin, and gross. “The mass man substitutes an undifferentiated image of himself for an individualized one,” Kornhauser says; “he answers the perennial question of ‘Who am I?’ with the formula ‘I am like everyone else.’” He continues:

The autonomous man respects himself as an individual, experiencing himself as the bearer of his own power and having the capacity to determine his life and to affect the lives of his fellows…. Non-pluralist society lacks the diversity of social worlds to nurture and sustain independent persons…. [I]n pluralist society there are alternative loyalties (sanctuaries) which do not place the nonconformist outside the social pale.

In other words, when you cannot find a community to identify with, you are anonymously lumped in with — or lump yourself in with — the mob or the mass. But when you find and join with other people with whom you share affinity, you have the opportunity to express your individuality. That is the lovely paradox of community: real community supports the individual through joining while the mass robs us of our individuality by default. The internet, I still believe, is built so we can both express our individuality and join with other individuals in communities. That is why I value sharing and connection.

And that is why I have urged Facebook — and media — to find the means to introduce us to each other, to make strangers less strange, to rob the trolls and totalitarians of the power of the Other. How? By creating safe spaces where people can reveal themselves and find fellows; by creating homes for true communities; and by connecting them.

That is what might get us out of this mess of Trumpian, Putinistic, fascistic, racist, misogynistic, exclusionary hate and fear and rule by the mob. There’s nothing easy in that task for platforms or for journalists. But for God’s sake, we must try.

Now you might say that what is good for the goose is good for the nazi: that the same tools that are used to build my hip, digital Rotary Club can be used by the white supremacists to organize their riot in Charlottesville or advertise their noxious views to the vulnerable. Technology is neutral, eh? Perhaps, but society is not. Society judges by negotiating and setting standards and norms. A healthy society or platform or media or brand should never tolerate, distribute, or pay for the nazi and his hate. This means that Facebook — like Google and like the media — will need to give up the pretense of neutrality in the face of manipulation and hate. They must work to bring communities together and respect the diverse individuals in them.

“An atomized society invites the totalitarian movement,” Kornhauser warns. In mass society, the individual who does not conform to the group is the cuck; in totalitarian society, he is a criminal. In pluralist, open, and tolerant society, the individual who does not conform to someone else’s definition of the whole is free to find his or her community and self. That is the connected net society we must help build. Or as Kornhauser puts it, in terms we can understand today: “A pluralist society supports a liberal democracy, whereas a mass society supports a populist democracy.” Trump and his one-third base are built on populism, while the two-thirds majority (not “the mass”) of the nation disapproves. But our platforms and our media are not built to support that majority. They pay attention to Trump’s base because mass media is built for the mass and conflict and platforms are built as if all connections are the same.

In the end, Kornhauser is optimistic, as am I. “[T]hese conditions of modern life carry with them both the heightened possibility of social alienation and enhanced opportunities for the creation of new forms of association.” We can use Facebook, Twitter, et al. to snap and snark at each other or to find ourselves in others and join together. The platforms and media can and should help us — but the choice, once offered, is ours to take.

I’ll end with these words of sociologist Raymond Williams:

If our purpose is art, education, the giving of information or opinion, our interpretation will be in terms of the rational and interested being. If, on the other hand, our purpose is manipulation — the persuasion of a large number of people to act, feel, think, know, in certain ways — the convenient formula will be that of the masses….

To rid oneself of the illusion of the objective existence of ‘the masses’, and to move towards a more actual and more active conception of human beings and relationships, is in fact to realize a new freedom.

Facebook’s changes

[Disclosure: I raised $14 million from Facebook, the Craig Newmark and Ford foundations, and others to start the News Integrity Initiative. I personally receive no money from and am independent of Facebook.]

So, here’s what’s on my mind about Facebook’s changes, just announced by Mark Zuckerberg, to “prioritize posts that spark conversations and meaningful interactions between people” over content from media and brands.

Yes, I’m worried. Let me start there.

I’m worried that now that Facebook has become a primary distributor of news and information in society, it cannot abdicate its responsibility — no matter how accidentally that role was acquired — to help inform our citizenry.

I’m worried that news and media companies — convinced by Facebook (and in some cases by me) to put their content on Facebook or to pivot to video — will now see their fears of having the rug pulled out from under them realized, and that they will shrink back from taking journalism to the people where they are having their conversations because there is no money to be made there.

I’m worried for Facebook and Silicon Valley that both media and politicians will use this change to stir up the moral panic about technology I see rising in Europe and now in America.

But…

I am hopeful that Facebook’s effort to encourage “meaningful interactions” could lead to greater civility in our conversations, which society desperately needs. The question is: Will Facebook value and measure civility, intelligence, and credibility or mere conversation? We know what conversation alone brings us: comments and trolls. What are “meaningful interactions?”

And…

I wish that Facebook would fuel and support a flight to quality in news. Facebook has lumped all so-called “public content” into one, big, gnarly bucket. It is dying to get rid of the shit content that gets them into political and PR trouble and that degrades the experience on Facebook and in our lives. Fine. But they must not throw the journalistic baby out with the trolly bathwater. Facebook needs to differentiate and value quality content — links to The Washington Post, The New York Times, The Guardian, and thousands of responsible, informative, useful old and new news outlets around the world.

I wish that Facebook would make clear that it will not use this change to exploit media companies for more advertising revenue when the goal is to inform the public.

I wish that Facebook would not just connect us with the people we know and agree with — our social filter bubbles — but also would devote effort to making strangers less strange, to robbing the demagogues and hate mongers of their favorite weapon: the Other. That, I firmly believe, is the most valuable thing Facebook could do to combat polarization in our world: creating safe spaces where people can share their lives and perspectives with others, helping to build bridges among communities.

I wish that Facebook would work with journalists to help them learn how to use Facebook natively to inform the public conversation where and when it occurs. Until now, Facebook has tried to suck up to media companies (and by extension politicians) by providing distribution and monetization opportunities through Instant Articles and video. Oh, well. So much for that. Now I want to see Facebook help news media make sharable journalism and help them make money through that. But I worry that news organizations will be gun-shy of even trying, sans rug.

So…

I have been rethinking my definition of journalism. It used to be: helping communities organize their knowledge to better organize themselves. That was an information-based definition.

After our elections in the U.S., the U.K., Austria, Germany, and elsewhere, I have seen that civility is a dire need and a precondition for journalism and an informed society. So now I have a new definition for journalism, an imperative that I believe news organizations share with Facebook (if it is serious about building communities).

My new definition of journalism: convening communities into civil, informed, and productive conversation, reducing polarization and building trust through helping citizens find common ground in facts and understanding.

Will Facebook’s changes help or hurt that cause? We shall see.

LATER: One more thought overnight on what publishers and Facebook should do now: Facebook makes it clear that the best way to get distribution there is for users to share and talk about your links. Conversation is now a key measure of value in Facebook.

The wrong thing to do would be to make and promote content that stirs up short and nasty conversation: “Asshole.” “Fuck you.” “No, fuck you, troll.” “Cuck.” “Nazi.” You know the script. I don’t want to see media move from clickbait to commentbait. Facebook won’t value that. No one will.

The right thing to do — as I have been arguing for almost two years — is to bring journalism to people on Facebook (and Twitter and Snap and Instagram and YouTube…) natively as part of people’s conversations. The easiest example, which I wrote about here, is the meme that someone passes along because it speaks for them, because it adds facts and perspectives to their conversation. There are many other forms and opportunities to make shareable, conversational journalism; a colleague is planning to create a course at CUNY Journalism School around just that.

The problem that will keep publishers from doing this is that there are few ways to monetize using Facebook as Facebook should be used. I’ve been arguing to Facebook for more than two years that they should see Jersey Shore Hurricane News as a model of native news on Facebook — with its loyal members contributing and conversing about their community — and that they should help its creator, Justin Auciello, make money there. Instead, Facebook has had to play to large, established publishers’ desires to distribute the content they have. So Facebook created formats for self-contained content — Instant Articles and videos — with the monetization within. Facebook and publishers painted themselves into a corner by trying to transpose old forms of media into a new reality. Now they’re admitting that doesn’t work.

But journalism and news clearly do have a place on Facebook. Many people learn what’s going on in the world in their conversations there and on the other social platforms. So we need to look at how to create conversational news. The platforms need to help us make money that way. It’s good for everybody, especially for citizens.

Moral Authority as a Platform

[See my disclosures below.*]

Since the election, I have been begging the platforms to be transparent about efforts to manipulate them — and thus the public. I wish they had not waited so long, until they were under pressure from journalists, politicians, and prosecutors. I wish they would realize the imperative to make these decisions based on higher responsibility. I wish they would see the need and opportunity to thus build moral authority.

Too often, technology companies hide behind the law as a minimal standard. At a conference in Vienna called Darwin’s Circle, Palantir CEO Alexander Karp (an American speaking impressive German) told Austrian Chancellor Christian Kern that he supports the primacy of the state and that government must set moral standards. Representatives of European institutions were pleasantly surprised not to be challenged with Silicon Valley libertarian dogma. But as I thought about it, I came to see that Karp was copping out, delegating his and his company’s ethical responsibility to the state.

At other events recently, I’ve watched journalists quiz representatives of platforms about what they reveal about manipulation and also what they do and do not distribute and promote on behalf of the manipulators. Again I heard the platforms duck under the law — “We follow the laws of the nations we are in,” they chant — while the journalists pushed them for a higher moral standard. So what is that standard?

Transparency should be easy. If Facebook, Twitter, and Google had revealed that they were the objects of Russian manipulation as soon as they knew it, then the story would have been Russia. Instead the story is the platforms.

I’m glad that Mark Zuckerberg has said that in the future, if you see a political ad in your feed, you will be able to link to the page or user that bought it. I’d like all the platforms to go further:

  • First, internet platforms should make every political ad available for public inspection, setting a standard that goes far beyond the transparency required of political advertising on broadcast and certainly beyond what we can find out about dark political advertising in direct mail and robocalls. Why shouldn’t the platforms lead the way?
  • Second, I think it is critical that the platforms reveal the targeting criteria used for these political ads so we can see what messages (and lies and hate) are aimed at whom.
  • Third, I’d like to see all this data made available to researchers and journalists so the public — the real target of manipulation — can learn more about what is aimed at them.

The reason to do this is not just to avoid bad PR or merely to follow the law, to meet minimal expectations. The reason to do all this is to establish public responsibility commensurate with the platforms’ roles as the proprietors of so much of the internet and thus the future.

In What Would Google Do?, I praised the Google founders’ admonition to their staff — “Don’t be evil” — as a means to keep the company honest. The cost of doing evil in business has risen as customers have gained the ability to talk about a company and as anyone could move to a competitor with a click. But that, too, was a minimal standard. I now see that Google — and its peers — should have evolved to a higher standard:

“Do good. Be good.”

I don’t buy the arguments of cynics who say it is impossible for a corporation to be anything other than greedy and evil and that we should give up on them. I believe in the possibility and wisdom of enlightened self-interest and I believe we can hold these companies to an expectation of public spirit if not benevolence. I also take Zuck at his word when he asks forgiveness “for the ways my work was used to divide people rather than bring us together,” and vows to do better. So let us help him define better.

The caveats are obvious: I agree with the platforms that we do not want them to become censors and arbiters of right v. wrong; to enforce prohibitions determined by the lowest common denominator of offensiveness; to set precedents that will be exploited by authoritarian governments; to make editorial judgments.

But doing good and being good as a standard led Google to its unsung announcement last April that it would counteract manipulation of search ranking by taking account of the reliability, authority, and quality of sources. Thus Google took the side of science over crackpot conspirators, because it was the right thing to do. (But then again, I just saw that Alternet complains that it and other advocacy and radical sites are being hit hard by this change. We need to make clear that fighting racism and hate is not to be treated like spreading racism and hate. We must be able to have an open discussion about how these standards are being executed.)

Doing good and being good would have led Facebook to transparency about Russian manipulation sooner.

Doing good and being good would have led Twitter to devote resources to understanding and revealing how it is being used as a tool of manipulation — instead of merely following Facebook’s lead and disappointing Congressional investigators. More importantly, I believe a standard of doing good and being good would lead Twitter to set a higher bar of civility and take steps to stop the harassment, stalking, impersonation, fraud, racism, misogyny, and hate directed at its own innocent users.

Doing good and being good would also lead journalistic institutions to examine how they are being manipulated, how they are allowing Russians, trolls, and racists to set the agenda of the public conversation. It would lead us to decide what our real job is and what our outcomes should be in informing productive and civil civic conversation. It would lead us to recognize new roles and responsibilities in convening communities in conflict into uncomfortable but necessary conversation, starting with listening to those communities. It should lead us to collaborate with and set an example for the platforms, rather than reveling in schadenfreude when they get in trouble. It should also lead us all — media companies and platforms alike — to recognize the moral hazards embedded in our business models.

I don’t mean to oversimplify even as I know I am. I mean only to suggest that we must raise up not only the quality of public conversation but also our own expectations of ourselves in technology and media, of our roles in supporting democratic deliberation and civil (all senses of the word) society. I mean to say that this is the conversation we should be having among ourselves: What does it mean to do and be good? What are our standards and responsibilities? How do we set them? How do we live by them?

Building and then operating from that position of moral authority becomes the platform more than the technology. See how long it is taking news organizations to learn that they should be defined not by their technology — “We print content” — but instead by their trust and authority. That must be the case for technology companies as well. They aren’t just code; they must become their missions.


* Disclosure: The News Integrity Initiative, operated independently at CUNY’s Tow-Knight Center, which I direct, received funding from Facebook, the Craig Newmark Philanthropic Fund, and the Ford Foundation and support from the Knight and Tow foundations, Mozilla, Betaworks, AppNexus, and the Democracy Fund.

Real News

I’m proud that we at CUNY’s Graduate School of Journalism and the Tow-Knight Center just announced the creation of the News Integrity Initiative, charged with finding ways to better inform the public conversation and funded thus far with $14 million by nine foundations and companies, all listed on the press release. Here I want to tell its story.

This began after the election when my good friend Craig Newmark — who has been generously supporting work on trust in news — challenged us to address the problem of mis- and disinformation. There is much good work being done in this arena — from the First Draft Coalition, the Trust Project, Dan Gillmor’s work at ASU bringing together news literacy efforts, and the list goes on. Is there room for more?

I saw these needs and opportunities:

  • First, much of the work to date is being done from a media perspective. I want to explore this issue from a public perspective — not just about getting the public to read our news but more about getting media to listen to the public. This is the philosophy behind the Social Journalism program Carrie Brown runs at CUNY, which is guided by Jay Rosen’s summary of James Carey: “The press does not ‘inform’ the public. It is ‘the public’ that ought to inform the press. The true subject matter of journalism is the conversation the public is having with itself.” We must begin with the public conversation and must better understand it.
  • Second, I saw that the fake news brouhaha was focusing mainly on media and especially on Facebook — as if they caused it and could fix it. I wanted to expand the conversation to include other affected and responsible parties: ad agencies, brands, ad networks, ad technology, PR, politics, civil society.
  • Third, I wanted to shift the focus of our deliberations from the negative to the positive. In this tempest, I see the potential for a flight to quality — by news users, advertisers, platforms, and news organizations. I want to see how we can exploit this moment.
  • Fourth, because there is so much good work — and there are so many good events (I spent about eight weeks of weekends attending emergency fake news conferences) — we at the Tow-Knight Center wanted to offer to convene the many groups attacking this problem so we could help everyone share information, avoid duplication, and collaborate. We don’t want to compete with any of them, only to help them. At Tow-Knight, under the leadership of GM Hal Straus, we have made the support of professional communities of practice — so far around product development, audience development and membership, commerce, and internationalization — key to our work; we want to bring those resources to the fake news fight.

My dean and partner in crime, Sarah Bartlett, and I formulated a proposal for Craig. He quickly and generously approved it with a four-year grant.

And then my phone rang. Or rather, I got a Facebook message from the ever-impressive Áine Kerr, who manages journalism partnerships there. Facebook had recently begun working with fact-checking agencies to flag suspect content; it had started its Journalism Project; and it had held a series of meetings with news organizations to share what it was doing to improve the lot of news on the platform.

Áine said Facebook was looking to do much more in collaboration with others, and that conversation led to a grant to fund research, projects, and convenings under the auspices of what Craig had begun.

Soon, more funders joined: John Borthwick of Betaworks has been a supporter of our work since we collaborated on a call to cooperate against fake news. Mozilla agreed to collaborate on projects. Darren Walker at the Ford Foundation generously offered his support, as did the two funders of the center I direct, the Knight and Tow foundations. Brian O’Kelley, founder of AppNexus, and the Democracy Fund joined as well. More than a dozen additional organizations — all listed in the release — said they would participate as well. We plan to work with many more organizations as advisers, funders, and grantees.


Now let me get right to the questions I know you’re ready to tweet my way, particularly about one funder: Have I sold out to Facebook? Well, in the end, you will be the judge of that. For a few years now, I have been working hard to build bridges between the publishers and the platforms, and I’ve had the audacity to tell both Facebook and Google what I think they should do for journalism. So when Facebook knocks on the door and says it wants to help journalism, who am I to say I won’t help them help us? When Google started its Digital News Initiative in Europe, I similarly embraced the effort, and I have been impressed at the impact it has had on building a productive relationship between Google and publishers.

Sarah and I worked hard in negotiations to assure CUNY’s and our independence. Facebook — and the other funders and participants present and future — are collaborators in this effort. But we designed the governance to assure that neither Facebook nor any other funder would have direct control over grants and to make sure that we would not be put in a position of doing anything we did not want to do. Note also that I am personally receiving no funds from Facebook, just as I’ve never been paid by Google (though I have had travel expenses reimbursed). We hope to also work with multiple platforms in the future; discussions are ongoing. I will continue to criticize and defend them as deserved.

My greatest hope is that this Initiative will provide the opportunity to work with Facebook and other platforms on reimagining news, on supporting innovation, on sharing data to study the public conversation, and on supporting news literacy broadly defined.


The work has already begun. A week and a half ago, we convened a meeting of high-level journalists and representatives from platforms (both Facebook and Google), ad agencies, brands, ad networks, ad tech, PR, politics, researchers, and foundations for a Chatham-House-rule discussion about propaganda and fraud (née “fake news”). We looked at research that needs to be done and at public education that could help.

The meeting ended with a tangible plan. We will investigate gathering and sharing many sets of signals about both quality and suspicion that publishers, platforms, ad networks, ad agencies, and brands can use — according to their own formulae — to decide not just what sites to avoid but better yet what journalism to support. That’s the flight to quality I have been hoping to see. I would like us to support this work as a first task of our new Initiative.

We will fund research. I want to start by learning what we already know about the public conversation: what people share, what motivates them to share it, what can have an impact on informing the conversation, and so on. We will reach out to the many researchers working in this field — danah boyd (read her latest!) of Data & Society, Zeynep Tufekci of UNC, Claire Wardle of First Draft, Duncan Watts and David Rothschild of Microsoft Research, Kate Starbird (who just published an eye-opening paper on alternative narratives of news) of the University of Washington, Rasmus Kleis Nielsen of the Reuters Institute, Charlie Beckett of POLIS-LSE, and others. I would like us to examine what it means to be informed so we can judge the effectiveness of our — indeed, of journalism’s — work.

We will fund projects that bring journalism to the public and the conversation in new ways.

We will examine new ways to achieve news literacy, broadly defined, and investigate the roots of trust and mistrust in news.

And we will help convene meetings to look at solutions — no more whining about “fake news,” please.

We will work with organizations around the world; you can see a sampling of them in the release and we hope to work with many more: projects, universities, companies, and, of course, newsrooms everywhere.

We plan to be very focused on a few areas where we can have a measurable impact. That said, I hope we also pursue the high ambition to reinvent journalism for this new age.

But we’re not quite ready. This has all happened very quickly. We are about to start a search for a manager to run this effort with a small staff to help with information sharing and events. As soon as we begin to identify key areas, we will invite proposals. Watch this space.