My Facebook op-ed

Aftenposten asked me to adapt my Medium post about the Facebook napalm photo incident as an op-ed. Here it is in Norwegian. Here is the English text:


Facebook needs an editor — to stop Facebook from editing.

An editor might save Facebook from making embarrassing and offensive judgments about what will offend, such as its decision last week requiring writer Tom Egeland, Aftenposten editor Espen Egil Hansen, and then Norwegian Prime Minister Erna Solberg to take down a photo of great journalistic meaning and historic importance: Nick Ut’s image of Vietnamese girl Kim Phúc running from a 1972 napalm attack after tearing off her burning clothes. Only after Hansen wrote an eloquent, forceful, and front-page letter to Facebook founder Mark Zuckerberg did the service relent.

Facebook’s reflexive decision to take down the photo is a perfect example of what I would call algorithmic thinking, the mindset that dominates the kingdom that software built, Silicon Valley. Facebook’s technologists, from top down, want to formulate rules and then enable algorithms to enforce those rules. That’s not only efficient (who can afford the staff to make these decisions with more than a billion people posting every day?) but they also believe it’s fair, equally enforced for all. As they like to say in Silicon Valley, it scales.

The rule that informed the algorithm in this case was clear: If a photo portrays a child (check) who is naked (check) then the photo is rejected. The motive behind that rule could not be more virtuous: eliminating the distribution of child pornography. But in this case, of course, the naked girl did not constitute child pornography. No, the pornography here is a tool of war, which is what Ut’s photo so profoundly portrays.

Technology scales but life does not and that is a problem Facebook of all companies should recognize, for Facebook is the post-mass company. Mass media treat everyone the same because that’s what Gutenberg’s invention demands; the technology of printing scales by forcing media to publish the exact same product for thousands unto millions of readers. Facebook, on the other hand, does not treat us all alike. Like Google, it is a personal services company that gives every user a unique service, no two pages ever the same. The problem with algorithmic thinking, paradoxically, is that it continues the mass mindset, treating everyone who posts and what they post exactly the same, under a rule meant to govern every circumstance.

The solution to Facebook’s dilemma is to insert human judgment into its processes. Hansen is right that editors cannot live with Zuckerberg and company as master editor. Facebook would be wise to recognize this. It should treat editors of respected, quality news organizations differently and give them the license to make decisions. Facebook might want to consider giving editors an allocation of attention they can use to better inform their users. It should allow an editor of Hansen’s stature to violate a rule for a reason. I am not arguing for a class system, treating editors better than the masses. I am arguing only that recognizing signals of trust, authority, credibility, and quality will improve Facebook’s recommendations and service.

When there is disagreement, and there will be, Facebook needs a process in place — a person: an editor — who can negotiate on the company’s behalf. The outsider needn’t always win; this is still Facebook’s service, brand, and company and in the end it has the right to decide what it distributes just as much as Hansen has the right to decide what appears in these pages. That is not censorship; it is editing. But the outsider should at least be heard: in short, respected.

If Facebook would hire an editor, would that not be the definitive proof that Facebook is what my colleagues in media insist it is: media? We in media tend to look at the world, Godlike, in our own image. We see something that has text and images (we insist on calling that content) with advertising (we call that our revenue) and we say it is media, under the egocentric belief that everyone wants to be like us.

Mark Zuckerberg dissents. He says Facebook is not media. I agree with him. Facebook is something else, something new: a platform to connect people, anyone to anyone, so they may do what they want. The text and images we see on Facebook’s pages (though, of course, it’s really just one endless page) are not content. They are conversation. They are sharing. Content as media people think of it is allowed in but only as a tool, a token people use in their conversations. Media are guests there.

Every time we in media insist on squeezing Facebook into our institutional pigeonhole, we miss the trees for the forest: We don’t see that Facebook is a place for people — people we need to develop relationships with and learn to serve in new ways. That, I argue, is what will save journalism and media from extinction: getting to know the needs of people as individuals and members of communities and serving them with greater relevance and value as a result. Facebook could help us learn that.

An editor inside Facebook could explain Facebook’s worldview to journalists and explain journalism’s ethics, standards, and principles to Facebook’s engineers. For its part, Facebook still refuses to fully recognize the role it plays in helping to inform society and the responsibility — like it or not — that now rests on its shoulders. What are the principles under which Facebook operates? It is up to Mark Zuckerberg to decide those principles but an editor — and an advisory board of editors — could help inform his thinking. Does Facebook want to play its role in helping to better inform the public or just let the chips fall where they may (a question journalists also need to grapple with as we decide whether we measure our worth by our audience or by our impact)? Does Facebook want to enable smart people — not just editors but authors and prime ministers and citizens — to use its platform to make brave statements about justice? Does Facebook want to have a culture in which intelligence — human intelligence — wins over algorithms? I think it does.

So Facebook should build procedures and hire people who can help make that possible. An editor inside Facebook could sit at the table with the technologists, product, and PR people to set policies that will benefit the users and the company. An editor could help inform its products so that Facebook does a better job of enlightening its users, even fact-checking users when they are about to share the latest rumor or meme that has already been proven false through journalists’ fact-checking. An editor inside Facebook could help Facebook help journalism survive by informing the news industry’s strategy, teaching us how we must go to our readers rather than continuing to make our readers come to us.

But an editor inside Facebook should not hire journalists, create content, or build a newsroom. That would be a conflict of interest, not to mention a bad business decision. No, an editor inside Facebook would merely help make a better, smarter Facebook for us all.

Who should do that job? Based on his wise letter to Mark Zuckerberg, I nominate Mr. Hansen.

15 years later

Fifteen years later, the one odd vestige of that day that still affects me is that my emotions are left vulnerable. It reveals itself in the most ridiculous moments: an obvious tear-jerking moment in a movie, a TV show, someone talking. In these manipulative moments, my emotions are too easily manipulated. I can’t help but feel it well up. I realize what is happening and why and I tamp it back down. But this is how I am reminded when I least expect to be.

And then there are the photos I cannot bear to look at. The worst for me — I can barely type the words — is the falling man photo. It brings back the images I wrote about once in my news report of the events and have never spoken of again.

I haven’t yet been able to bear the idea of going to the 9/11 museum. I don’t much like going to the memorial, which is beautiful, yes, but it is a hole in our city and souls.

On this morning at this moment, as I type this, hearing the bell that marks the minute when the second plane hit the south tower brings back the feeling of the heat I felt on the other side of the impact and then I cry.

We said we would never forget. It is not easy to remember.

* * *

Here is the story I wrote for the Star-Ledger the afternoon of the attacks.

Here is my oral history of my experience on 9/11, recorded (badly) a few days after the event.

Here is a meditation I delivered on the yahrzeit of 9/11 in my church, when I read the Kaddish.

Here are the tweets I posted remembering each moment as it passed ten years later.

Dear Mark Zuckerberg

Dear Mark Zuckerberg

I’ve said it before and I’ll say it again: Facebook needs an editor — to stop Facebook from editing. It needs someone to save Facebook from itself by bringing principles to the discussion of rules.

There is actually nothing new in this latest episode: Facebook sends another takedown notice over a picture with nudity. What is new is that Facebook wants to take down an iconic photo of great journalistic meaning and historic importance and that Facebook did this to a leading editor, Espen Egil Hansen, editor-in-chief of Aftenposten, who answered forcefully:

The media have a responsibility to consider publication in every single case. This may be a heavy responsibility. Each editor must weigh the pros and cons. This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California…. Editors cannot live with you, Mark, as a master editor.

Facebook has found itself — or put itself — in other tight spots lately, most recently the trending topics mess, in which it hired and then fired human editors to fix a screwy product.

In each case, my friends in media point their fingers, saying that Facebook is media and thus needs to operate under media’s rules, which my media friends help set. Mark Zuckerberg says Facebook is not media.

On this point, I will agree with Zuckerberg (though this isn’t going to get him off the hook). As I’ve said before, we in media tend to look at the world, Godlike, in our own image. We see something that has text and images (we insist on calling that content) with advertising (we call that our revenue) and we say it is media, under the egocentric belief that everyone wants to be like us.

No, Facebook is something else, something new: a platform to connect people, anyone to anyone, so they may do whatever they want. The text and images we see on Facebook’s pages (though, of course, it’s really just one endless page, a different page for every single user) are not content. They are conversation. They are sharing. Content as we media people think of it is allowed in but only as a tool, a token people use in their conversations. We are guests there.

Every time we in media insist on squeezing Facebook into our institutional pigeonhole, we miss the trees for the forest: We miss understanding that Facebook is a place for people, people we need to develop relationships with and learn to serve in new ways. It’s not a place for content.

For its part, Facebook still refuses to acknowledge the role it has in helping to inform society and the responsibility — like it or not — that now rests on its shoulders. I’ve written about that here and so I’ll spare you the big picture again. Instead, in these two cases, I’ll try to illustrate how an editor — an executive with an editorial worldview — could help advise the company: its principles, its processes, its relationships, and its technology.

The problem at work here is algorithmic thinking. Facebook’s technologists, top down, want to formulate a rule and then enable an algorithm to enforce that rule. That’s not only efficient (who needs editors and customer-service people?) but they also believe it’s fair, equally enforced for all. It scales.

Except life doesn’t scale and that’s a problem Facebook of all companies should recognize as it is the post-mass-media company, the company that does not treat us all alike; like Google, it is a personal-services company that gives every user a unique service and experience. The problem with algorithmic thinking, paradoxically, is that it continues a mass mindset.

In the case of Aftenposten and the Vietnam napalm photo, Hansen is quite right that editors cannot live with Mark et al. as master editor. Facebook would be wise to recognize this. It should treat editors of respected, quality news organizations differently and give them the license to make decisions. Here I argued that Facebook might want to consider giving editors an allocation of attention they can use to better inform their users. In this current case, the editor can decide to post something that might violate a rule for a reason; that’s what editors do. I’m not arguing for a class system, treating editors better. I’m arguing that recognizing signals of trust, authority, and credibility will improve Facebook’s recommendations and service. (As a search company, Google understands those signals better and this is the basis of the Trust Project Google is helping support.)

When there is disagreement, and there will be, Facebook needs a process in place — a person: an editor — who can negotiate on the company’s behalf. The outside editor needn’t always win; this is still Facebook’s service, brand, and company. But the outside editor should be heard: in short, respected.

These decisions are being made now on two levels: The rule in the algorithm spots a picture of a naked person (check) who is a child (check!) and kills it (because naked child equals child porn). The rule can’t know better. The algorithm should be aiding a human court of appeal that understands when the rule is wrong. On the second level, the rule is informed by the company’s brand protection: “We can’t ever allow a naked child to appear here.” We all get that. But there is a third level Facebook must have in house, another voice at the table when technology, PR, and product come together: a voice of principle.
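To make those levels concrete, here is a minimal sketch of the difference between a blanket rule and one that routes the hard cases to a human court of appeal. It is purely illustrative and assumes nothing about Facebook’s actual systems; every name and field in it is my own invention.

    from dataclasses import dataclass

    # A hypothetical sketch, not Facebook's code. It contrasts a blanket rule
    # with one that escalates likely exceptions to human review.

    @dataclass
    class Photo:
        depicts_child: bool
        depicts_nudity: bool
        flagged_newsworthy: bool = False

    @dataclass
    class Poster:
        is_trusted_editor: bool = False

    def blanket_rule(photo: Photo) -> str:
        # Algorithmic thinking: one rule meant to govern every circumstance.
        if photo.depicts_child and photo.depicts_nudity:
            return "remove"
        return "allow"

    def rule_with_appeal(photo: Photo, poster: Poster) -> str:
        # The same rule, but signals of trust and newsworthiness send the
        # exceptions to a human court of appeal instead of deleting them.
        if photo.depicts_child and photo.depicts_nudity:
            if poster.is_trusted_editor or photo.flagged_newsworthy:
                return "escalate_to_human_review"
            return "remove"
        return "allow"

    if __name__ == "__main__":
        napalm_photo = Photo(depicts_child=True, depicts_nudity=True,
                             flagged_newsworthy=True)
        aftenposten = Poster(is_trusted_editor=True)
        print(blanket_rule(napalm_photo))                   # remove
        print(rule_with_appeal(napalm_photo, aftenposten))  # escalate_to_human_review

The point of the second version is not the code; it is that someone at the table has to decide which signals earn an appeal and who hears it.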

What are the principles under which Facebook operates? Facebook should decide but an editor — and an advisory board of editors — could help inform those principles. Does Facebook want to play its role in helping to better inform the public or just let the chips fall where they may (something journalists also need to grapple with)? Does it want to enable smart people — not just editors — to make brave statements about justice? Does it want to have a culture in which intelligence — human intelligence — rules? I think it does. So build procedures and hire people who can help make that possible.

Now to the other case, trending topics. You and Facebook might remind me that here Facebook did hire people and that didn’t help; it got the company into hot water when those human beings were accused of having human biases and the world was shocked!

Here the problem is not the algorithm; it is the fundamental conception of the Trending product. It sucks. It spits out crap. An algorithmist might argue that’s the public’s fault: we read crap so it gives us crap — garbage people in, garbage links out. First, just because we read it doesn’t mean we agree with it; we could be discussing what crap it is. Second, the world is filled with a constant share of idiots, bozos, and trolls, and a bad algorithm listens to them; these dogs of hell know how to game the algorithm to have more influence on it. But third — the important part — if Facebook is going to recommend links, which Trending does, it should take care to recommend good links. If its algorithm can’t figure out how to do that, then kill it. This is a simple matter of quality control. Editors can sometimes help with that, too.

Apology to Mexico

I’m honored that my friends at El Universal in Mexico City published a brief opinion piece I wrote for them apologizing to Mexicans for sending them Donald Trump.

Here’s the English text:


To my Mexican friends,

I am sorry as an American that we have sent you Donald Trump. Please know that in the end, he speaks for few Americans — too few, God willing, for him to be elected our President. He is merely an aberration of the moment, a fluke, a freak, a phenomenon we can only hope will never be repeated. But in the meantime, your president invites him and you must suffer his company. I apologize.

The blame for Trump rests on many shoulders. There is, of course, Trump’s adopted political party, the Republicans, who for years have tried to reduce government by blocking its legitimate work. They have become the party of anger, finding scapegoats for every problem — most of all, President Obama but also strangers, namely immigrants and Muslims. They became the party of pessimism, declaring that America is falling into deep decline, even as the Obama Administration made great progress in fixing the problems it inherited: the economy, jobs, and wars, most notably. Thus the Republicans created a breeding ground for Trump, someone who would harness the emotions of a certain slice of America.

News media deserve a large share of the blame for Trump. First, they treated him as a carnival attraction, a funny clown who would attract audiences to their networks and pages. The heads of CNN and CBS rubbed their hands in greedy glee at how good Trump was for their businesses, which are still built on attracting masses with show business, rather than serving citizens with reliable information. My journalistic colleagues didn’t see the danger ahead and so they didn’t warn the public until it was too late, until Trump stood a step from the White House. Media have become his willing accomplices, treating his offensive and insane pronouncements — for example, that a wall blocking Mexico will solve our problems, that Hillary Clinton is a bigot — as serious topics that should be discussed for hours on end rather than disproven, ridiculed, and dismissed with facts and reason.

Journalism also failed badly at reflecting the concerns and problems of Trump’s core: underemployed, angry white men from the center of the nation. If media had done a better job of reporting — and then informing — their worldviews, I wonder whether Trump and his promoters would have found fertile soil for their divisiveness, fear, ignorance, and bigotry. If my party, the Democrats, had done a better job of hearing and addressing their concerns, could they also have blunted Trump’s appeal?

I believe we are seeing the last gasp of the myth of the American melting pot. When I grew up, we were taught to believe in assimilation: that every American would end up sounding if not looking alike. That is the presumption of the mass (though I believe that in the suffering of publishing and broadcasting in the internet age, we are witnessing the death of the mass-media business model and will also witness the end of the idea of the mass). Rule by the majority looks good when the majority looks like you; what Trump’s troops fear is they will soon be in the minority.

Today, living in New York and teaching at its City University, which values diversity, I have learned instead how much richer America is for the many distinct identities and backgrounds that make up this nation. We are, of course, better because Mexican Americans have brought their culture, worldview, heritage, and language to the United States. We are better for having doors, not walls. Though today, many of you might wish you had a wall to keep Trump out.

Americans — myself included — still struggle to learn the lesson of diversity, to see the value that Mexicans, Latin Americans of many nations, and people from all around the world bring to our culture, economy, language, and daily life. In that sense, Trump is the fault of all of us, for we have not quickly enough embraced the people we thought of as strangers.


Apology to Germany

For the record. I did not insult Germans about VR. I was honored that Die Welt asked me to write about VR for a special they were doing. The lede gained something in the translation. I wrote:

Virtual reality will not change the world. But it might help change how we see it.

This was replaced by this subhed:

Deutsche Verbraucher sind laut Umfragen besonders skeptisch, wenn es um virtuelle Eindrücke geht. Liegt das etwa an der Nazi-Zeit? Oder daran, dass schon der Begriff Virtual Reality in die Irre führt?

Which means:

German consumers are, according to surveys, particularly skeptical when it comes to virtual reality. Does that have something to do with the Nazi era? Or is it that the term virtual reality is misleading?

I have been critical of Germany’s overreaction, in my view, about American technology companies and copyright and privacy. But I purposely did not want to make this another German #technophobia story. Lower down in the piece, I raised the question and cited a few oddities — like the philosopher who found Nazi ideology in Pokémon Go (!) — but said that VR is sweeping Germany as elsewhere. And note that I pinned those oddities on German media.

Not a big deal. But I wanted to be clear, for the record. Here, by the way, is the English text (with German quotes still in German so as not to double-translate):


Virtual reality will not change the world. But it might help change how we see it.

Thanks to the internet, we are coming to the end of the Gutenberg Age. His era — not quite six centuries long — was ruled by text: content that filled the containers we call books, magazines, and newspapers. Now the information and entertainment that media provided are available in so many more forms: as databases, applications, visualizations, bot chats, videos, podcasts, memes, online conversations, social connections, education, and so on. Text is not dead. It just has a lot of new company.

Are we also leaving the Kodak Age, thanks to the advent of virtual reality? The printed photograph — like the movie and TV screens that followed — was bound by its two dimensions. But now images are freed to expand past those borders.

“VR” is being used, incorrectly, to include everything that breaks out of film photography’s flat Weltanschauung: 360-degree (and panoramic) photography, 360-degree video, augmented reality, light-field photography, and virtual reality itself (that is, a computer-generated, interactive representation of an environment).

At the City University of New York Graduate School of Journalism, where I teach, we believe we need to start our students not with VR but with 360-degree photography and video. We will push them to think and see outside the single path between the lens and the subject: straight-on, static, one-way. We want our students to ask when it could be useful for the public to see what is happening to either side or even behind them. How does that peripheral view impart added information or perspective?

As with every shiny new gadget that tempts us media folk, 360-degree media are being misused. There is no point in bringing a 360-degree camera to an interview, for when do you want to turn around and look the other way when talking with a person? Augmented reality is being used to make two-dimensional printed pages look three-dimensional; I frankly don’t see much point.

The first good and obvious use of 360-degree media is to put the viewer in the middle of a scene. Recently, news outlets used 360-degree video to put viewers in the middle of the balloon drop at the end of each American political convention or in an Olympic arena in Rio. They have used these cameras to give us a daredevil’s you-are-there perspective. All that is fine. But once you’ve seen one dangerous fall off a cliff, haven’t you seen them all?

I hear much talk that VR brings empathy to media, putting the viewer in the body of a story’s subject to enhance the viewer’s understanding. True. The Guardian took viewers into a six-by-nine-foot solitary confinement prison cell, a frightening experience. Bild took users to a battle in Iraq. I’ve stood in a virtual setting in which an angry man was pointing a gun at a woman just the other side of me; it is unnerving. I’ve even heard the empathy argument used to justify VR porn.

Making 360-degree video requires much expertise and expense: Multiple cameras sit in tricky rigs that can warp with heat and ruin the end result. Complex software is used to stitch all this video into one scene or to animate action. Virtual reality requires even more difficult software. And watching VR is still a hassle: donning a cheap or expensive headset and looking like a fool while avoiding puking. Since the equipment is so expensive and difficult, I wonder whether we’ll soon see VR cafés just as, not long ago, we went to internet cafés to get online.

All that is why I am more enthused about using relatively inexpensive 360-degree cameras like the Samsung Gear 360 or Ricoh Theta S (or shooting panoramas on a phone). The best way to reach an audience of scale today is to post 360-degree photos on Facebook or video on YouTube.

The shiny new panoramic camera that has me most excited these days doesn’t even shoot 360 degrees around, only 150 degrees. The Mevo video camera captures a wider angle than regular video cameras, which simulates having multiple cameras as in a TV studio. It is controlled entirely on an iPhone or iPad: Click on someone’s face and that’s the closeup; move a box on the screen to shift the closeup. It’s the first camera written to Facebook Live standards. I’ve been using it to make my own podcasts. When I showed this small, $400 device to a newspaper owner, he ordered his staff to stop building their TV studio and control room.

Pokémon Go got me jazzed anew about the opportunities of augmented reality or AR. Years ago, a Dutch company called Layar showed the possibilities of adding information to what the camera on a phone saw (“This restaurant has great steaks” or “George Washington slept there”) but they were early. Now Pokémon shows how we could augment what a user sees in public with history or news about a location, restaurant reviews, ads and bargains, or annotations left by other users.

Ah, but leave it to German media to worry about the implications of a new technology and its application. In Bild, Franz Josef Wagner complained: “Aber die Nerds, die Millionen Pokémon-Süchtigen, sollten nicht nach Monstern suchen. Sie sollten die Wirklichkeit suchen.”

More amazingly (or amusingly), in Die Zeit, philosopher Slavoj Žižek discerned Nazi philosophy in Pokémon Go: “Und hat Hitler den Deutschen nicht das Fantasiebild seiner nationalsozialistischen Ideologie beschert, durch dessen Raster sie überall ein besonderes Pokémon – ‘den Juden’ – auftauchen sahen, das sie mit einer Antwort auf die Frage versorgte, wogegen man zu kämpfen habe?”

VR wariness is not just German media’s fault. An international survey by the firm GfK found German consumers the most skeptical about the value of a virtual experience. Nonetheless, especially in gaming, VR is also storming Germany.

Yes, there are issues to be grappled with in VR and its related technologies: When you shoot 360-degree photos and video, do the people behind the camera realize they are being captured? We have already seen that people watching virtual reality experiences have a heightened sense of reality. But I don’t buy the fear that people will withdraw into their VR headsets and experiences; it’s just another way to look at images.

VR et al might just give us another way to experience what other people experience — that is why both Facebook and Google are investing heavily in the medium, so we all can more fully share our lives. But fear not: just as text lives on after the Gutenberg age, reality will still exist after virtual reality.