Posts about publicparts

Tech companies: Whose side are you on?

I wrote this for the Guardian. I’m crossposting it here for my archive. The post is all the more relevant a day later, as Google, Apple, AT&T, and Public Knowledge attend a secret White House meeting about secrecy. I’d have a lot more respect for them if they had refused, given that condition.

Technology companies: Now is the moment when you must answer for us, your users, whether you are collaborators in the U.S. government’s efforts to collect it all — our every move on the internet — or whether you, too, are victims of its overreach.

Every company named in Edward Snowden’s revelations has said that it must comply with government demands, including requirements to keep secret court orders secret. True enough. But there’s only so long they can hide behind that cloak before making it clear whether they are resisting government’s demands or aiding in them. And now the time has come to go farther: to use both technology and political capital to actively protect the public’s privacy. Who will do that?

We now know, thanks to Snowden, of at least three tiers of technology companies enmeshed in the NSA’s hoovering of our net activity (we don’t yet know whether the NSA has co-opted companies from the financial, retail, data services, and other industries):

(1) Internet platforms that provide services directly to consumers, allowing government to demand access to signals about us: Google with search, mail, calendars, maps; Facebook with connections; Skype with conversations, and so on.

In its first Prism reporting, the Washington Post apparently unfairly fingered nine of these companies, accusing the NSA and FBI of “tapping directly into the central servers” that hold our “chats, photographs, e-mails, documents, and connection logs.” Quickly, the companies repudiated that claim and sought the right to report at least how many secret demands are made. But there’s more they can and should do.

(2) Communications brands with consumer relationships that hand over metadata and/or open taps on internet traffic for collection by the NSA and Britain’s GCHQ, creating vast databases that can then be searched via XKeyscore. Verizon leads that list, and we now know from the Süddeutsche Zeitung that it also includes BT and Vodafone.

(3) Bandwidth providers that enable the NSA and its international partners to snoop on the net, wholesale. The Süddeutsche lists the three telco brands above in addition to Level 3, Global Crossing, Viatel, and Interroute. Eric King, head of research for Privacy International, asked in the Guardian, “Were the companies strong-armed, or are they voluntary intercept partners?”

The bulk data carriers have no consumer brands or relationships and thus are probably the least likely to feel commercial pressure to protect the rights of the users at the edge. The telephone companies should care more but they operate as oligopolies with monopoly attitudes and rarely exhibit consumer empathy (which is a nice way of saying their business models are built on customer imprisonment).

A hodgepodge alliance of U.S. legislators is finally waking up to the need and opportunity to stand up for citizens’ rights, but they will be slow and, as we well know, ineffective and often uninformed. The courts will be slower and jealous of their power. Diplomacy is the slowest route to reform of all, dealing in meaningless symbolism.

So our strongest expectations must turn to the first tier above, the consumer internet platforms. They have the most to lose — in trust and thus value — in taking government’s side against us.

At the Guardian Activate conference in London last month, I asked Vint Cerf, an architect of the net and evangelist for Google, about encrypting our communication as a defense against NSA spying. He suggested that communication should be encrypted into and out of internet companies’ servers (thwarting, or so we’d hope, the eavesdropping on the net’s every bit over telcos’ fibre) but should be decrypted inside the companies’ servers so they could bring us added value based on the content: a boarding pass on our phone, a reminder from our calendar, an alert about a story we’re following (not to mention a targeted ad).

Now there are reports that Google is looking at encrypting at least documents stored in Google Drive. That is wise in any case, as often these can contain users’ sensitive company and personal information. I now think Google et al need to go farther and make encryption an option on any information. I don’t want encryption to be the default because, in truth, most of my digital life is banal and I’d like to keep getting those handy calendar reminders. But technology companies need to put the option and power of data security directly into users’ hands.

That also means that the technology companies have to reach out and work with each other to enable encryption and other protections across their services. I learned the hard way how difficult it is to get simple answers to questions about how to encrypt email. The industry should work hard to make that an option on every popular service.
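To make the idea concrete, here is a minimal sketch of what putting "the option and power of data security directly into users' hands" means in practice: the data is encrypted on the user's device before it is ever uploaded, so the provider stores only ciphertext and cannot read it, even under order. This is a toy stream cipher built from a hash function, for illustration only; any real implementation would use a vetted library and an authenticated cipher such as AES-GCM.

```python
import hashlib
from secrets import token_bytes

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom byte stream from the key by hashing
    # the key together with an incrementing counter (toy CTR mode).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream; the same call both
    # encrypts and decrypts, since XOR is its own inverse.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = token_bytes(32)                  # stays on the user's device
message = b"a private calendar entry"
ciphertext = xor_crypt(key, message)   # all the provider would ever store
assert xor_crypt(key, ciphertext) == message  # only the key holder can read it
```

The design point is who holds the key: in Cerf's model above, the company decrypts server-side to add value; in this model, decryption is possible only on the user's device.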

But let’s be clear that encryption is not the solution, probably only a speed bump to the NSA’s omnivorous ingesting. At the Activate conference, Cerf was asked whether the solution in the end will be technical or institutional. No doubt, institutional, he answered. That means that companies and government agencies must operate under stated principles and clear laws with open oversight.

Before Snowden’s leaks, technology CEOs would have had to balance cooperation and resistance just as the nation supposedly balances security and privacy. But now the tide of public opinion has clearly shifted — at least for now — and so this is the moment to grab control of the issue.

If they do not assert that clear control, these technology companies risk losing business not only from skittish consumers but also from corporate and foreign-government clients. The Cloud Security Alliance polled companies and found that 10% had canceled U.S. cloud business and 56% were less likely to do business with U.S. providers. “If businesses or governments think they might be spied on,” said European Commission Vice President Neelie Kroes, “they will have less reason to trust the cloud, and it will be cloud providers who ultimately miss out.”

Besides taking action to secure technology and oversight within their companies and the industry, right-thinking technology companies also need to band together to use their political capital to lobby governments across the world to protect the rights of users and the freedom and sanctity of privacy and speech on the net. They must take bold and open stands.

To do that, they must first decide on the principles they should protect. In my book Public Parts, I proposed some principles to discuss, among them:
* the idea that if any bit on the net is stopped or detoured — or spied upon — then no bit, and the net itself, can be presumed to be free;
* that the net must remain open and distributed, commandeered and corrupted by no government;
* that citizens have a right to speak, assemble, and act online and thus have a right to connect without fear;
* that privacy is an ethic of knowing someone else’s information and coming by it openly;
* and that government must become transparent by default and secret by necessity (there are necessary secrets). Edward Snowden has shown us all too clearly that the opposite is now true.

I also believe that we must see a discussion of principles and ethics from the technologists inside these companies. One reason I have given Google the benefit of the doubt — besides being an admirer — is that I believe the engineers I know inside Google would not stay if they saw it violating their ethics even if under government order.

Yonatan Zunger, the chief architect of Google+, said this after the Guardian’s and Glenn Greenwald’s first revelations were published:

I can tell you that it is a point of pride, both for the company and for many of us, personally, that we stand up to governments that demand people’s information…. I can categorically state that nothing resembling the mass surveillance of individuals by governments within our systems has ever crossed my plate. If it had, even if I couldn’t talk about it, in all likelihood I would no longer be working at Google.

In the end, it’s neither technologies nor institutions that will secure us from the inexorable overreach of government curiosity in the face of technical capability. Responsibility for oversight and correction begins with individuals, whether whistleblowers or renegade politicians or employees of conscience who finally remind those in power: “Don’t be evil.”

Matters of principle

America is supposed to be a nation governed by principles, which are undergirded by the Constitution and the Bill of Rights and carried into law. The discussion about the government and its capture of *our* data should be held on the level of principles.

* Privacy: Our direct and personal communication in any medium and by any means — mail, email, phone, VOIP, Twitter DM, and any technology yet to be invented — should be considered private, as our physical mail is, and subject to government intervention only through lawful warrant. That is not the case. Thus it is quite reasonable to be disturbed at the news that government can demand and receive communication we believe to be private. Government may call itself the protector of our privacy but it is our privacy’s worst enemy.

* Transparency: The actions of government should be known to citizens. I argue in Public Parts that our institutions should be public by default, secret by necessity; now they are secret by default and open by force. There are necessary secrets. There is a need for intelligence. There I agree with David Simon. I saw people die before me on 9/11 and I fault intelligence for not stopping it.

But we are left out of the discussion of where the line of necessity should be drawn. If President Obama believes in the transparency he talks about, and if he now says he welcomes the debate about security and freedom, then that debate should have occurred *before* government took the actions now being reported, not by force through leaks. There I agree with James Fallows that this leak is not harmful — what bad guys didn’t already realize that their phones could be tracked? — and will be beneficial for democracy.

* Balance of powers: The best protection of our nation’s principles is the balance of powers. Yes, Congress passed the Patriot Act and yes, a FISA court does approve the executive branch’s actions. But both our representatives and our justices are prevented from sharing anything with us, as are the companies that are forced to be their accomplices. The true balance of powers is the exercise of democracy by citizens, but without information we have no power and government has it all.

* Freedom of speech and of the press: Information comes to the public from the press, which is now anyone with information to share. And citizens exercise power through speech. But in its jihad against leaks… that is whistleblowers… that is reporting… that is journalism and the public’s right to know, the White House is chilling both the press and speech. I pray that Glenn Greenwald doesn’t have a Verizon phone.

This discussion is less about privacy and more about transparency and speech. The principles most offended here are those embedded in the First Amendment, for those are the principles we rely upon to take part in the debate that is democracy.

I am asking for government to behave according to principles. I am also asking companies to do so. Twitter — whose behavior toward developers and users can sometimes mystify me — is apparently the platform most stalwart in standing for its users’ rights as a matter of principle; it apparently refused to make it easier for government to get data. Now one could argue that helping government thwart terrorists is also behaving according to principle. But again, we and these companies aren’t allowed to have that debate. So I’d now advise following what is apparently Twitter’s route of responding to demands and nothing more. And I’d advise following Google’s example in revealing government demands for information (though under FISA, once again, they’re not allowed to reveal them all — even by a count).

There is much debate and sometimes conspiracy theorizing swirling around about what Google, Facebook, et al did and didn’t provide to government. I take Larry Page’s and Mark Zuckerberg’s statements at their literal word and agree with Declan McCullagh that I so far see no evidence that these companies handed the keys to their servers to the NSA. We know and they have long said that they comply with government orders, whether in the U.S. or China.

Though some are attacking him on this issue and though I often disagree with him on the state of the news business, I again say that I agree with David Simon on the unsophisticated and emotional interpretation of this news. Since the initial New York Times report on NSA “warrantless wiretapping,” I have understood that one of government’s goals is to use data to find anomalies, but to do that it has to have a baseline of normal behavior. We’re the normal. This has been going on for some time, as Simon says; we just haven’t known how.

Are we as a nation OK with allowing government to make such an analysis to find the terrorists’ anomalous behavior or not? That’s a discussion that should occur according to principles, properly informed about the risks and benefits. Are we OK with government using that same data to fish for other crimes — like, say, leaking a PowerPoint to the Guardian? I am not. Are we OK with government treating whistleblowers and leakers as traitors — starting with Bradley Manning? I am not. I agree with Bruce Schneier: “We need whistleblowers.” Are we OK with government having access to our private communications without warrants? I say: most definitely not, as a matter of principle.

Under a regime of secrecy, assuming the worst becomes the default in the discussion. We assume the worst of government because they keep from us even activities they say are harmless and beneficial. We see people who want to be suspicious of technology and technology companies assuming the worst of them because, after all, we can’t know precisely what they are doing. I agree with Farhad Manjoo about the danger. People in other nations — I’m looking at you, EU — already distrust both the American government and American technology companies, often in the past for emotional reasons or with anti-American roots but now with more cause. You can bet we’ll hear governments across Europe and elsewhere push harder for legislation now in process to require that their citizens’ data be held outside the U.S. and to European standards because, well, they assume the worst. We’ll hear calls to boycott American-made platforms because — even if they try not to go along — their acquiescence to our government means they cannot be trusted. This is bad for the net and bad for the country. The fault lies with government.

This is a story about transparency and the lack of it. It is a story about secrecy and its damages. It is a story about principles that are being flouted. It should be a discussion about upholding principles.

NY Times technobias

From the headline to the lede to the chosen sources to the writing to the page-one placement, today’s New York Times coverage of Google’s $7 million settlement for the drive-by capture of wifi data is one-sided, shallow, and technopanicky.

First, let’s remind ourselves of the facts. Google’s Street View cars captured wifi addresses as they drove by as a way to provide better geolocation on our phones (this is why your phone suggests you turn on wifi when using maps — so you can take advantage of the directory of wifi addresses and physical addresses that Google and other companies keep). Stupidly and for no good reason, the cars also recorded other data passing on *open* wifi networks. But that data was incredibly limited: just what was transmitted in the random few seconds in which the Google car happened to pass once by an address. There is no possible commercial use, no rationally imagined nefarious motive, no goldmine of Big Data to be had. Nonetheless, privacy’s industrial-regulator complex jumped into action to try to exploit the incident. But even Germany — the rabid dog of privacy protectors — dropped the case. And the U.S. case got pocket lint from Google.
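The geolocation mechanism described above can be sketched in a few lines: a directory maps wifi access-point hardware IDs (BSSIDs) to known coordinates, and a phone that sees several of those access points can estimate its own position from the matches. This is a toy illustration with made-up BSSIDs and coordinates; real services use enormous directories and weight by signal strength.

```python
# Hypothetical directory of wifi access points (BSSID -> lat/lon),
# of the kind built by driving past and recording beacon addresses.
wifi_directory = {
    "aa:bb:cc:00:00:01": (40.7128, -74.0060),  # observed near one address
    "aa:bb:cc:00:00:02": (40.7130, -74.0055),  # observed near a neighbor
}

def estimate_location(visible_bssids):
    # Average the known coordinates of whatever access points
    # the device can currently see; None if none are known.
    points = [wifi_directory[b] for b in visible_bssids if b in wifi_directory]
    if not points:
        return None
    lat = sum(p[0] for p in points) / len(points)
    lon = sum(p[1] for p in points) / len(points)
    return (lat, lon)

# A device seeing both access points lands roughly between them.
estimate_location(["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"])
```

Note what the lookup needs: only the access points’ broadcast addresses, not any of the traffic passing over them — which is why recording that traffic served no purpose.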

But that didn’t stop The Times from overplaying the story. Neither did it stop a CNN producer from calling me to try to whip up another technopanic story about privacy; I refused. I won’t pay into the panic.

Let’s dissect the Times story from the headline down:

* The Times calls what Google did “prying.” That implies an “improper curiosity” and an intentionality, as if Google were trying to open our drawers and find something there. It’s a loaded word.

* The lede by David Streitfeld says Google “casually scooped up passwords, e-mail and other personal information from unsuspecting computer users.” Later in the story, he says: “For several years, the company also secretly collected personal information — e-mail, medical and financial records, passwords — as it cruised by. It was data-scooping from millions of unencrypted wireless networks.”

The cars recorded whatever data was passing on these — again — *open* and *public* networks, which can be easily closed. Google was obviously not trying to vacuum up passwords. To say “unsuspecting computer users” is again loaded, as if these were victims. And to list particularly medical and financial records and not mention bits employed in playing Farmville is loaded as well.

* Here’s the worst of it: Streitfeld says unnamed “privacy advocates and Google critics characterized the overall agreement as a breakthrough for a company they say has become a serial violator of privacy.” A “serial violator of privacy”? Really? Where’s the link to this long and damning rap sheet? Facebook, maybe. But I doubt even Google’s vocal and reasonable critics would characterize the company this way. If Streitfeld found someone who said that, it should be in quotes and attributed to someone, or else he and the paper are the ones issuing this judgment.

* If anyone would say such a thing, it would certainly be the people Streitfeld did quote in the story, for he sought out only the worst of the company’s critics, including Scott Cleland, “a consultant for Google’s competitors” [cough] and Marc Rotenberg, self-styled protector of privacy at the so-called Electronic Privacy Information Center. Streitfeld also went to the attorneys general and a former FTC bureaucrat who went after Google. Nowhere in this story is there any sense of another side, let alone of context and perspective. That’s just not good reporting.

I have made it clear that I’m generally a fan of Google; I wrote a book about that. Nonetheless, I have frequently called Google’s recording of this data as its cars passed by — and this is my technical term — a fuckup. It was stupid. It was damaging to Google’s reputation. It played into the hands of the critics. That’s what I can’t stand.

I’m tired of media’s and governments’ attempts to raise undue panic about technology. Look at the silly, preemptive, and panicky coverage of Google Glass before the product is even out. A Seattle dive bar said it would ban Glass and media picked it up all over (8,000+ references at last check on Google News) — though the bar admitted, as any fool could see, that it was just a publicity stunt.

There are plenty of serious issues to discuss about protecting privacy and there is certainly a need to educate people about how to protect their privacy. But this simplistic, biased, anti-technology, panicked coverage does neither. I might expect this of other outlets. But I’m sad to see The Times join in.

Note that as part of its settlement, Google will educate people to close their open wifi networks. The Times found someone to ridicule even that when its ink would have been better put to telling people how to close their networks.

See also Phillip Dampier on the topic.

I see you: The technopanic over Google Glass

Google Glass isn’t available yet. Even so, the technopanic it’s inspiring is rising to full swivet. But I say there’s no need to panic. We’ll figure it out, just as we have with many technologies—from camera to cameraphone—that came before.

The greatest compilation of worries to date comes from Mark Hurst, who frets: “The most important Google Glass experience is not the user experience— it’s the experience of everyone else. The experience of being a citizen, in public, is about to change.” [His typography]

This is the fear we hear most: That someone wearing Glass will record you—because they can now—and you won’t know it. But isn’t that what we heard when cell phones added cameras? See The New York Times from a decade ago about Chicago Alderman Edward Burke:

But what Mr. Burke saw was the peril.
“If I’m in a locker room changing clothes,” he said, “there shouldn’t be some pervert taking photos of me that could wind up on the Internet.”
Accordingly, as early as Dec. 17, the Chicago City Council is to vote on a proposal by Mr. Burke to ban the use of camera phones in public bathrooms, locker rooms and showers.
His fear didn’t materialize. Why? Because we’re civilized. We’re not as rude and stupid—as perverted—as our representative, Mr. Burke, presumed us to be.

How will we deal with the Glass problem? I’ll bet that people wearing Glass will learn not to shoot those around them without asking or they’ll get in trouble; they’ll be scolded or shunned or sued, which is how we negotiate norms. I’d also bet that Google will end up adding a red light—the universal symbol for “You’re on!”—to Glass. And folks around Glass users will hear them shout instructions to their machines, like dorks, saying: “OK, Glass: Record video.”

That concern raised, Hurst escalates to the next: that pictures and video of you could be uploaded to Google’s servers, where they could be combined with facial recognition and the vastness of data about you. Facebook can’t wait to exploit this, he warns. But this is happening already. Every photo on my phone is automatically uploaded to Google; others do likewise to Facebook, each of which has facial recognition and information about us. Hurst acknowledges that we’re all recorded all day in public—remember: it is public—by security cameras. But the difference here, he argues, is that this data is held by companies. Big companies + Big Data = Big problems, right? That’s the alarm Siva Vaidhyanathan raises.

But what’s to investigate? Should governments have investigated Kodak cameras when they came out? Well, Teddy Roosevelt did briefly ban cameras in Washington parks. In 2010, Germany’s minister of consumer protection, Ilse Aigner, decreed that tying facial recognition to geolocation would be “taboo”—though one could certainly imagine such a combination being useful in, for example, finding missing children. To ban or limit a technology before it is even implemented and understood is the definition of short-sighted.

Hurst also fears that the fuzz and the Feds could get all this data about us, these days even without warrants. I fear that, too—greatly. But the solution isn’t to limit the power of technology but to limit the power of government. That we can’t is an indication of a much bigger problem than cameras at our eyelids.

I agree with Hurst that this is worth discussing and anticipating problems to solve them. But let us also discuss the benefits alongside the perils, weighing the change we welcome against the change we fear—the ability to get relevant information and alerts constantly, the chance to capture an otherwise-lost moment with a baby, another way to augment our own memories, and other opportunities not yet imagined. Otherwise, if we manage only to our fears, only to the worst case, then we won’t get the best case. And let’s please start here: We are not uncivilized perverts.

Yes, I’m dying to get a Google Glass and get my head around it and vice versa. But rest assured, I will ask you whether it’s OK to take a picture of you in private—just as I ask whether it’s OK to take or share your picture now or to tweet or blog something you say to me. We figured all that out. We will figure this out. We have before. No need to technopanic.


Clippings from The New York Times

Cross-posted from Medium.

LATER: A good post from Jürgen Geuter raises the point I also wrote about in Public Parts: let’s concentrate on the use of data rather than its gathering; if we regulate the gathering, we regulate what we’re allowed to know.

Roll over, Gutenberg

Germany, I fear, is not the land of innovation. It is a land of institutions.

This week the German Bundestag passed a law created by publishers — primarily Axel Springer and Burda — to force internet companies — read: Google — to pay for quoting — and thus promoting and linking to — their content. The legislation, the Leistungsschutzrecht, was known as the Google tax.

In the end, compromise legislation exempts precisely what the publishers had been going after: snippets of text of the sort that search engines quote. The bill now generously says that single words or very few words — it is not precise in its definition — remain free. But of course that exception only proves the absurdity of the effort: Who could ever own a word or a phrase? Or a thought?

So now, if the bill passes the next house of the legislature, lawyers will make a fortune debating how short is too long. No matter the length, speech suffers. Don’t the publishers see that they live by the quote? Their content is made up of what other people say. Their content gains influence when other people quote it.

But that is beside their point. They want to tax Google. They say it is not fair — imagine a kindergartener stomping his little feet — that Google makes money as they lose money. They think they deserve a share, though the truth is that their content makes up very little of what people search for. And, besides, every time Google links to them it is up to the publishers to establish a relationship with that user and find value in it. That publishers have failed to do this almost two decades into the web era is not Google’s fault; it is their fault. Rather than innovating and finding the necessary opportunity in their disruption, these publishers — conservatives who otherwise would diminish government — go running to the Chancellor and her party to pass their Leistungsschutzrecht.

To be fair, this is not purely a German disease. It is a European ailment as well. In France publishers hide behind government’s skirt to blackmail Google into paying into a fund to support innovation by publishers who’ve not innovated. The French government is also looking at taxing the gathering of big data — a tax, then, on knowledge. Belgian publishers rejected Google’s links and then thought better of it and finally extorted Google into advertising in their publications to avoid that nation’s version of a Leistungsschutzrecht. The internet causes a certain insanity the world around. In the U.S., we had SOPA and PIPA, laws like the Leistungsschutzrecht meant to protect ailing industries — though they were defeated. Then there is ACTA, an international attempt to protect the copyright industry.

But there are more issues in Germany. It is leading the privacy technopanic in Europe. Government leaders have urged citizens to have pictures taken from public places of public views of the facades of buildings blurred in Google Street View; they label this their Verpixelungsrecht. A privacy extremist in one state in Germany has tried to outlaw Facebook’s “like” button. That same state tried to overrule Facebook’s requirement to use real names.

And another: In entrepreneurial circles, Germany is known as the land of internet copycats. Again and again, German entrepreneurs have copied American services and business models, though their real business model is to get bought by the American originals.

Mind you, I love Germany (though to many Americans, that seems like an odd statement). There’s nowhere I’d rather visit. I have many friends there. I have met many talented technologists there. I marvel at its book culture and at its lively — if also suffering — market for serious journalism.

But today I worry about Germany. It is an industrial wonder in a postindustrial age. Government and media are embracing each other to defend their old institutions against disruption and the opportunity that can come with it. As I wrote in my book Public Parts, I’m concerned that Germans’ will to be private, not to fail, and especially not to fail publicly puts them at a disadvantage in an entrepreneurial age, when failure is a necessary product of experimentation. I fear that entrepreneurs, investors, and internet companies will shy away from Germany’s borders given the hostility shown especially to American internet companies.

I am disappointed that the land of Gutenberg, the land that invented the ability to share knowledge and ideas at mass scale and to empower speech, is now haggling over the control and ownership of a few words. As they say in German: schade. What a shame.

[This post has been translated into German and adapted as an op-ed at Zeit Online here.]

Related: I respond to Albert Wenger, a wise German VC, regarding the #LSR here.


Public is public…except in journalism?

Reporters and editors used to decide what was to be made public. No longer. More and more, the public decides what will be public … and that’s as it should be.

In today’s Times, David Carr concludes that he’s uncomfortable with a newspaper publishing a map of gun permit applicants. Yesterday on Twitter, Jim Willse, the best American newspaper editor I’ve ever worked with, got similarly sweaty.

I, too, struggled with this matter. But in the end and with respect, I think my friends are asking the wrong question. It is not up to journalists to decide that gun permits are public information. It’s up to us as citizens to decide that, as a matter of law. If there is something wrong with that, then change the law. If society is not comfortable with making that information public, then don’t try to make it somewhat public, public-with-effort (like TV stations’ campaign commercial revenue). There’s no half-pregnant. In the net age, there’s no slightly public.

I hate to see a news organization being condemned for trafficking in public information. I would also hate to see journalists end up campaigning to make less information public. Journalists of all people should be fighting to make more information public. In Public Parts, I argue that government today is secret by default and transparent by force when it must become transparent by default and secret by necessity. There are necessary secrets regarding security, criminal investigation, and citizens’ privacy.

Should gun permits be private then? Isn’t that by extension what my journalist friends are really asking when they want them to be less public? I say no. There is a public interest in this information being available and accessible. It allows the public, journalists and neighbors included, to keep watch on the process of government issuing permits. It enables the public, news organizations and others, to correlate data about permits with data about crime and safety. At a personal level, it enables me as a parent to know whether the homes where my children go play have arms — and to be able to discuss with the parents there whether their weapons are safely secured. These are matters of public safety, of public interest.

Now Carr and Willse are arguing that there is a difference between that information being available and making it more available by printing it in a newspaper, on a map. “Publishing is a discrete act, separate from whether something is public or not,” Carr says. “Our job as journalists is to draw attention, to point at things, and what we choose to highlight is defined as news.” That is the old editorial gatekeeping function trying to assert itself. Online, that question is becoming moot as there’s no longer a scarcity of space to control, to edit. Publishing information for all to see in print is different from making information available for those who seek it in search or by links. If the news organization doesn’t make this information more widely available, someone else can and likely will. I’ll argue that the town itself should be doing that. (And I’ll argue with Carr about the idea that journalists define news another day.)

Haven’t we heard that data viz is all the rage? Don’t we know Google’s mission is to make the world’s knowledge accessible to all? Shouldn’t that be part of journalism’s updated mission? I say that news organizations should become advocates for open information, demanding that government not only make more of it available but also put it in standard formats so it can be searched, visualized, analyzed, and distributed. What the value of that information is to society is not up to the gatekeepers — officials or journalists — to decide. It is up to the public.

Now where I will agree strongly with Carr is that it is also journalism’s job to add value to that information. “And then it is our job to create context, talk to sources who bring insight and provide analysis,” he says. It’s legitimate to ask whether the paper with the map added such value, and sufficient value. I think this will be our primary job description going forward: adding value to flows of information that can now exist without our mediation. We should add value in many ways: contributing context, explanation, caveats (how the information can be out of date or flawed), education (how to verify the information), in some cases editing (the value The Times and Guardian added to Wikileaks data was not just distribution but also redaction of necessary secrets), and especially and always reporting: Why do all these people own guns? How are they storing them? What are they teaching their children about them? Have they ever used them? Are they trained in using them? Oh, there are many questions and answers that won’t be in that flow of data. That’s where the need for journalism and its future lies.

Both Carr and Willse want to make moral judgments about data. “Should data have a conscience?” Carr asks. It’s our use of data that needs to be governed by conscience. This is a lesson danah boyd taught me for Public Parts when it comes to privacy and data: It’s not the gathering of data we should regulate — or the technology employed to gather it. It’s the use of data we need to regulate. It’s one matter to know that I’m a middle-aged geezer, another to use that information to deny me employment. I would hate to see society and especially journalists find themselves advocating the regulation of knowledge.

Our default as journalists should be that more information is good because it can lead to more knowledge. We no longer hold the keys to the gate to that information. We can help turn information into knowledge. But we can’t do that with less information.

Again, I sympathize with Carr’s and Willse’s discomfort. I shared it. But as I tested the limits of my views on publicness and its value, this is where I came out.

We get the net — and society — we build

The next time you see someone on Twitter point to an argument and gleefully announce, “Fight! Fight!” and you retweet that, think about the net you are encouraging and creating. You’re breeding only more of the same.

Oh, we’ve all done it. At least I’ll confess that I’ve done it. I’ve been in fights online I’m ashamed of. Like kids left alone by the substitute teacher, we — many of us — exercised our sudden freedom by shooting spitballs around the room. Have we gotten that out of our systems yet? Isn’t it time to stop and ask what kind of net and society we’re creating here?

I’ve been the object of potshots from a cadre of young curmudgeons who attack me instead of my ideas. We give it a haughty name — the ad hominem attack — but it’s just a kind of would-be assassination, sniping at the person to shut off the idea. I’ve watched these attacks be retweeted as reward, over and over again. Some might say that’s what I get for being public. Hell, I wrote a book about being public. But I hope personal attack isn’t the price one has to pay for sharing thoughts. What chill does that put on public discussion?

I was waiting for another example of a “Fight! Fight!” tweet to write about this choice we have. But then today I read about something far, far worse in singer Amanda Palmer’s blog. She, too, was getting ready to write about being the object of hate online — something we briefly talked about in a conversation regarding social media a few weeks ago. But then Amanda searched and found the tragic, wasteful story of a girl who couldn’t take the abuse she’d received online and off and finally killed herself. That’s only partly a story about the internet. But it’s very much a story about damaged humanity. Go read Amanda’s post now and watch the video there if you can bear to. Especially read the comments: heartfelt stories from more victims of attacks who, thank God, are here to tell their tales and share their lessons.

In the U.K., people are being arrested for posting hate online — “malicious telecommunications,” it’s called, as if the “tele” makes it worse. In France, a government minister is demanding that Twitter help censor, outlaw, and arrest the creators of hate online. I side with Glenn Greenwald on this: Nothing could be more dangerous. “Criminalizing ideas doesn’t make them go away any more than sticking your head in the sand makes unpleasant things disappear,” says Greenwald.

Yes, this is not a trend that can be delegated to government and wished away with legislation or prosecution. Or to put it another way: This is not government’s problem.

This is our problem. Your problem. My problem. Every time we link to, laugh at, and retweet — and retweet and retweet and retweet — personal attacks on people, we only invite more of the same. And every time we do *not* call out someone and scold them for their uncivil behavior, we condone that behavior and invite more of it. Thus we build the net — and the society — we deserve.

Again, I’ll not claim purity myself. I’ve ridiculed people rather than ideas and I’m ashamed for my part in that.

And mind you, I won’t suggest for a moment that we should not attack ideas and argue about them and fight over them with passion and concern. We must argue strenuously about difficult topics like guns and taxes and war. That is deliberative democracy. That process and freedom we must protect.

But when argument over an idea turns to attack against a person, then it crosses the line. When disliking a person becomes public ridicule of that person, it is hate. Dealing with that isn’t the responsibility of government. It is our responsibility.

The next time you see a tweet ridiculing a person or linking to someone who does, please respond with a challenge: “Is this the world you want to encourage? What does this accomplish? What does this create?” A week or so ago, I finally did that myself — “Really?” I asked a Twitter fight announcer. “Is this what you want to encourage? Aren’t you ashamed?” — and I was only sorry I had not done it before.

It would be self-serving and trivial to point to personal examples of attacks that spread. Indeed, it is self-serving — and ultimately only food for the trolls — to respond yourself to attacks on you; that gives the attackers just what they want. But that should not stop me from giving support to others who are attacked by those who think that scoring snark shots will get them attention (because, to date, it does). The next time I see an attack on a person, I need to call it out. I’d ask you to do the same.

We are building the norms of our new net society. It can go either way; there’s nothing, absolutely nothing to say that technology will lead to a better or worse world. It only provides us choices and the opportunity to show our own nature in what we choose. Will you support the fights, the attacks, the hate? Or will you stand up for the victims and against the bullies and trolls and their cheering mobs who gleefully tweet, “Fight! Fight!”?

Please read Amanda’s post and the comments from her supporters — Gaga would call them her little monsters — and take their stories to heart. Whose side are you on? Which net and society will you build?