Here at Davos, I just left a media conversation with Brazilian President Dilma Rousseff at which I asked two questions relevant to the internet.
First, I asked under what circumstances she would consider granting asylum to Edward Snowden. She did not answer that question directly but said that the Brazilian government “has not been addressed” regarding an application for asylum, “therefore since I cannot possibly contemplate such a request you are working under a mistaken premise. The request was never formally submitted.” Interpret the subtleties of that as you may.
I also asked about controversial plans to require technology companies to store Brazilians’ data in Brazil, seeking her reaction to criticism that this will lead to a balkanized internet. She responded strictly in the context of criminal prosecution, saying that in an investigation into money laundering her justice department was denied access “precisely because it ran counter to the legislation of the country where the data was stored.”
“We cannot possibly accept that interference about data,” she continued. “It’s about our sovereignty…. We cannot find ourselves subject to the laws that prevail in third-party countries.” And then she added: “A compromise agreement is always possible.”
A few observations:
First, holding citizens’ data in Brazil makes it easier for the authorities to get data on those citizens for reasons good or bad.
Next, I’m surprised that she did not use this as an opportunity to continue her complaints about U.S. surveillance of Brazilian entities.
Instead, she put this as a matter of Brazilian sovereignty. That’s blunt but troubling. I’ve argued before that no nation should be able to claim sovereignty over the net.
If Brazil succeeds in imposing this data requirement, then it represents the further balkanization of the net. Brazil ends up with its own net, Iran does too, and so does China. The good-guy argument doesn’t wash, for the architecture and precedent set by any good guy can be used by any bad guy.
Note also this week that Microsoft said it would honor customers’ requests to hold their data outside of the U.S. and the prying eyes of the NSA. At a practical level, it’s not hard to imagine that working for enterprise data; here at Davos, Salesforce.com’s Marc Benioff said his company can show a client the building and the rack where its data is held. But for consumer services, it is hard to imagine how, say, Bing could store your search history outside the U.S. but mine inside.
And apart from those practical considerations, other tech executives said yesterday at Davos that the U.S. FISA court can still require a technology company to hand over data that is under its control, no matter whether that data is held in the U.S. or abroad.
This is a show of shadow puppets but one that could have serious, injurious impact on the net.
Back to Rousseff: The media conversation was to be off the record but after it was over she said that everything she said could be used on the record.
An odd event, it was. Asked one question about the economy of Brazil, she filibustered for half an hour, sounding — in the observation of another journalist — like a Chinese party official outlining the newest five-year plan.
Here’s a post I wrote for the Guardian arguing that the primary issue with the NSA is not privacy but government overreach and oversight.
I celebrate Judge Richard J. Leon’s opinion that the government’s mass collection of communications metadata is “almost Orwellian” and I decry Judge William H. Pauley III’s decision holding that the NSA’s collection is both effective and legally perfectly peachy.
But I worry that the judges — as well as many commentators and Edward Snowden himself — may be debating on the wrong plane. I see some danger in arguing the case as a matter of privacy because I fear that could have serious impact on our concept of knowledge, of what is allowed to be known and thus of freedom of speech. Instead, I think this is an argument about authority — not so much what government (or anyone else) is allowed to know but what government, holding unique powers, is allowed to do with what it knows.
Indeed, the Fourth Amendment, which is often called upon in this argument, is explicitly about controlling authority:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
In the search for a legally protected right to privacy in the United States, begun with Brandeis and Warren in 1890, the Fourth Amendment has been interpreted as affording privacy protection as have the First Amendment (freedom of belief) and the Fifth (freedom against self-incrimination). In each case, though, the right is not so much for something — privacy — as against something — namely, government abuse.
Yet we continue to hold the NSA debate around whether communications metadata is public or private. In the past, such data was presumed to be public because once it was known by a third party, it could no longer be claimed as private. The information on an envelope — metadata to the contents inside: sender and recipient — must be known by third parties along the way, mail carriers and sorters, to get to its destination. So it is not private. This same theory was applied to the telephone, as the phone company has to know who’s placing and who’s receiving a call to complete it. Thus the government says it can seek such public information without affecting privacy.
Judge Leon argues, with insight, that scale affects the revelatory impact of metadata as we now use phones to do so much more than make calls:
Put simply, people in 2013 have an entirely different relationship with phones than they did thirty-four years ago…. Records that once would have revealed a few scattered tiles of information about a person now reveal an entire mosaic — a vibrant and constantly updating picture of the person’s life.
Yes, but my fear with Leon’s argument is that once we say some amount of data is too much to have, then we will end up debating the line around too much knowledge, and that is a line I never want to see drawn. If we start to say that bad things can happen merely if knowledge exists, then too soon we fall into the trap of controlling the extent of knowledge — who may know what and how much they may know and thus who may say what to whom. That is the basis of censorship and ultimately tyranny.
I also fear the impact of Leon’s argument on the notion of publicness. Once knowledge is public, it becomes a public good, and the person who put it there does not gain the right to somehow withdraw it because of who ends up holding it or what they may do with it. This is why I object to European Commission Vice-President Viviane Reding’s notion of a right to be forgotten — for that gives someone the right to tell others what they may not know. I also object to the idea that there should be a presumption of privacy in public, for that would harm the journalist’s — that is to say, anyone’s — ability to report on what they witness in public, especially acts by public officials. It could also affect the ability of researchers to collect data and find unforeseen connections and correlations.
Think of privacy this way: When I tell you something about myself, that fact is then public to that extent. What happens to it is now out of my hands; it is in yours. Thus, in Public Parts, I defined privacy as an ethic of knowing someone else’s information (and whether sharing it further could harm someone) and publicness as an ethic of sharing your own information (and whether doing so could help someone).
When I researched Public Parts, danah boyd sat me down and explained how I should understand the gathering versus the use of information.
“Privacy,” she says, “isn’t just about controlling the access to information but controlling how it’s used, how it’s interpreted…. If you walk into my office applying for a job, with one quick look I’m going to be able to get a decent sense of your gender, your race, your age.” Antidiscrimination law doesn’t forbid her from knowing these bits of information about me. Instead, it forbids her from using them against me in hiring. Of course, she could still deny me the job because of my gray hair. But if she is caught in a pattern of discriminating against applicants on the basis of age, she can be sued.
boyd pointed out an important consequence of restricting use: “If you can’t use the information, it makes a lot less sense to try to find ways to access it.”
So what we should be restricting — with legislation and open oversight by courts, Congress, the press, and ultimately the people — is the NSA’s ability to seek and use information against anyone — citizen or foreigner — without documented suspicion of a crime, due process, and a legal warrant. But don’t we already have that: Isn’t that what the Fourth Amendment prescribes? Well, of course, this is how we end up arguing whether collection of every bit of information my phone provides — whom I talk with and where I go and what I do when — is just collecting data or is the equivalent of searching me or surveilling my every move. Government should not be able to ask for that information unless it has due and just cause to. That surveillance of the innocent is government’s overreach of its authority.
But next we end up asking whether that data should be stored anywhere — whether government can decree that phone or internet or credit card companies should hold onto data so government could ask for it. That, I believe, should be governed by a separate set of principles, consumer principles that consider the benefits and risks to me for allowing such data to be held and that give me transparency into what is being done and reasonable control over it. That does not and cannot mean that I can exercise full control over any data to which I’m a party, for data is produced by interactions among parties, each of whom has interests and rights.
We should have this discussion on a level of principles. The best example of that: If our First Class mail carried by the US Postal Service is protected from government search except with a warrant, then all our private communication — by email, direct message, chat, Skype, or any invention to come — should receive similar protection. If metadata at a large scale — phone data — is problematic for government to hold then shouldn’t there be limits on it at a small scale — the Post Office (which is now photographing and logging every item it handles)? The problem is that these laws and cases were written to a technology — physical mail or POTS — when they should be written to a principle.
It is also important not to presume that metadata — or Big Data — is bad and dangerous any more than it is right to assume that technology is bad just because it could be misused. I enter into a transaction with Google’s Waze allowing it to know where I live and work so I can get the traffic between those points every day. I allow Google to retain my searches — it’s easy to use incognito mode instead — because I value more personally relevant search results. I have been arguing that my local newspaper should gather signals about me as Google does so it could give me less noise and more relevance. I understand that Target has to communicate my debit card and PIN data to complete a transaction, but I expect them to hold that information securely. I also think that my cancer hospital, Sloan-Kettering, should collect data about how many penises — including mine — still function properly after prostate surgery there because that information and associated metadata about surgeons and age and other conditions could be valuable to the patients who follow. Of course, I expect that data to be held anonymously.
Each of these transactions enables the collection and use of data but is governed by sets of principles that take into account the transactors’ interests and rights and responsibilities, and those principles should be made public so customers can make decisions based on them. (See Doc Searls’ vendor relationship management as an attempt to codify that.)
Government’s access to that data must be determined, in turn, by a separate and much more stringent set of laws born of the principles set forth in the Bill of Rights and built with the knowledge that government has the means to use our information against us, in secret. Does the NSA’s mass collection, analysis, and use of communications metadata violate the Fourth Amendment? I think it does because it acts as surveillance over innocent citizens, treating all of us as criminals in government’s dragnet without probable cause or due process. Or as Jay Rosen puts it: “My liberty is being violated because ‘someone has the power to do so should they choose.’ Thus: It’s not privacy; it’s freedom.”
Privacy is important. It needs protection. But the primary issue here isn’t privacy. Nor is it the existence of any technology or of data. The issue with the NSA in all its activities revealed by Edward Snowden — not just the collection of phone metadata but also the wholesale hoovering of communication on the internet, the creation of backdoors in technology and other efforts to subvert security, the spying on other nations’ officials and companies — is government overreach and the absence of oversight. I am less concerned with what government knows about me than what we don’t know about government.
The Guardian asked me for commentary on the letter to the White House and Congress from eight tech giants about NSA spying:
Whose side are you on?
That is the question MP Keith Vaz asked Alan Rusbridger last week when he challenged the Guardian editor’s patriotism over publishing Edward Snowden’s NSA and GCHQ leaks.
And that is the question answered today by eight tech giants in their letter to the White House and Congress, seeking reform of government surveillance practices worldwide. The companies came down at last on the side of citizens over spies.
Of course, they are also acting in their own economic (albeit enlightened) self-interest, for mass spying via the internet is degrading the public’s, clients’, and other nations’ trust in the cloud and its frequently American proprietors. Spying is bad for the internet; what’s bad for the internet is bad for Silicon Valley; and — to reverse the old General Motors saw — what’s bad for Silicon Valley is bad for America.
But in their letter, the companies stand first and firmly on principle. They propose that government limit its own authority, ending bulk collection of our communication. They urge transparency and oversight of surveillance, which has obviously failed thus far. And they argue against the balkanization of the net and the notion that countries may insist that data respect national borders.
Bravo to all that. I have been waiting for Silicon Valley to establish whether it collectively is a victim or a collaborator in the NSA’s web. I have wondered whether government had commandeered these companies to its ends. I have hoped they would use their power to lobby for our rights. And now I hope government — from Silicon Valley’s senator, NSA fan Dianne Feinstein, to President Obama — will listen.
This is a critical step in sparking real debate over surveillance and civil rights. It was nice that technology companies banded together once before to battle against the overreaching copyright regime known as SOPA and for our ability to watch Batman online. Now they must fight for our fundamental — in America, our Constitutional — rights of speech and assembly and against unreasonable search and seizure. ’Tis a pity it takes eight companies with silly names to do that.
Please note who is missing from this list of signatories: Google, Facebook, Twitter, Yahoo, Microsoft, Aol, Apple, LinkedIn. I see no telecom company there — Verizon, AT&T, Level 3, the companies allegedly in a position to hand over our communications data and enable governments to tap straight into internet traffic. Where is Amazon, another leader in the cloud whose founder, Jeff Bezos, now owns the Washington Post? Where are Cisco and other companies whose equipment is used to connect the net and by some governments to disconnect it? Where are the finance companies — eBay, Visa, American Express — that also know much about what we do?
Where is the letter to David Cameron, who has threatened prior restraint of the Guardian’s revelations, and to the members of the Parliament committee who last week grilled Rusbridger, some of them painting acts of journalism — informing citizens of their governments’ acts against them — as criminal or disloyal? Since they urge worldwide reform, I wish the tech companies would address the world’s governments, starting with GCHQ’s overseers in London.
And where are technologists as a tribe? I long for them to begin serious discussion about the principles they stand for and the limits of their considerable power. Upon learning that government had tapped into communications lines between their own servers, two Google engineers responded with a hearty “fuck these guys.” But anger is insufficient. It is not a pillar to build on.
Computer and data scientists are the nuclear scientists of our age, proprietors of technology that can be used for good or ill. They must write their own set of principles, governing not the actions of government’s spies but their own use of power when they are asked by those spies and governments — as well as their own employers — to violate our privacy or use our own information against our best interests or hamper and chill our speech. They must decide what goes too far. They must answer that question above — whose side are you on? I suggest a technologists’ Hippocratic oath: First, harm no users.
Official means of oversight of American and British spying have failed. So we are left with the protection of last resort: the conscience of the individual who will resist abuse of power or expose it once it is done.
At the Guardian Activate conference in New York last Wednesday, I moderated a heated panel discussion about the NSA affair with former U.S. Senator Bob Kerrey, a member of the 9/11 Commission; Prof. Yochai Benkler, codirector of the Berkman Center for Internet & Society at Harvard; and journalist Rebecca MacKinnon, a New America fellow.
“We do not have appropriate mechanisms to hold abuse accountable,” MacKinnon said, and to greater or lesser degrees, the panelists agreed that oversight is at least too weak. Said Benkler: “The existing systems of oversight and accountability failed repeatedly and predictably in ways that were comprehensible to people inside the system but against which they found themselves unable to resist because of the concerns about terrorism and national security.” Kerrey: “I don’t think we’re even close to having unaccountable surveillance [but] I don’t think it’s good oversight.” I’ll count that as consensus. We then checked off the means of oversight.
* Executive-branch oversight is by all appearances nonexistent.
* Congressional oversight didn’t exist before Watergate, Kerrey said, and when it was established it was made intentionally weak. It should be conducted, he said, “under a constant, militant sense of skepticism.” The clearest evidence that the authority that exists is not being used, he said, is that in the Snowden affair, not a single subpoena has been issued from either the House or Senate select committees.
* The secret FISA courts have proven to be rubber stamps using invisible ink — their judges sometimes concerned or reluctant, Benkler said. But they have been largely ineffectual in any case.
* Journalistic oversight is the next resort. But as MacKinnon stressed, the work of the journalist investigating spying is threatened by the spies themselves as they collect metadata on any call and message and reconstitute raw internet traffic so that no reporters and no sources can be certain they are not being watched unless they find woods to walk in.
So we are left with the whistleblower. “What the whistleblower does is bring an individual conscience to break through all of these systems,” Benkler argued. “It can’t be relied upon as a systematic, everyday thing. It has very narrow and even random insights into the system. But it can be relied upon occasionally to break through these layers of helplessness within the system.”
But this oversight, too, is jeopardized by the severe penalties suffered by Chelsea Manning and the label of traitor pasted on Edward Snowden.
“There’s no question Snowden violated U.S. law,” Kerrey declared in our panel, “and there has to be consequences to that.”
Benkler disagreed, arguing the case for amnesty. “There is a law but the law is always affected by politics and judgment,” he said. “Clearly when someone opens up to the public a matter that is of such enormous public concern that it leads to such broad acceptance of the need for change and for reform, that person ought not come under the thumb of criminal prosecution.”
There we tried to find the line that enables acts of conscience and civil disobedience to keep watch on the powerful. Benkler imagined “a core principle that when a whistleblower discloses facts that actually lead to significant public debate and change in policy — that is to say a public rejection whether through judicial action or legislative action; a reversal — that is the core or heart of what needs to be protected in whistleblowing.”
Kerrey again disagreed, drawing a parallel between Edward Snowden and Klaus Fuchs, who handed secrets on the atomic bomb to the Soviets, Kerrey contended, also out of conscience. Benkler in turn drew a line between revealing information to the public, serving democracy, and revealing secrets to an enemy. Kerrey responded that Fuchs, like Snowden, caused public debate. Benkler thought the rule could be written; Kerrey did not. You can see that we failed to find the line.
But I want to take this discussion beyond whistleblowing — beyond the past tense — to the present tense of objecting to the work one is required to do before it is done. “At what point does conscience require a person to refuse to act in a certain way that they consider completely acceptable in the system they’re in but they find completely unacceptable to their conscience?” Benkler asked.
Kerrey countered: “I don’t think every time you get a team of people working on the danger [to national security], one person can say, ‘Oh, I don’t like what we’re doing,’ and as an act of conscience blow everything we’re doing and say we’re not going to be prosecuted.”
But we must find the room for conscience to act as the check on power without facing 35 years in prison or life in exile or irreversible jeopardy to our security. We must be able to expect the honest technologist working in the bowels of Google or telecom provider Level 3 or the NSA or GCHQ to define a line and refuse to cross it. Can we expect that?
In recent testimony before Congress, Gen. Keith Alexander said the NSA is the nation’s largest employer of mathematicians — or to be exact, 1,103 mathematicians, 966 PhDs, and 4,374 computer scientists.
Where is the code of ethics that governs their work in breaking into our communication or breaking the encryption we use to protect it? Where is the line they will not cross? Doctors have their codes. Even we journalists have ours (and though some apparently never imagined a clause relating to phone hacking, others found it for them).
We have heard two Google engineers tell the NSA to fuck off for — according to Snowden’s documents — infiltrating internal traffic between servers at Google and Yahoo.
Does this challenge to the NSA give us confidence that others at Google will tell the NSA “no”? But who said “yes” to Project MUSCULAR, in what company? Was that company commandeered by the NSA and employees with security clearance, or was the work done willingly? Why didn’t the technologists who spliced that line say “fuck you”, too? Will they be more willing to do that now that this work is known? And what will happen to those who do stop at the line?
On July 17, 1945, 155 scientists working on the Manhattan Project signed a petition to President Harry Truman urging him not to use the bomb on Japan. “Discoveries of which the people of the United States are not aware may affect the welfare of this nation in the near future,” they said.
First, listen to this superb and profoundly disturbing segment by On the Media producer Sarah Abdurrahman about how she and her husband and other guests at a Canadian wedding were detained and mistreated at the U.S. border crossings in spite of their citizenship — American — and because of their religion — Islam.
Welcome back. I told you it was well done, didn’t I? I’d be screaming bloody murder at such treatment, but Abdurrahman kept her journalistic cool and curiosity, trying to get the facts and understand our rights, asking questions, in spite of never getting answers. People have been saying lately that Verizon picked on the wrong person in me. Well, U.S. Customs and Border Protection could not have picked a worse person to detain: a smart, accomplished journalist with an audience.
I would hope that CBP is humiliated by this and will change, but our government isn’t humiliated by spying on the entire damned world and won’t change that, so I’ll give up my hope. Nonetheless, this story is the perfect bookend to the Guardian’s reporting on the NSA, showing a government that is out of control — because its citizens can no longer control it. Well done, OtM. Thank you, Sarah.
Now the bad news. Next came a story that did have me shouting at the radio as geographer Jim Thatcher condemned major tech companies with a broad brush — without specifics, without evidence or proof, only with innuendo — for the possibility that they could be redlining the world and diverting users away from certain areas. “It’s hidden what they’re doing,” he said. If it’s hidden, then how does he know they’re doing it? Not said. Microsoft had a patent that could do things like this, but Thatcher acknowledged that “Microsoft may or may not” ever use it. They could.
Brooke Gladstone laments Google’s purchase of Waze for $1.3 billion because “we are being sold for our data, it’s an old story.” No, I was using Waze at the very moment I heard that because (1) I get data of great value back, helping me avoid not opium dens but traffic jams, and (2) I generously want to share my data with others who have generously shared theirs with me. This is an example of a platform that does precisely what news organizations should do: help the public share its information with each other, without gatekeepers.
Next, Thatcher says with emphasis that “theoretically” Google could charge coffee shops for directing us to one over another. Then Thatcher acknowledges that it’s not happening. It could. And he dollops on a cherry of fear about technology and “for-profit” corporations.
Don’t you smell the irony in the oven, OtM? You properly and brilliantly condemn the CBP for detaining Americans because they are Muslims and because Muslims could do terrorism even when they don’t. Then, in the very next segment, you turn around and needlessly condemn technology companies because they could do things some guy imagines even though he admits they don’t.
Those are two sides of the same phenomenon: moral panic, the unsubstantiated suspicion that some apparently alien entity — Muslims or (OMG!) for-profit technology companies — could upset the social order, a fear often fanned by media.
Put down the fan, OtM, and learn the lesson from Abdurrahman’s superb story that your role — you of all media outlets — is to throw cold water on such unwarranted fright-mongering.
Mind you, these two segments were surrounded by two more very good reports: one that gives us a guide for what to ignore in breaking news (so as not to fan flames) and another about how — surprise, surprise, surprise — technology can lead to good ends. I remain a fan and loyal listener of OtM. And that is why I humbly offer you a map to guide you away from a dodgy neighborhood called technopanic.