The Good News and the Troubling News: We’re Not Going Dark

By Jonathan Zittrain
Monday, February 1, 2016, 7:00 AM

Just over a year ago, with support from the William and Flora Hewlett Foundation, the Berkman Center for Internet & Society at Harvard University convened a diverse group of security and policy experts from academia, civil society, and the U.S. intelligence community to begin to work through some of the particularly vexing and enduring problems of surveillance and cybersecurity.

The group came together understanding that there has been no shortage of debate. Our goals were to foster a straightforward, non-talking-point exchange among people who do not normally have a chance to engage with each other, and then to contribute in meaningful and concrete ways to the discourse on these issues.

A public debate unfolded alongside our meetings: the claims and questions around the government finding a landscape that is “going dark” due to new forms of encryption introduced into mainstream consumer products and services by the companies who offer them. We have sought to distill our conversations and some conclusions in a report, which can be found here.

We also invited participants to draft individual comments — mine is below; Bruce Schneier and Susan Landau will share their own thoughts this week on Lawfare as well.


Two trends have dominated the U.S. foreign intelligence landscape for the past fifteen years.

The first arises from the terrorist attacks of 9/11. The attacks reshaped the priorities of the U.S. intelligence community, as extraordinary resources have been allocated to prevent and counter terrorism. Our national security establishment has pioneered new technological tools and new legal authorities (or interpretations of existing ones) in an effort to ensure safety.

The second trend is the mainstreaming of the Internet and the technologies built around and upon it, which has led to an unprecedented proliferation of data that can be analyzed by the intelligence services. In late 2001 there were no smartphones and no social media. Facebook and Twitter were still years away from capturing our imagination, our time — and our data. The more bits we generate, actively through typing and talking, and passively by sharing our location, our social relationships, and other information as we go about our lives, the more there is for vendors — and the governments to whom they answer — to potentially review, whether in bulk or individually.

The intersection of these trends led to what Peter Swire and Kenesa Ahmad in 2011 called “the Golden Age of Surveillance.” Since then, that high water mark for opportunities for surveillance has receded in places. Some communications and data previously accessible to governments through vendors are no longer so easily obtained, because some vendors have refined the technologies they offer to prevent even themselves from seeing the data their users generate and exchange with one another. Such technologies, including the use of encryption, are not new as a category, but their entry into mainstream usage perhaps is. Losing a tool, rather than never having had it to begin with, is no doubt highly salient for the director of the FBI and others charged with protecting security. They ask: if we have a warrant or other legal authority, why should previously accessible information now be off-limits to us?

I empathize with the idea that just how much government can learn about us should not depend on a cat-and-mouse game of technological measure and countermeasure. Ideally, a polity would carefully calibrate its legal authorities to permit access exactly and only where it comports with the imperatives of legitimate security — and with basic human rights as recognized through the protections of conventions and constitutions. For one intriguing attempt to reconcile government use of technological hacking tools with appropriate privacy protections, you might read the proposal for “lawful hacking” that civil liberties-minded computer scientists Steven Bellovin, Matt Blaze, Sandy Clark, and fellow project participant Susan Landau have advocated.

But it is a very large step — a leap, even — to go beyond the legal demand for information already in a company’s possession, and beyond the use of technological tools to reveal what otherwise is obscure, to requirements on how technology must be deployed to begin with. I’ve written about why this leap is ill-advised. To try to constrain the generative Internet ecosystem in that way would either be futile or would require that we, in the fitting words of the U.S. Supreme Court, “burn the house to roast the pig.” That turn of phrase was used by Justice Frankfurter to explain why a Michigan law banning books that could tend to “corruption of the morals of youth” violated the First Amendment, even if it was aimed at a laudable goal. Here, too, there are times we will rue the cleverness or luck of a criminal who benefits first from the Internet’s facilitation of communication and organization, and then from encryption to prevent his or her activities from being discovered or investigated. But this is not reason enough to require that foundational technologies be restricted or eliminated in general use — any more than the population of Michigan could rightly be restricted to reading only what is fit for children.

Most of the Don’t Panic report from our Berklett cybersecurity project isn’t about that. Given the spectrum of roles and viewpoints represented in the room, our focus was more on a factual (if speculative) question — are we really “going dark”? — than on articulating and balancing values. The answer, in the big picture, is no, even if that is small solace to a prosecutor holding both a warrant and an iPhone whose password cannot be readily cracked. (To be sure, in many of those situations the phone’s owner could, after due process, be ordered by a court to unlock it on pain of contempt.)

As data collection volumes and methods proliferate, human and technical weaknesses within the system will multiply, and the net effect will almost certainly favor the intelligence community. Consider all those IoT devices with their sensors and poorly updated firmware. We’re hardly going dark when — fittingly, given the metaphor — our light bulbs have motion detectors and an open port. The label is “going dark” only because the security state is losing something that it fleetingly had access to, not because it is all of a sudden lacking in vectors for useful information.

But exactly what should reassure government officials, and stay the momentum for major policy interventions into Internet technology development, is what should also trouble everyone: we are hurtling towards a world in which a truly staggering amount of data will be only a warrant or a subpoena away, and in many jurisdictions, even that gap need not be traversed. That’s why this report and the deliberations behind it are genuinely only a beginning, and there’s much more work to do before the future is upon us.