Note: This is part one of a four-part series where security expert Jon Callas breaks down the fatal flaws of a recent proposal to add a secret user, the government, to our encrypted conversations.
Twenty-five years ago, the FBI decided it needed a surveillance system built into the nation's telephone network to enable it to listen to any conversation with the flip of a switch. Congress obliged by passing the Communications Assistance for Law Enforcement Act (CALEA), forcing telephone companies to rebuild their networks to be "wiretap ready." In the more than two decades since then, the FBI has been seeking both legislative and judicial approval to expand this authority to internet communications, insisting that its investigations have "gone dark" because of increasingly widespread use of encryption. Technologists and civil libertarians have so far been successful in opposing those efforts, warning that requiring technology companies to build a backdoor into our encrypted communications would compromise security for everyone and would empower not just the FBI, but repressive governments like China and Iran, to demand or gain access to private communications. But law enforcement and intelligence agencies have not given up.
The latest proposal for circumventing encryption comes from Ian Levy and Crispin Robinson of the UK's GCHQ (the sister agency to the NSA). Their proposal would enable surveillance of encrypted communications not by trying to mathematically attack or weaken the encryption, but simply by forcing service providers to secretly add an extra user, the government, to an encrypted conversation. The authors say that their proposal would not "break" encryption, but it would nonetheless have the same effect by creating a situation in which people are no longer confident they are securely talking to their partners. Encryption gives people secure and private communications, ensuring that their conversations are between them and their partners alone. Creating the possibility that a secret user may be listening in on an otherwise securely encrypted conversation destroys that confidence, thereby chilling First Amendment-protected speech. A proposal that keeps encryption while breaking confidentiality is a distinction without a difference.
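To make concrete why "keeping the encryption" and "keeping confidentiality" are different claims, consider a minimal, hypothetical sketch, written with the PyNaCl library and not modeled on any real messaging product: the sender wraps one message key for each recipient's public key, so a provider that silently appends one more public key, the ghost's, gets a readable copy without weakening any of the underlying mathematics.

```python
# Toy illustration only: a hybrid scheme where one symmetric message key is
# wrapped for each recipient. All names are hypothetical; this is not how any
# particular messenger works. Requires PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox
from nacl.utils import random as random_bytes

def encrypt_to_group(plaintext, recipient_public_keys):
    """Encrypt once with a fresh message key, then wrap that key per recipient."""
    message_key = random_bytes(SecretBox.KEY_SIZE)
    ciphertext = SecretBox(message_key).encrypt(plaintext)
    wrapped_keys = [SealedBox(pk).encrypt(message_key) for pk in recipient_public_keys]
    return ciphertext, wrapped_keys

# Alice and Bob are the conversation the users see; "ghost" is the silent addition.
alice, bob, ghost = PrivateKey.generate(), PrivateKey.generate(), PrivateKey.generate()

visible_recipients = [alice.public_key, bob.public_key]
actual_recipients = visible_recipients + [ghost.public_key]  # the secret extra user

ciphertext, wrapped = encrypt_to_group(b"meet at noon", actual_recipients)

# Every key holder, including the ghost, recovers the identical plaintext.
for holder, wrapped_key in zip([alice, bob, ghost], wrapped):
    message_key = SealedBox(holder).decrypt(wrapped_key)
    assert SecretBox(message_key).decrypt(ciphertext) == b"meet at noon"
```

Nothing cryptographic changes in this sketch; only the recipient list does, and the recipient list is exactly what the confidentiality promise is about.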
Beyond that, my experience tells me that the GCHQ authors' proposal will not work, and for more reasons than just the objections other experts have already raised. Outside of the lab and in the real world, operating an encrypted service that also ensures secret government access faces some insurmountable obstacles. No technology can claim to be a "solution" without grappling with the massive global scale of the internet, complex and often conflicting international legal requirements, and well-resourced and highly motivated adversaries seeking to exploit flaws in the technology.
Over the past thirty years of my career, at companies large (Apple) and small (Silent Circle), I have built encrypted software, hardware, and storage services. I am one of the founders of PGP Corporation, where we built secure email and disk encryption. I'm also one of the founders of Silent Circle, where we made apps for encrypted chat and phone calls, including secure conference calls, as well as an extra-secure Android phone called Blackphone. I have taken raw encryption technologies from the lab to a product, and deployed that product worldwide to millions of people.
Coming up with an idea, as the GCHQ authors have, is the easy part. But as technology moves from idea to experiment to proof-of-concept to product to worldwide deployment, the problems multiply and finding solutions gets much, much harder.
If this proposal is going to work, then every company that implements it is going to have to build that capability into its product. And that will be no simple matter. It requires agreement and cooperation among all organizations making secure communications products and all governments wanting exceptional access. The difficulty is that this sort of technology ultimately embodies policy, that the policy here is international politics, and that getting international politics to agree is often impossible.
Here's an example. If you and I are going to have dinner together, there are logistical details to settle: picking the place, finding a time to eat, traveling, cooking (perhaps), and so on. If I bring someone along and you do, too, some of those details get a little harder to nail down. It's not a huge deal, but dinner for two is easier to plan than dinner for four. Dinner for ten is a lot harder. Dinner for fifty? Now that's really hard. Not because cooking is hard (though I know it is), but because doing anything for fifty people is hard in ways that doing the same thing for two people is not. Some simple problems of dinner, like dietary restrictions, can be a minor challenge with two people but a truly significant one with fifty.
Securely encrypted apps are used by millions of people. It may not literally be impossible to cook a single dish that meets the dietary restrictions of a million people, but it is not going to be what anyone really wants to eat.
In one of my past jobs, our engineering department was about one hundred people total, with the core security technology team being three to five people at any given moment. This is common; core technologies are built by a small team that might grow as large as ten, or shrink to as small as one. The other ninety-five people in the engineering department were there to deal with the exacting technical problems of building a viable product. These kinds of problems are serious and challenging; even a massive group of engineers, like the one GCHQ has the resources to assemble, still faces these and other problems.
Deployment comes next. The core tech team must work with other development teams on good user experience, integration with an organization's IT infrastructure, management tools, deployment and scaling tools, regulatory auditing and legal compliance, user expectations, managing judicial authorization and oversight, access compliance and auditability, proportionality, transparency, and multilateral international versions of all of the above. Deploying that system on a worldwide basis? This is way beyond dinner for two.
The GCHQ authors ignore these necessities, but they are essential challenges to overcome as a product progresses from development to beta test to rollout to global deployment to mature system. "It doesn't scale" is a technology cliché that means something real: making something work in the real world poses different, harder problems than simply making it work.
The GCHQ proposal is drowning in problems of scale. Exceptional access, as governments propose it, is the problem of making a system selectively secure. I can tell you, it's hard enough to make a secure system. It's vastly harder to make a system that is secure except for governments, and harder still to make that access available only to governments that consist of "democratically elected representatives and [a] judiciary," as the GCHQ authors imagine.
In their article, the GCHQ authors say, "We also need to be very careful not to take any component or proposal and claim that it proves that the problem [of exceptional access] is either totally solved or totally insoluble." That may sound reasonable in the abstract, but in the case of exceptional access, the problems are nearly insoluble. (This problem is not new, and it's not as if no one has ever considered it till GCHQ published its proposal.)
In the following series of essays, I show that the GCHQ proposal is necessarily unworkable at scale. Even aside from the technical obstacles to secretly adding a listener to an otherwise secure conversation (what some have called a "ghost user"), the proposal falters in a number of ways.
- Mandated exceptional access must work internationally, but the differing and often conflicting legal requirements and regimes of different countries translate into a tangle of competing, and sometimes contradictory, technical requirements.
- The global communications system of telephones and the internet has developed in such a way that inserting the surveillance capabilities that worked in the 1950s (the era of alligator clips) is no longer feasible. In fact, it has developed in such a way that even the mechanisms of the 1990s no longer apply.
- To build anything that could allow a "ghost user" requires that the access mechanism exist on users' devices. But eventually, an exceptional access mechanism stored on people's devices will be detected, as the sketch below illustrates. That cannot be squared with a primary "exceptional access" requirement: that the surveillance be surreptitious. This dooms the proposal to failure, just as previous attempts at exceptional access have failed, and at great risk to global cybersecurity.
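To illustrate that last point, here is a minimal, hypothetical sketch (plain Python, not taken from any real client) of the kind of check that makes a silent listener detectable: for the ghost to read anything, its key must appear in the recipient list the user's own device encrypts to, and anything on the device can be inspected and compared against the member list shown on screen.

```python
# Hypothetical detection sketch: diff the recipient keys the device is asked to
# encrypt to against the members the user actually sees. All names and data
# here are illustrative placeholders, not any real product's API.

def unexpected_recipients(displayed_members, encryption_key_list):
    """Return recipient keys that do not belong to any displayed member.

    displayed_members: dict mapping a visible member name to that member's
        public key bytes.
    encryption_key_list: the keys the server (or a tampered client) actually
        asks the device to wrap the message key for.
    """
    known_keys = set(displayed_members.values())
    return [key for key in encryption_key_list if key not in known_keys]

# Toy data: the user sees a two-person chat, but the device is handed a third key.
displayed = {"alice": b"alice-public-key", "bob": b"bob-public-key"}
handed_to_device = [b"alice-public-key", b"bob-public-key", b"ghost-public-key"]

ghosts = unexpected_recipients(displayed, handed_to_device)
print(f"unexpected recipient keys: {len(ghosts)}")  # -> 1, the hidden listener
```

Once researchers or client developers routinely run comparisons like this, the extra recipient stops being secret, and secrecy is the one property the proposal cannot do without.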
Further Reading
Here is some further reading on the GCHQ Ghost User proposal and exceptional access itself.
The GCHQ Proposal
Ian Levy and Crispin Robinson, ""
Specific Responses to the Proposal
Susan Landau, ""
Matthew Green, ""
Bruce Schneier, ""
Nate Cardozo and Seth Schoen, ""
Mustafa Al-Bassam, ""
Open Technology Institute, "" (Daniel Kahn Gillmor and I are signers.)
A General Discussion of Exceptional Access
Josh Benaloh, ""
Cindy Cohn, ""
Seth Schoen, ""
Five Country Ministerial Quintet Meeting of Attorneys-General, ""