
The Recent Ploy to Break Encryption Is An Old Idea Proven Wrong

Clipper Chip
Jon Callas,
Senior Technology Fellow,
ACLU
July 23, 2019

This is the fourth and final in a series of essays about a proposal by officials at Britain's GCHQ to require that encrypted communications platforms be designed to secretly add an extra participant (the government) to a conversation. In the previous essay, I explained why network design and cryptography mean that the GCHQ proposal cannot listen from afar, as its metaphor of crocodile clips implies. The listener must be on the participants' devices, and yet must remain secret, since listening in when everyone knows there's a listener is comically silly. In this essay, I explain how criminals and terrorists would take advantage of that technological fact to evade the so-called "ghost user."

Whenever you build a system, you have to test it in two ways. Quality assurance teams make sure that the system can be used correctly and produces the correct results when its users do the things you expect them (and instruct them) to do. In my career as a software engineer and security specialist, I led a team that did adversarial testing, also known as Red Teaming. Red Teams do unexpected, incorrect, devious, willfully obtuse, and downright malicious things to a system to see how it responds. Both of these kinds of testing are necessary before any system is deployed. Tools must both work when given the correct commands and respond well when given incorrect ones. We design technology to resist people who try to trick it into the wrong behavior.

In the early 1990s, the U.S. government had another proposal that would purportedly preserve secure communications for the "good guys" and provide "wiretappability" for the "bad" ones. This proposal was the notorious Clipper Chip, and it was finally abandoned because a flaw in its access system ensured that criminals could get around it. In brief, Clipper Chip telephone handsets would encrypt calls, but 40 bits of the 80-bit encryption key would be held in government hands. This gave the U.S. government an easy 40-bit break of the encryption, while everyone else had to do an 80-bit key search, which is daunting but not impossible today. That a phone was correctly escrowing half the key was signaled through a metadata hint called the Law Enforcement Access Field, or LEAF. To the outside world, the handsets had rather strong encryption, but to U.S. agents, the LEAF would make breaking the encryption much easier, taking only a few hours or days. At least, that was the idea.

In fact, cryptographer Matt Blaze did Red Team testing of the Clipper Chip and defeated its security. His analysis showed that one could forge a LEAF, and thus create a phone that would work alongside Clipper phones yet not give exceptional access to the government. If, for example, you had one of these forged phones and I had a Clipper phone, law enforcement would be able to decrypt my half of the conversation, but not yours. If we both had forged phones, we would have opted out of Clipper's access system altogether. This discovery led to the Clipper proposal fading away, because it just didn't work. Potential customers didn't want one of these forged phones (how would you trust such a thing?), and the government didn't want a system where someone could opt out by simply using a forged phone. If the proposal had been implemented, it would have created two populations of users: the "smart" ones evading surveillance by using forged equipment, and the "dumb" users relying on the conventional system.
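The attack can be sketched in miniature. The real LEAF format, the Skipjack cipher, and the chip's checksum function are not modeled here; the only faithful part is the structural flaw: the LEAF was protected by a 16-bit checksum, so an attacker who cannot compute the secret checksum can still brute-force a LEAF that verifies, in about 2^16 (roughly 65,000) guesses. This toy model, with an assumed stand-in checksum, shows why that is fast:

```python
import hashlib
import itertools
import os

CHECKSUM_BYTES = 2  # 16 bits, as in the real LEAF


def leaf_checksum(body: bytes, iv: bytes) -> bytes:
    """Stand-in for the chip's secret checksum function (assumed, not the real one)."""
    return hashlib.sha256(body + iv).digest()[:CHECKSUM_BYTES]


def receiving_chip_accepts(leaf: bytes, iv: bytes) -> bool:
    """The receiving chip verifies the LEAF checksum before interoperating."""
    body, check = leaf[:-CHECKSUM_BYTES], leaf[-CHECKSUM_BYTES:]
    return leaf_checksum(body, iv) == check


def forge_leaf(iv: bytes) -> tuple[bytes, int]:
    """Attacker's view: the checksum function is a secret inside the chip, so we
    never call it directly -- we just guess random LEAFs until one verifies,
    which takes about 2**16 tries on average."""
    for attempt in itertools.count(1):
        # Bogus key material plus a guessed checksum: nothing here is escrowed.
        candidate = os.urandom(16) + os.urandom(CHECKSUM_BYTES)
        if receiving_chip_accepts(candidate, iv):
            return candidate, attempt


iv = os.urandom(8)
leaf, tries = forge_leaf(iv)
print(f"forged a valid-looking LEAF after {tries} tries")
```

The forged LEAF passes the chip's check, yet the "escrowed" key inside it is garbage, so government decryption of the call fails while the call itself works fine.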

The same thing would happen under the ghost user proposal. While law enforcement typically replies to such issues with the comment that most criminals are dumb, I believe that a system that permits intelligent criminals to operate with impunity, while everyday people can be spied upon, is an affront to nearly every principle of civil liberties, and certainly to the principles that the GCHQ authors use to justify their proposal, particularly those of fairness, proportionality, transparency, and trust.

Build Our Own Canary for This Coal Mine

Nate Cardozo and Seth Schoen of the EFF wrote an article in which they show how the "ghost user" in the GCHQ proposal can be detected with some sophisticated cryptographic techniques. Their article is clever and worth a read. I take a different approach to defeating the ghost user system, one that is directly analogous to the defeat of Clipper. I can write an alternate app that runs alongside the official installation of WhatsApp or other software, performs the same function as the official app, and yet tells the user about all the other parties in the conversation. There is no way to prevent such an app because, for the reasons I explained in previous essays, the conversation keys have to be on the device in order for the conversation to be end-to-end encrypted. The "client" software that runs on the computer or smartphone can always tell a user about all the participants, and can report any user, ghost or not, entering or leaving the conversation. This app might do nothing more than list the participants, or alert on the addition or removal of devices. In security, we call this a "canary app," after the proverbial canary in the coal mine. Here is how it would work: if it suddenly looks like Carol has just gotten a tablet, Alice might say, "Congratulations on the new tablet, Carol." Carol replies, "What new tablet?" and then the jig is up. They know that someone or something is pretending to be Carol's new tablet.
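The heart of such a canary is nothing more than diffing the conversation's device roster between checks. The sketch below is hypothetical: a real canary would pull the roster from the messenger's key-exchange state, which differs per platform, so here the roster is simply passed in as a mapping from participant to device IDs.

```python
class ConversationCanary:
    """Hypothetical canary app core: alert whenever a participant's device
    list grows. The roster format (name -> set of device ids) is assumed."""

    def __init__(self):
        self.known: dict[str, set[str]] = {}

    def check(self, roster: dict[str, set[str]]) -> list[str]:
        alerts = []
        if self.known:  # the first snapshot just establishes a baseline
            for person, devices in roster.items():
                for device in sorted(devices - self.known.get(person, set())):
                    alerts.append(
                        f"{person} added device {device!r} -- confirm out of band"
                    )
        self.known = {p: set(d) for p, d in roster.items()}
        return alerts


canary = ConversationCanary()
canary.check({"Alice": {"phone"}, "Carol": {"phone"}})  # baseline
# Later, a "new tablet" appears for Carol -- ghost, or a real tablet?
print(canary.check({"Alice": {"phone"}, "Carol": {"phone", "tablet"}}))
```

The alert itself proves nothing; the out-of-band question ("Congratulations on the new tablet, Carol?") is what unmasks a ghost.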

A more sophisticated canary could simply reject the exceptional access request by refusing to negotiate the "ghost" encryption key exchange, or by sending it bogus keys. A very clever app could use a chatbot to send different messages to the ghost than to the real people. Clever people will think of other ways to troll the spies on the line, starting with sending them malware.
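The bogus-keys idea is easy to see in a toy model of a sender-keys-style group chat, where each message key is encrypted separately for every device in the group. Everything below is an assumption for illustration (the function names and the `encrypt_for` callback are invented, not any real messenger's API); the point is only that the client chooses who gets the real key.

```python
import os


def distribute_sender_key(sender_key: bytes, devices: list[str],
                          confirmed: set[str], encrypt_for) -> dict[str, bytes]:
    """Toy sketch of the bogus-keys countermeasure: only devices the user has
    confirmed receive the real message key; any unrecognized device (a
    would-be ghost) silently gets random bytes and can decrypt nothing."""
    out = {}
    for device in devices:
        payload = sender_key if device in confirmed else os.urandom(len(sender_key))
        out[device] = encrypt_for(device, payload)
    return out


# Demo with an identity "cipher" so the result is easy to inspect.
encrypt_for = lambda device, payload: payload
key = os.urandom(32)
fanout = distribute_sender_key(
    key,
    devices=["alice-phone", "bob-phone", "ghost-device"],
    confirmed={"alice-phone", "bob-phone"},
    encrypt_for=encrypt_for,
)
print(fanout["ghost-device"] == key)  # the ghost got garbage, not the key
```

From the server's side the fanout looks completely normal: every device received an encrypted blob of the right size, so the refusal is invisible to the ghost's operators until they try to decrypt.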

Canary apps can't be prevented. They can be built from existing open-source apps or created from whole cloth by reverse-engineering the network protocol. People will write, publish, and distribute these apps so that people vulnerable to attacks by their governments can protect themselves. Some criminals will install them, and there is no mechanism to prevent that.

Actual Bugs and Threats

There will be other security flaws in a "ghost user" architecture beyond those I've identified. All software has bugs. Building a multi-user chat system is complex, and there are many things to get wrong. For example, Apple's multi-user FaceTime had an interesting bug in which a caller could turn on another user's microphone before they answered. Apple had to shut down the multi-user functionality across the globe while the company fixed the problem.

In another example, the French government created a secure messenger called Tchap, intended for trusted government actors to use instead of messengers like WhatsApp and Telegram, which the government did not entirely trust. A security researcher found that he could register an account and join conversations without a government email address, thereby defeating the purpose of the app.

Software is hard to do correctly. It's impossible to get it right the first time. Software with a security goal that is in opposition to itself (be secure, but let certain parties break it) is even harder. It will be under attack from honest people who don't want to be spied on. It will be under attack by criminals. It will be under attack by other governments that want to subvert the rules of exceptional access. For example, if the Chinese government learns to spy on UK citizens by pretending to be GCHQ, it will, and it isn't going to tell anyone that it can.

If It Doesn't Work, It Doesn't Work

The government abandoned the Clipper Chip proposal because a researcher found that an adversary who wanted encrypted calls that could not be decrypted could cheat and get them. It wasn't worth incurring the security problems and expense of the Clipper Chip when it couldn't reliably give the government the access it needed. The same is true of the GCHQ proposal: some programmers will make canary apps that detect or thwart the spying. The more high-profile the target, and the more justified the exceptional access, the more resources and incentive the target has to fight back against a secret government user.

The GCHQ proposal could be called "Clipper 2." As with that discarded, flawed proposal, both citizens and government lose while bad actors do as they wish with impunity. The GCHQ proposal introduces serious cybersecurity and public safety dangers without assuring that government agents get the data they want. It creates an international surveillance free-for-all in which smart criminals can opt out of government eyes while the law-abiding are left without security. It permits and encourages brazen governments to move their international information security battles into the phones of every honest person everywhere in the world. Like Clipper, the ghost user proposal must be put aside.

Further Reading

Here is some further reading on the issues in this essay.

The French Government "Tchap" App

Romain Dillet, TechCrunch, on the Tchap app

, ""

Spyware, Malware, Stalkerware

Andy Greenberg, Wired, on stalkerware

Michael M. Grynbaum, The New York Times

EFF's Ghost Detector

Nate Cardozo and Seth Schoen, EFF analysis of detecting the "ghost user"
