Spies Want to Make the FaceTime Eavesdropping Bug Into a Feature

Back of iPhone with a pair of AirPods
Daniel Kahn Gillmor,
Senior Staff Technologist,
ACLU Speech, Privacy, and Technology Project
January 31, 2019

On Monday, we learned that Apple's FaceTime video chat service suffers from a bug that permits other people to get audio and even video directly from your iPhone or Mac computer. This can happen without your permission and without the standard indication that the other person is listening and watching. Anyone with FaceTime could eavesdrop on any other FaceTime user simply by calling and performing a simple operation, and the victim's device would start transmitting even if they never accept the call.

This is a fairly catastrophic bug. Yet alarmingly, if major national spy agencies get their way, a comparable bug will become a standard feature in almost every popular communications product currently in use. As incredible as that might seem, we know this because they've told us so.

The FaceTime bug is a failure in the user interface: the parts of the software that make the user aware of, and in control of, what the device is doing. FaceTime's user interface fails in at least two ways that are related but distinct. First, it sends audio and video to the attacker without the victim's permission; the transmission starts without the victim approving it. Second, it does so without the victim's knowledge; the normal indication that an active call is underway is absent.
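
To make those two failures concrete, here is a minimal, hypothetical sketch in Python. It is not Apple's code, and the class and method names are illustrative assumptions; it simply shows the two rules a calling interface is supposed to enforce: nothing is transmitted until the user accepts, and an indicator is visible whenever transmission is underway.

```python
# Hypothetical sketch (not Apple's implementation) of the two invariants the
# FaceTime interface failed to uphold: media flows only after the user accepts,
# and an indicator is shown whenever media is flowing.

class IncomingCall:
    def __init__(self):
        self.accepted = False        # set only by an explicit user action
        self.transmitting = False
        self.indicator_visible = False

    def accept(self):
        """Called only when the user taps 'Accept'."""
        self.accepted = True

    def start_media(self):
        """Begin sending audio/video, but only with permission and notice."""
        if not self.accepted:
            # Failure mode 1: transmitting without permission.
            raise PermissionError("user has not accepted the call")
        self.indicator_visible = True   # failure mode 2 would be skipping this
        self.transmitting = True


call = IncomingCall()
try:
    call.start_media()               # the bug: media started at this point anyway
except PermissionError as err:
    print("blocked:", err)
```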

The engineering community has long recognized that user interface failures are a frequent cause of security failures and that these failures are often worse than others. There are entire conferences and research communities dedicated to working on trustworthy and secure user interfaces, and Apple itself publishes design guidelines that reinforce the same principles.

But officials from Britain's Government Communications Headquarters (GCHQ), a close surveillance partner of the U.S. National Security Agency, recently proposed that government agents be able to inject hidden participants into secure messaging services. This proposal has come to be known as the "ghost proposal."

Written by GCHQ's Ian Levy and Crispin Robinson, it recommends institutionalizing an untrustworthy user interface when the government wants to spy on a conversation:

It's relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who's who and which devices are involved – they're usually involved in introducing the parties to a chat or call…. In a solution like this, we're normally talking about suppressing a notification on a target's device… and possibly those they communicate with.

In short, Apple, or any other company that allows people to privately chat, would be forced to allow the government to join those chats as a silent, invisible eavesdropper. Even the most secure apps, like Signal (which we recommend) and WhatsApp, which use end-to-end encryption, would be rendered insecure if they were forced to implement this proposal.
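
To see why, consider a simplified, hypothetical sketch of how a messaging client might assemble a group message. The participant list, the "hidden" flag, and the helper functions below are illustrative assumptions, not any real provider's code. The point is that the client encrypts to whatever devices the provider's identity service lists, so a silently added device gets a readable copy without the encryption itself changing at all.

```python
# Hypothetical sketch of the "ghost" idea. The client asks the service who is
# in the conversation, encrypts the message once per listed device, and draws
# the roster the user sees. If the service appends a hidden law-enforcement
# device and the client suppresses it from the roster, the math of the
# encryption never changes; only the truthfulness of the interface does.

def fetch_participants():
    # In a real app this would come from the provider's identity service.
    return [
        {"name": "Alice", "device_key": "alice-key", "hidden": False},
        {"name": "Bob",   "device_key": "bob-key",   "hidden": False},
        {"name": "ghost", "device_key": "ghost-key", "hidden": True},   # injected
    ]

def encrypt_for(device_key, plaintext):
    # Stand-in for real per-device encryption (e.g., a ratcheting session).
    return f"<{plaintext} sealed for {device_key}>"

def send(plaintext):
    participants = fetch_participants()
    ciphertexts = {p["device_key"]: encrypt_for(p["device_key"], plaintext)
                   for p in participants}              # the ghost gets a copy too
    visible_roster = [p["name"] for p in participants if not p["hidden"]]
    return ciphertexts, visible_roster

ciphertexts, roster = send("meet at 6")
print(roster)               # ['Alice', 'Bob'] -- the user sees nothing unusual
print(len(ciphertexts))     # 3 -- but three devices can read the message
```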

The Ghost proposal institutionalizes a significantly worse user interface failure than Monday's FaceTime flaw. With the FaceTime bug, the vulnerable user at least gets an alert about an incoming call to know that something is happening, even if the user interface is misrepresenting the situation and violating the user's expectations. With the Ghost proposal, the user has no way of even knowing that something is happening that violates their expectations.

The GCHQ authors claim that Ghost provides law enforcement with a wiretap-like capability and that "you don't even have to touch the encryption." This is true, but only in the most disingenuous sense.

When people want encryption in their communications tools, it's not because they love the mathematics. People care because of what encryption does. Encryption and other cryptographic protocols are necessary to protect people through properties like confidentiality, integrity, and authenticity. The Ghost proposal essentially says, "Let us violate authenticity, and you can keep encryption." But if you don't know who you are talking to, what security guarantee is left?
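
One way some secure messengers let users defend authenticity is by comparing a short code derived from everyone's keys out of band. The sketch below is illustrative only; the hashing scheme and key names are assumptions, not any particular app's design. It shows why the proposal also depends on an untrustworthy interface: adding a ghost device changes the code an honest app would display.

```python
# Hypothetical sketch: participants compare a short "fingerprint" of the
# conversation's public keys out of band. A silently added ghost device changes
# that fingerprint -- unless the interface is made to lie about the key set.

import hashlib

def conversation_fingerprint(public_keys):
    """Hash the sorted set of participant keys into a short comparable code."""
    digest = hashlib.sha256(b"".join(sorted(public_keys))).hexdigest()
    return digest[:12]

keys_before = [b"alice-key", b"bob-key"]
keys_after  = [b"alice-key", b"bob-key", b"ghost-key"]   # ghost injected by the service

print(conversation_fingerprint(keys_before))   # the code the users verified earlier
print(conversation_fingerprint(keys_after))    # what an honest interface would now show
# The code changes, so an honest interface lets users notice the new device; the
# Ghost proposal depends on the interface concealing exactly this kind of change.
```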

Cryptography is necessary to ensure these properties, but it is not sufficient on its own. The entire system, from the cryptographic mathematics to the software implementation to the network protocols to the user interface, is critical to providing secure communications in an increasingly hostile online environment.

And let's not forget: If companies like Apple are compelled to enable governments to participate silently in private conversations, that tool won't be available only to democratic governments; it will be employed by the world's worst human rights abusers to target journalists, activists, and others.

We should be clear: All software has bugs, and Apple's software, as good as it is, is no exception. Although it took too long for Apple to recognize the flaw, the company is now treating it with the gravity it deserves.

Since the vulnerability is accessed through Group FaceTime, Apple has taken those servers entirely offline until the FaceTime app itself can be fixed. But any connected FaceTime app is still vulnerable if Apple chooses to re-enable the Group FaceTime servers, so until an upgrade is shipped, people should probably disable FaceTime on their devices. (This is a good reminder of why it's important to install new software updates as soon as they're available.)

That such a serious flaw could be discovered in the software of a company known for prioritizing privacy should be a warning to anyone, including GCHQ and the NSA, who advocates for intentional security flaws to facilitate government surveillance. It's very difficult to engineer software correctly in the first place, and it's even more difficult to design it with intentional flaws, however limited. If a mechanism exists to deliberately make the user interface untrustworthy, it will be an attractive target for malicious hackers and other hostile actors. Who will be responsible for its inevitable abuse?

Any future discovery of a software flaw that enables eavesdropping, false identities, message tampering, or any other compromise of communications security should be treated the same way as this latest weakness: with serious emergency mitigations, followed as soon as possible by a software update that removes the flaw. And governments certainly shouldn't consider adding such vulnerabilities on purpose.
