Apple’s New ‘Child Safety’ Plan for iPhones Isn’t So Safe
Apple recently announced changes in how it handles photos on iPhones. These changes are a step toward significantly worse privacy for all iPhone users, and are especially troubling for activists, dissidents, whistleblowers, members of oppressed groups like LGBTQ people, people of color, those facing religious persecution, and other vulnerable people.
We are concerned that governments will exploit these changes to conduct far-reaching surveillance, and that the new system could normalize government spying on our personal phones and computers, leaving no remaining places where digital privacy is still possible. This should worry everyone: Privacy is central to our identities and our autonomy — our ability to control information and our sense of self, as well as our interactions with others, free from government intervention.
Apple announced several changes at once, but the most worrisome part is a system to scan for certain photos on iPhones — specifically, “child sexual abuse material,” or CSAM. It plans to do this scanning on iPhone users’ pictures before they are backed up to Apple’s iCloud servers. Future iPhones will use cryptography in an effort to ensure that, if everything works as advertised, Apple learns only when a user has uploaded a threshold number of images that match known CSAM.
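To make the threshold mechanism concrete, here is a deliberately simplified sketch of the matching logic described above. It is not Apple’s actual design — the real proposal uses perceptual hashes and cryptographic techniques so that matches below the threshold reveal nothing — and all names and values here are hypothetical stand-ins.

```python
# Simplified illustration of threshold-based fingerprint matching.
# The database contents, fingerprint format, and threshold value are
# all hypothetical; Apple's real system uses perceptual hashing and
# cryptography rather than plain set membership.

KNOWN_FINGERPRINTS = {"a3f1", "9bc2", "77d0"}  # stand-in database of image "hashes"
MATCH_THRESHOLD = 3  # an account is flagged only after this many matches


def count_matches(photo_fingerprints):
    """Count how many of a user's photo fingerprints appear in the database."""
    return sum(1 for fp in photo_fingerprints if fp in KNOWN_FINGERPRINTS)


def should_report(photo_fingerprints):
    """Report an account only once its matches reach the threshold."""
    return count_matches(photo_fingerprints) >= MATCH_THRESHOLD


# Below the threshold, nothing is reported.
print(should_report(["a3f1", "0000"]))          # False
# At the threshold, the account is flagged for review.
print(should_report(["a3f1", "9bc2", "77d0"]))  # True
```

The key point the sketch captures is that the scanning happens over the user’s own photo set against a fixed database of fingerprints, and that the database, not the user, determines what is searched for — which is why the question of who controls that database matters so much in what follows.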
Security researchers, cryptographers, and privacy advocates quickly pointed out that Apple is building a new form of surveillance, a tool that scans material on your phone (client-side scanning) rather than on its servers (server-side scanning). This poses serious risks to privacy and civil liberties. Apple can already scan and view images uploaded to iCloud if it wants to because the company has the keys to decrypt iCloud backups. Apple reportedly backed away from an earlier attempt to lock itself out of users’ iCloud data. And it regularly turns over iCloud data in response to government demands. Without encrypting these backups with keys the company does not have, they remain subject to government intrusion. Now Apple plans to scan material on the iPhone too.
Apple appears to be making three promises about this client-side CSAM scanning:
- Only iCloud Photos: Apple says the system will only scan images uploaded to iCloud via the Photos app.
- Only a single global CSAM database: Apple says the system cannot be used to target people, or to search for anything other than CSAM, because the images will only ever be compared against a single, global CSAM database of “hashes,” or image fingerprints.
- Accountability through visibility: Apple says the public can audit whether Apple is holding to these commitments.
Here’s our take on those claims:
Accountability through Visibility
Apple is notoriously hostile when it comes to people analyzing its software. The latest turn: It only recently settled litigation against a company called Corellium for distributing a tool that makes it easier for people to analyze the iPhone operating system. On forums where researchers obtain prototypes and software for analysis, Apple appears to have had a “double agent” who shared with Apple the personal information of journalists who had relationships with leakers and sellers.
We don’t have confidence that Apple will support researchers trying to understand its scanning system. And even with strong transparency measures (auditable code, reproducible builds, visible software updates, etc.), what would happen if Apple were to renege on any of its promises, either for business reasons or under government coercion? Could anyone stop it, and would there be any meaningful consequences?
Only a Single Global CSAM Database
China recently banned certain songs by law. Ukraine recently upheld a book ban. Hong Kong is prosecuting union members over production of a children’s book. Brazil ordered the arrest of tech executives based on their company’s refusal to comply with government orders related to user data or content removal. Thailand prohibits images that insult the monarchy. How will Apple respond to demands that it find a way to search for these materials on iPhones? In a different context — personalized iPhone cases — Apple sometimes censors designs without legal justification, internal consistency, or transparency.
Given the wide-ranging interests of governments around the world, we cannot be sure Apple will always resist demands that iPhones be scanned for additional selected material.
While Apple is building a tool to scan photos against a single, globally installed CSAM database, other databases could be added to the system. Apple could be compelled by legal or economic pressure to add a database of images labeled as “terrorism,” “hate speech,” or some other “objectionable” content as defined by regimes around the world. This system could be transformed into a tool for censorship and broad surveillance. It could be very hard for the company to resist making such a change in the face of legal obligation or political pressure.
Only iCloud Photos
Finally, once this system is deployed, Apple will be under pressure to scan more than just iCloud photos. Entities pushing for surveillance will argue that if Apple can gather information about some images on the iPhone, why not all of them? Governments likely want to know about photos shared in encrypted iMessage conversations, or even privately taken photos that were never shared with anyone. From photos, the pressure will grow to scan the other sensitive and revealing types of information stored on a phone, such as conversations, search histories, health data, and more.
If Apple acquiesces to these pressures, and iMessage conversations are scanned, other encrypted messaging systems would be under even more pressure to follow suit, scanning images and text before or after they are sent. As the operating system vendor, Apple could face demands to scan images sent or received through other encrypted messaging apps, since those apps use the OS to store images and display them to the user. Any of these outcomes would be a devastating blow to the public’s ability to conduct secure, confidential communications.
What Happens on Your iPhone…
Some have speculated that this new plan is a precursor to Apple securing iCloud backups so that even Apple cannot see them, by end-to-end encrypting them with keys the company does not have. This speculation is exciting, but Apple has made no such representations. If Apple’s proposed system is offered as a tradeoff that gives the public end-to-end encrypted photos in iCloud, then we ought to be having that conversation in full. But the scheme as currently proposed is worse than the status quo, and has genuinely dangerous potential.
Society needs private space — for discussion, for communication, and for emotional and social growth and transition. Moreover, when governments can easily intrude into our “private sphere,” we lose more than just privacy and control over our information: free speech, security, and equality suffer as well. We have already seen all too often how governments misuse and abuse surveillance to target political opponents, disfavored groups, and protestors. Private companies building tools that could greatly expand surveillance pave the way for even greater abuses of power. The data on our mobile devices reflect our thoughts, our conversations, and what we see around us. Governments want to know this stuff. We have to be sure, then, that Apple is not creating a system that will ultimately be used to destroy the remaining private space contained on our devices.