On Tuesday, the government obtained a court order to hack into an iPhone as part of the FBI's investigation into the San Bernardino shooters. While the government's investigation is an important one, the legal order it has obtained crosses a dangerous line: It conscripts Apple into government service and forces it to design and build what is, in effect, a master key that could be used as a mold to weaken the security of an untold number of iPhones.
The resulting order is not only unconstitutional; it also risks setting a precedent that would fundamentally undermine the security of all devices, not just the one iPhone being debated in the news.
A bit of background is necessary to understand this debate.
As part of its investigation, the FBI has apparently obtained an iPhone 5C used by one of the shooters. The bureau has said that the phone is encrypted and protected by a passcode, and that it needs Apple's assistance to unlock the phone. Specifically, it has asked Apple to design and write custom software that would disable several security features on the phone.
While Apple has generally cooperated in the investigation, it has refused the FBI's latest demand to write malware that would help the FBI hack the device. To its credit, Apple has poured incredible resources into securing its mobile devices. One consequence of that effort is that Apple does not have a ready way of breaking into its customers' devices. In the words of Apple's CEO, Tim Cook: "We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business."
But the FBI is dismissive of that effort. According to its legal filing, the FBI believes that Apple could, if compelled, build a master key that would allow the FBI to try to break into iPhones like the one involved in the San Bernardino investigation. The FBI acknowledges that this would require Apple to write new software and then cryptographically "sign" that software (as the iPhone will accept only software updates signed by Apple).
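To make that signing requirement concrete, here is a minimal sketch of how a device can refuse any update not signed by its vendor's key. It is illustrative only: the RSA key, PKCS#1 v1.5 padding, SHA-256 hash, and the function names are assumptions chosen for clarity, not a description of Apple's actual update mechanism.

```python
# Illustrative sketch of signed-update verification (not Apple's real scheme).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Hypothetical vendor key pair; in practice only the public key ships on the device.
vendor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
device_trusted_pubkey = vendor_key.public_key()


def sign_update(update_bytes: bytes) -> bytes:
    """Vendor side: produce a signature over the update package."""
    return vendor_key.sign(update_bytes, padding.PKCS1v15(), hashes.SHA256())


def device_accepts(update_bytes: bytes, signature: bytes) -> bool:
    """Device side: accept the update only if the signature verifies
    against the vendor key baked into the device."""
    try:
        device_trusted_pubkey.verify(
            signature, update_bytes, padding.PKCS1v15(), hashes.SHA256()
        )
        return True  # Valid vendor signature: update installed.
    except InvalidSignature:
        return False  # Unsigned or tampered update: rejected.


update = b"firmware-with-security-features-disabled"
sig = sign_update(update)
assert device_accepts(update, sig)              # Signed by the vendor: accepted.
assert not device_accepts(update, b"forged")    # Anything else: rejected.
```

The point of the sketch is simply that the device's trust anchor is the vendor's signature: only software the vendor chooses to sign will run, which is why the order turns on compelling Apple to create and sign the code itself.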
A federal magistrate judge granted the FBI's request the same day but gave Apple five days to object. Again to its credit, Apple has vowed to fight.
It is critically important that Apple win, for the sake of cybersecurity and the fate of privacy in the digital age. There are several reasons why.
First, the government's legal theory is unbounded and dangerous. The government believes it has the legal authority to force Apple into government service, even though the company does not actually possess the information the government is after. Of course, historically, the government has sought and obtained assistance from tech companies and others in criminal investigations, but only in obtaining information or evidence the companies already have access to.
The difference between those cases and Apple's is a radical one. If Apple and other tech companies, whose devices we all rely upon to store incredibly private information, can be forced to hack into their customers' devices, then it's hard to imagine how any company could actually offer its consumers a secure product. And once a company has been forced to build a backdoor into its products, there's no way to ensure that it's only used by our government, as opposed to repressive regimes, cybercriminals or industrial spies.
Second, this debate is not about one phone; it's about every phone. And it's about every device manufactured by a U.S. company. If the government gets its way, then every device, whether your mobile phone, tablet or laptop, will carry with it an implicit warning from its manufacturer: "Sorry, but we might be forced to hack you."
Some might accept that risk if it were possible to limit access to legitimate governmental purposes, overseen by a judge. But as Apple's Cook points out, backdoors are uniquely dangerous: "Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge."
That risk is only growing every day as the "Internet of Things" expands. For the government, every device connected to the Internet will be more than just a novel convenience; it will be a new window into your home. The fridge that responds to your verbal commands might have a backdoor to allow for remote listening. The TV that allows you to video chat with your family might be commandeered into a ready-made spy camera.
These are the real stakes of the debate: Either American companies are allowed to offer secure products to their consumers, or the U.S. government is allowed to force those companies to break the security of their products, opening the door for malicious hackers and foreign intelligence agencies alike. For the sake of both our privacy and our security, the choice is clear.
This post was originally published by .