FBI’s iPhone Request Threatens Innovation
Frustrated by its unsuccessful attempt to access the contents of an iPhone 5c that was used by one of the San Bernardino shooting suspects in December, the Federal Bureau of Investigation has obtained a court order requiring Apple’s cooperation. Apple intends to fight it.
Although the FBI insists this is an exceptional request, limited to a single iPhone, it in fact has grave consequences for the efforts of tech firms like Apple to combat online fraud and piracy, to reduce the vulnerability of critical infrastructure to cyber attack, and to develop a safer, more secure ecosystem for electronic transactions and recordkeeping.
Apple can unlock iPhones running older software pursuant to a court order, and it does. But in 2014 the company released the iOS 8 operating system, which encrypts sensitive user data by default using a key derived in part from the user's passcode. Even Apple cannot decrypt that data.
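The general idea can be sketched in a few lines. This is an illustrative sketch only, not Apple's actual implementation: the names, parameters and iteration count here are assumptions, but the principle is that the encryption key is computed from the passcode combined with a device-unique secret, so no one who lacks the passcode can recompute it.

```python
import hashlib
import os

# Hypothetical stand-in for a hardware-fused, device-unique secret.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str, device_uid: bytes = DEVICE_UID) -> bytes:
    # PBKDF2 stretches the passcode; the device secret acts as the salt,
    # tying the derived key to this one physical device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

# The same passcode always yields the same key on the same device,
# but without the passcode the key cannot be recomputed.
key = derive_key("1234")
```

Because the key never leaves the device and depends on a secret the user alone supplies, handing over the encrypted data does not help: only the correct passcode unlocks it.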
Law enforcement officials worry that the growing use of encryption to protect information stored on devices and moving across advanced networks will make it harder to combat not just terrorism but all types of crime. Encryption “has an impact on national security,” testified FBI Director James Comey earlier this month at a Senate hearing, “but overwhelmingly this is a problem that local law enforcement sees.”
Apple cannot decrypt the passcode or other information stored on the iPhone 5c that was used by Syed Farook, and the FBI is not requesting that in this case. Instead, the FBI is asking Apple to help it enter every conceivable four-digit passcode until it finds the one that unlocks the device. There are 10,000 possible combinations, and Farook may have enabled a security feature that permanently erases the contents of the smartphone after 10 incorrect passcode attempts.
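The arithmetic behind the request is straightforward. A minimal sketch, illustrative only and not how a real device evaluates attempts:

```python
from itertools import product

# Every possible four-digit passcode: 10 digits in each of 4 positions
# gives 10**4 = 10,000 combinations.
codes = ["".join(digits) for digits in product("0123456789", repeat=4)]

# With the optional auto-erase setting, a guesser gets at most 10 tries
# before the device wipes itself: a 10 / 10,000 = 0.1% chance of finding
# an unknown passcode before the data is destroyed.
max_attempts = 10
odds = max_attempts / len(codes)
```

Disabling that attempt limit is precisely what would make exhaustive guessing feasible, which is why the court order targets the device's security features rather than its encryption.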
But as Comey has made clear on numerous occasions including the recent Senate hearing, what he and other law enforcement officials ultimately want “is a world where [companies like Apple] are able to comply with court orders” requesting unencrypted data to help solve cases involving murder, car accidents, kidnapping, drugs, national security and other crimes. Civil litigants could also be expected to seek access to unencrypted data in the type of world Comey envisions.
Law enforcement agencies have used wiretaps for decades to conduct surveillance, but the public switched telephone network was highly centralized and tightly controlled. Telephones couldn’t store vast amounts of personal and proprietary data like computers, and they weren’t routinely hacked by criminals, spies and terrorists. Besides storing increasing amounts of sensitive financial, medical and work-related information, mobile devices are also ideal surveillance tools. Without robust security, nothing would prevent a bad guy from activating the microphone, camera or location feature on an innocent person’s smartphone and conducting highly intrusive surveillance in complete secrecy.
So far, no one has figured out how to build a “backdoor” for bypassing encryption or other security that couldn’t be exploited by the bad guys, although government and industry are working together to find better options than we have now. And as a recent Berkman Center report points out, were lawmakers to insist on backdoor technology, “bad guys can easily switch to foreign encryption products that don’t have backdoors.” Clearly, at the moment, the costs of such technology exceed the benefits.
New technologies will continue to create both unforeseen challenges and opportunities for law enforcement officials. Inexpensive surveillance cameras are becoming ubiquitous. Bad guys can be found on social networking sites. The Internet of Things, just around the corner, will consist of a wide range of interconnected devices and vehicles equipped with sensors for collecting and transmitting sounds, images and other data.
The court order that requires Apple to disable security features on Syed Farook’s iPhone isn’t about encryption per se. But Apple intends to oppose it anyway because the principle is the same: Will technology companies be allowed to design and implement state-of-the-art security, not only to protect the sensitive personal information of consumers but also to safeguard things like our nation’s intellectual property and critical infrastructure from attack? Or will improved security be subject to the surveillance requirements and capabilities of slow-moving law enforcement bureaucracies?
Hance Haney is a senior fellow at the Discovery Institute.