During the recent GOP debate, Sen. Marco Rubio said the FBI has asked Apple to disable “the auto-erase mode on one phone in the entire world.” But the FBI, which is seeking access to the San Bernardino shooter’s iPhone, has asked Apple to write software that could do more than that.
With minor tweaks, this software could also be applied to other iPhones, either at the request of the government or by hackers, multiple security experts say.
Rubio, Feb. 25: [T]he FBI made this very clear 48 hours ago — the only thing they are asking of Apple is that Apple allow them to use their own systems in the FBI to try to guess the password of the San Bernardino killer. Apple initially came out saying, “We’re being ordered to create a back door to an encryption device.” That is not accurate. The only thing they’re being asked to do … is allow us to disable the self-destruct mode that’s in the Apple phone so that we can try to guess using our own systems what the password of this killer was. … That is all they’re asking them to do is to disable the self-destruct mode or the auto-erase mode on one phone in the entire world.
Rubio is right on one point — in this particular case, the FBI is not asking Apple to create a “back door to an encryption device” in the traditional sense of the term. Back doors are usually thought of as holes inserted purposefully into the design of devices before they’re released. These back doors could then be used by government agencies like the National Security Agency or the FBI during investigations.
Steven M. Bellovin, a computer science professor at Columbia University and a security specialist, told us by email that what the FBI is asking for is “more like a crowbar to pry open the lock on the existing front door.” Whether it’s access through a back door or a front door, it still potentially leaves the wider public’s data at risk, he and other experts say.
However, Rubio was wrong when he said all the FBI is asking Apple to do is disable the “auto-erase mode on one phone in the entire world.” The FBI is asking Apple to disable other features as well.
Apple programmers also would have to write software that could be applied to other iPhones — potentially putting the wider public at risk from hackers and criminals and setting a precedent that could assist the federal government in at least four other cases it’s waging against the tech giant.
The Case
On Dec. 2, Syed Rizwan Farook and his wife, Tashfeen Malik, killed 14 people and injured 22 others at the Inland Regional Center in San Bernardino, California.
On Feb. 16, the U.S. District Court for the Central District of California issued an order compelling Apple to assist the FBI in unlocking Farook’s iPhone.
On the same day, Apple responded in a public message to its customers, stating, among other things, that: “The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.”
On Feb. 19, the Department of Justice filed a “motion to compel” Apple to comply with the court’s order. The motion stated that: “The phone may contain critical communications and data prior to and around the time of the shooting that, thus far: (1) has not been accessed; (2) may reside solely on the phone; and (3) cannot be accessed by any other means known to either the government or Apple.”
The motion to compel also claims: “The Order requires Apple to assist the FBI with respect to this single iPhone used by Farook by providing the FBI with the opportunity to determine the passcode.”
On Feb. 25, Apple responded by filing a “motion to vacate” the order compelling it to assist the FBI that stated: “This is not a case about one isolated iPhone.”
To explain how Rubio oversimplified the case when he claimed that all Apple would have to do is disable the “auto-erase mode on one phone in the entire world,” we’ll walk through the science behind encryption, front doors and back doors that apply to this situation.
Back Doors
Encryption scrambles the data on a device so that only someone with the proper authorized codes can access that data. A back door to encryption is normally thought of as a purposeful hole in the hardware or software of a device, which the manufacturer — Apple in this case — would insert into a product before it’s released. This back door would allow someone without the proper authenticated codes to gain access to the device’s data.
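To make the concept concrete, here is a toy sketch of symmetric encryption. This is purely illustrative — a real system like iOS uses AES with hardware-bound keys, not this XOR toy — but it shows the core idea: without the right key, the stored bytes are unintelligible, and a “back door” is any built-in way around that check.

```python
# Toy symmetric cipher for illustration only -- NOT how iOS encrypts data.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: the same XOR operation encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plaintext = b"contacts, messages, photos"
ciphertext = xor_cipher(plaintext, b"secret-key")

# The right key recovers the data; a wrong key yields only garbage.
assert xor_cipher(ciphertext, b"secret-key") == plaintext
assert xor_cipher(ciphertext, b"wrong-key!") != plaintext
```

A deliberate back door would amount to a second, hidden way to recover `plaintext` without knowing the key at all.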
According to multiple sources, including the Guardian, the White House has been “trying to broker a deal with companies such as Apple, Yahoo and Google, to ensure holes in encryption for the government to access mobile data, cloud computing and other data.” These kinds of requests date back to at least 1993, with the Clipper chip, a hardware-based means to a back door.
But in a 2015 report, security experts and computer scientists, including Bellovin and researchers at MIT, investigated “whether it is technically and operationally feasible to meet law enforcement’s call for exceptional access without causing large-scale security vulnerabilities.”
Based on its analysis of a number of case studies, the group found that the U.S. government’s “proposals are unworkable in practice, raise enormous legal and ethical questions, and would undo progress on security at a time when Internet vulnerabilities are causing extreme economic harm.”
The experts also made a point of noting that they “take no issue here with law enforcement’s desire to execute lawful surveillance orders when they meet the requirements of human rights and the rule of law.” Rather, their “strong recommendation is that anyone proposing regulations should first present concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses and for hidden costs.”
According to USA Today, Michael Hayden, the former head of both the NSA and the CIA, “opposes proposals to force Apple and other tech companies to install ‘back doors’ in digital devices to help law enforcement.” But he said “‘the burden of proof is on Apple’” to show that writing code to help disable Farook’s phone would also “open the door to broader privacy invasions.”
Front Doors
How is this case similar to and different from creating a back door? According to Bellovin, the current Apple-FBI battle is more about using so-called “brute-force” front door access to the shooter’s phone. But it could still put the public at greater risk, he says.
Farook’s iPhone was a model 5C, which has a feature that deletes all of the phone’s data after 10 unsuccessful password attempts. As per the original court order, the FBI wants Apple to write software that would do three things: (1) bypass or disable this auto-erase function; (2) allow the FBI to submit passcodes to the device electronically (i.e., not by hand); and (3) ensure that the software does not introduce any additional delay between passcode attempts beyond what the phone’s hardware requires.
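A rough back-of-the-envelope calculation shows why those three changes together make a brute-force search practical. The ~80-millisecond figure below is Apple’s published per-attempt hardware key-derivation cost; everything else is an illustrative sketch, not Apple’s or the FBI’s actual tooling.

```python
# Sketch of brute-force timing, assuming ~80 ms of hardware key derivation
# per passcode attempt (a figure from Apple's security documentation).
HW_COST_S = 0.08  # seconds of unavoidable hardware work per attempt

def brute_force_seconds(keyspace: int, software_delay_s: float = 0.0) -> float:
    """Worst-case time to try every passcode, with any added per-attempt delay."""
    return keyspace * (HW_COST_S + software_delay_s)

# With the requested software (electronic submission, no added delays):
print(f"4-digit PIN: {brute_force_seconds(10_000) / 60:.0f} minutes")
print(f"6-digit PIN: {brute_force_seconds(1_000_000) / 3600:.0f} hours")

# Without it, stock iOS imposes escalating lockout delays after failed
# attempts -- and the auto-erase option ends the search after 10 failures.
```

The point of the order’s three requests is visible in the arithmetic: with no auto-erase and no software delays, a four-digit passcode falls in minutes and a six-digit one in under a day.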
In this way, Rubio was wrong when he said the only thing the FBI is asking Apple to do is disable “the auto-erase mode on one phone in the entire world.”
The court order also states the software “will be coded by Apple with a unique identifier of the phone so that the [software] would only load and execute” on Farook’s phone. But Bellovin told us this code “can be tweaked to point to a different [iPhone] in a matter of seconds or minutes.”
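A hypothetical sketch (not Apple’s code, and the identifier below is invented) shows the kind of device check the order describes — and why Bellovin calls retargeting it trivial: the gate is a single hard-coded constant, while the hard-to-write part of the tool would work on any similar phone.

```python
# Hypothetical illustration of gating a tool on one device identifier.
TARGET_DEVICE_ID = "EXAMPLE-UDID-0001"  # invented stand-in for the phone's ID

def should_run(device_id: str) -> bool:
    """The unlocking tool runs only on the one hard-coded device."""
    return device_id == TARGET_DEVICE_ID

# Retargeting the tool at a different iPhone is just editing the constant
# above; the valuable code -- disabling auto-erase and delays -- is unchanged.
```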
Susan Landau, a cybersecurity policy specialist at the Worcester Polytechnic Institute in Massachusetts and a coauthor of the aforementioned 2015 report, agrees with Bellovin and adds: “No phone is secure in the same way it was before this piece of software was written.”
Beyond One Phone
There are at least two reasons why the writing of this software goes beyond the scope of Farook’s iPhone.
First, hackers could get hold of this new software and use it to “unlock any [iPhone] of their choosing,” says Landau. To do so, they would also need Apple’s signing key, which Landau adds is “a highly protected secret as important to Apple as the nuclear codes are to the White House.” Without the new software, however, this particular risk wouldn’t exist.
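Landau’s point about the signing key can be sketched as follows. This is a simplified illustration using an HMAC — real iOS code signing uses asymmetric signatures, and the key below is invented — but the logic is the same: a phone runs firmware only if its signature verifies, so leaked unlocking software is inert without the signing key and dangerous with it.

```python
# Simplified code-signing sketch; real iOS uses asymmetric signatures.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"hypothetical-secret"  # stand-in for Apple's guarded key

def sign(firmware: bytes, key: bytes) -> bytes:
    """Produce a signature over the firmware image."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes) -> bool:
    """A phone loads firmware only if the signature verifies."""
    return hmac.compare_digest(sign(firmware, APPLE_SIGNING_KEY), signature)

unlock_tool = b"disable auto-erase; allow electronic passcode entry"
assert device_accepts(unlock_tool, sign(unlock_tool, APPLE_SIGNING_KEY))
assert not device_accepts(unlock_tool, sign(unlock_tool, b"attacker-key"))
```

This is why Landau compares the signing key to “the nuclear codes”: it is the only thing standing between a stolen copy of the software and any iPhone of an attacker’s choosing.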
Second, if the courts side with the FBI on this case, it would set a precedent that could assist the federal government in at least four other similar cases it’s currently waging against the company.
On Feb. 17, Mark Zwillinger, Apple’s attorney, filed a court brief that outlined 12 other occasions Apple has challenged FBI requests to unlock iPhones — four of which would require Apple to write new code.
For some phones — those running operating systems older than iOS 8 — Apple can extract the data without writing new code, as it has in the past. For this reason, there is already a precedent for Apple to fulfill some of the FBI’s requests — those that don’t require writing new code.
But from iOS 8 on, Apple designed its products such that new software would have to be written in order to unlock phones. By doing so, Apple made its products less vulnerable to hackers and criminals.
If Apple’s programmers write software to unlock Farook’s phone, which runs iOS 9, it could set a precedent that helps the FBI win at least four other pending cases and make similar requests in the future.
At a House hearing on Feb. 25, FBI Director James Comey insisted that his agency is asking Apple “to write a piece of software that will work only in that phone” (at about the 2:40 mark). And Comey wrote in a Feb. 21 op-ed for Lawfare that the “San Bernardino litigation isn’t about trying to set a precedent.”
However, Comey acknowledged under questioning at another House hearing on Feb. 25 that the outcome of the Apple-FBI battle “potentially … will be instructive for other courts” (at the 37:10 mark).
The legal battle between Apple and the federal government is likely to take weeks, perhaps months, to settle. And we take no position on the merits of the case. But when it comes to the technology, Rubio was wrong when he said all Apple would have to do is disable “the auto-erase mode on one phone in the entire world.” Apple’s programmers would have to write code that does more than that and could be easily used on other iPhones. And as Comey himself admits, the case’s outcome could reach beyond just Farook’s phone to other legal battles.
Editor’s Note: SciCheck is made possible by a grant from the Stanton Foundation.