The FBI Could Have Gotten Into the San Bernardino Shooter’s iPhone, But Leadership Didn’t Say That

Deeplinks Blog | Mon, 04/02/2018 - 18:03

The Department of Justice’s Office of the Inspector General (OIG) last week released a new report that supports what EFF has long suspected: that the FBI’s legal fight with Apple in 2016 to create backdoor access to a San Bernardino shooter’s iPhone was more focused on creating legal precedent than it was on accessing the one specific device.

The report, called a “special inquiry,” details the FBI’s failure to be completely forthright with Congress, the courts, and the American public. While the OIG report concludes that neither former FBI Director James Comey nor the FBI officials who submitted sworn statements in court had “testified inaccurately or made false statements” during the roughly month-long saga, it illustrates just how close they came to lying under oath.

From the outset, we suspected that the FBI’s primary goal in its effort to access an iPhone found in the wake of the December 2015 mass shootings in San Bernardino wasn’t simply to unlock the device at issue. Rather, we believed that the FBI’s intention with the litigation was to obtain legal precedent establishing that it could compel Apple to sabotage its own security mechanisms. Among other disturbing revelations, the new OIG report confirms our suspicion: senior leaders within the FBI were “definitely not happy” when the agency realized that another solution to access the contents of the phone had been found through an outside vendor and the legal proceeding against Apple couldn’t continue.

By way of digging into the OIG report, let’s take a look at the timeline of events:

December 2, 2015: a shooting in San Bernardino results in the deaths of 14 people; the two shooters are later killed in a shootout with police. The shooters destroy their personal phones but leave a third phone—owned by their employer—untouched.

February 9, 2016: Comey testifies that the FBI cannot access the contents of the shooters’ remaining phone.

February 16, 2016: the FBI applies for an order compelling Apple to develop a new method to unlock the phone; Magistrate Judge Pym grants the application the same day.

As part of that application, the FBI Supervisory Special Agent in charge of the investigation of the phone swears under oath that the FBI had “explored other means of obtaining [access] . . . and we have been unable to identify any other methods feasible for gaining access” other than compelling Apple to create a custom, cryptographically signed version of iOS to bypass a key security feature and allow the FBI to access the device.
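The “key security feature” at issue was iOS’s protection against passcode guessing: escalating delays between failed attempts and an optional auto-erase after ten failures. A back-of-the-envelope sketch, assuming the roughly 80 ms per hardware-entangled passcode attempt that Apple has described in its iOS security documentation, shows why the FBI wanted those protections disabled: with them gone, brute-forcing a short numeric passcode becomes a matter of minutes or hours.

```python
# Back-of-the-envelope: why the passcode-retry protections matter.
# Assumption: each passcode attempt takes ~80 ms, the figure Apple has
# published for its hardware-entangled key-derivation step.

SECONDS_PER_GUESS = 0.08  # ~80 ms per attempt (assumed)

def brute_force_hours(digits: int, seconds_per_guess: float = SECONDS_PER_GUESS) -> float:
    """Worst-case hours to try every numeric passcode of the given length."""
    return (10 ** digits) * seconds_per_guess / 3600

print(f"4-digit passcode: {brute_force_hours(4) * 60:.1f} minutes")  # ≈ 13.3 minutes
print(f"6-digit passcode: {brute_force_hours(6):.1f} hours")         # ≈ 22.2 hours
```

With the retry delays and auto-erase in place, by contrast, ten wrong guesses can wipe the device entirely, which is why the custom iOS build the FBI sought was the crux of the dispute.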

At the same time, according to the OIG report, the chief of the FBI’s Remote Operations Unit (the FBI’s elite hacking team, called ROU) knows “that one of the vendors that he worked closely with was almost 90 percent of the way toward a solution that the vendor had been working on for many months.”

Let’s briefly step out of the timeline to note the discrepancies between what the FBI was saying in early 2016 and what it actually knew. How is it that senior FBI officials testified that the agency had no capability to access the contents of the locked device when the agency’s own premier hacking team knew that capability was within reach? Because, according to the OIG report, FBI leadership didn’t ask the ROU for its help until after testifying that the FBI’s technical staff knew of no way in.

The OIG report concluded that Director Comey didn’t know that his testimony was false at the time he gave it. But it was false, and technical staff in FBI’s own ROU knew it was false.

Now, back to the timeline:

March 1, 2016: Director Comey again testifies that the FBI has been unable to access the contents of the phone without Apple’s help. Before the government applied for the All Writs Act order on February 11, Comey notes, there were “a whole lot of conversations going on in that interim with companies, with other parts of the government, with other resources to figure out if there was a way to do it short of having to go to court.”

In response to a question from Rep. Darrell Issa about whether Comey was “testifying today that you and/or contractors that you employ could not achieve this without demanding an unwilling partner do it,” Comey replies “Correct.”

March 16, 2016: An outside vendor for the FBI completes its work on an exploit for the model in question, building on the work that, as of February 16, the ROU knew to be 90% complete.

The head of the FBI’s Cryptologic and Electronics Analysis Unit (CEAU)—the unit whose initial inability to access the phone led to the FBI’s sworn statements that the Bureau knew of no method to do so—is pissed that others within the FBI are even trying to get into the phone without Apple’s help. In the words of the OIG report, “he expressed disappointment that the ROU Chief had engaged an outside vendor to assist with the Farook iPhone, asking the ROU Chief, ‘Why did you do that for?’”

Why is the CEAU Chief angry? Because it means that the legal battle is over and the FBI won’t be able to get the legal precedent against Apple that it was looking for. Again, the OIG report confirms our suspicions: “the CEAU Chief ‘was definitely not happy’ that the legal proceeding against Apple could no longer go forward” after the ROU’s vendor succeeded.

March 21, 2016: On the eve of the scheduled hearing, the Department of Justice notifies the court in California that, despite previous statements under oath that there were no “other methods feasible for gaining access,” it has now somehow found a way.

In response to the FBI’s eleventh-hour revelation, the court cancels the hearing, and the legal battle between the FBI and Apple is over, for now.

The OIG report comes on the heels of a report by the New York Times that the Department of Justice is renewing its decades-long fight for anti-encryption legislation. According to the Times, DOJ officials are “convinced that mechanisms allowing access to [encrypted] data can be engineered without intolerably weakening the devices’ security against hacking.”

That’s a bold claim, given that for years the consensus in the technical community has been exactly the opposite. In the 1990s, experts exposed serious flaws in proposed systems to give law enforcement access to encrypted data without compromising security, including the Clipper Chip. And, as the authors of the 2015 “Keys Under Doormats” paper put it, “today’s more complex, global information infrastructure” presents “far more grave security risks” for these approaches.

The Department’s blind faith in technologists’ ability to build a secure backdoor into encrypted phones is inspired by presentations by several security researchers as part of the recent National Academy of Sciences (NAS) report on encryption. But the NAS wrote that these proposals were not presented in “sufficient detail for a technical evaluation,” so they have yet to be rigorously tested by other security experts, let alone pass muster. Scientific and technical consensus is always open to challenge, but we—and the DOJ—should not abandon the longstanding view, backed by evidence, that deploying widespread special access mechanisms presents insurmountable technical and practical challenges.

The Times article also suggests that even as DOJ officials tout the possibility of secure backdoors, they’re simultaneously lowering the bar, arguing that a solution need not be “foolproof” if it allows the government to catch “ordinary, less-savvy criminals.” The problem with that statement is at least two-fold:

First, according to the FBI, it is the savvy criminals (and terrorists) who present the biggest risk of using encryption to evade detection. By definition, less-savvy criminals will be easier for law enforcement to catch without guaranteed access to encrypted devices. Why is it acceptable to the FBI that the solutions they demand are necessarily incapable of stopping the very harms they claim they most need backdoors in order to stop?

Second, the history in this area demonstrates that “not foolproof” often actually means “completely insecure.” That’s because any system that is designed to allow law enforcement agencies all across the country to expeditiously decrypt devices pursuant to court order will be enormously complex, raising the likelihood of serious flaws in implementation. And, regardless of who holds them, the keys used to decrypt these devices will need to be used frequently, making it even harder to defend them from bad actors. These and other technical challenges mean that the risks of actually deploying an imperfect exceptional access mechanism to millions of phones are unacceptably high. And of course, any system implemented in the US will be demanded by repressive governments around the world.

The DOJ’s myopic focus on backdooring phones at the expense of the devices’ security is especially vexing in light of reports that law enforcement agencies are increasingly able to use commercial unlocking tools to break into essentially any device on the market. And if this is the status quo without mandated backdoor access and as vendors like Apple take steps to harden their devices against hacking, imagine how vulnerable devices could be with a legal mandate. The FBI likes to paint encryption in an apocalyptic light, suggesting that the technology drastically undermines the Bureau’s ability to do its job, but the evidence from the Apple fight and elsewhere is far less stark.