Wednesday, February 24, 2016

Keep the Back Door Locked

Sure, I want to stop bad guys, but requiring Apple to make their phones vulnerable is not the right approach. The current public discourse on the Apple vs. FBI "open the phone" dispute is really a conflated mix of two issues: (1) the FBI wants help to crack open a known criminal's phone, and (2) whether or not Apple should be required to create law enforcement backdoors into their products. Let's separate the two issues.

(1) Should the FBI be given access to Farook's iPhone contents?

I think most people agree the FBI should have the data.
Bill Gates spoke on these issues Tuesday morning, and his position was pretty clear: "Apple has access to the information, they're just refusing to provide the access, and the courts will tell them whether to provide the access or not." If Apple does indeed have access to the information, the right way forward is for the FBI to seek a court order requiring Apple to release the information. This isn't new. In fact, the FBI has a court order in hand.

Does Apple really have access to
the data on Farook's iPhone? Is it able to comply with the court
order? Tim Cook's messaging indicates they do not, and Apple is pushing back, saying they will not comply with the part of the court order that goes beyond simple data turnover: the part that says, in effect, "give the FBI a tool to hack the phone quickly." This is where the discourse gets concerning; that tool could be considered a backdoor. It's not as egregious as "give us a master key," but it certainly bypasses the iPhone owner's security mechanisms in a way not intended by the manufacturer.

(2) Should Apple create a tool for the FBI that enables easy hacking of Farook's phone?

If you read carefully into the court order, the court asks Apple to provide a tool that will work only on the specific subject device -- not on all iPhones. The specific ask reads:

"Apple shall assist in enabling the search of a cellular telephone, [make, model, serial number, IMEI] on the Verizon Network, (the "SUBJECT DEVICE") pursuant to a warrant of this court by providing reasonable technical assistance to assist law enforcement agents in obtaining access to the data on the SUBJECT DEVICE."

This reads like a natural extension of "hand over the contents of this phone." It sounds quite reasonable, much like ordering a building superintendent to
unlock a specific criminal's apartment for a search. This doesn't immediately seem different from the first issue (give us
access to Farook's data).

But it is.

If you keep reading, the court orders Apple to provide the FBI with a tool that overrides some of the phone's security features -- notably the auto-erase after repeated failed passcode attempts and the escalating delays between guesses -- so that passcodes can be tried electronically at speed. Ordinarily, Apple would not have a fast way to "unlock the apartment." They have provided people with secure phones that keep data private from everyone, including from Apple. But in this case the court is ordering Apple to do the FBI's job: engineer something new to reverse their phone's security. This is like asking the lock manufacturer to build you a lock-picking machine for the apartment's lock. Doesn't the FBI usually just pick the lock or kick in the door? The courts don't compel the lock maker to build a lock-picking machine to do it.
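To see why those retry limits do the real work, consider a back-of-the-envelope sketch. This is a toy illustration, not Apple's implementation: `check_passcode` is a hypothetical stand-in for the device's verification routine. A 4-digit passcode has only 10,000 possibilities, so once the delays and auto-erase are gone, exhausting them electronically is trivial.

```python
import itertools

# Hypothetical stand-in for the device's passcode check; on a real
# phone this runs on the device itself, guarded by retry limits.
SECRET = "7351"

def check_passcode(guess):
    return guess == SECRET

def brute_force(check):
    """Try every 4-digit passcode in order until one succeeds."""
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if check(guess):
            return guess
    return None

print(brute_force(check_passcode))  # finds "7351" after at most 10,000 tries
```

The escalating delays and the wipe-after-ten-failures rule exist precisely to make this loop impractical; the ordered tool would remove them.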

There's urgency here to get everyone to pitch in to stop terrorism, and I understand this concern. Irrational bad guys are really scary. But this order is not routine! It asks Apple to do something very abnormal to aid law enforcement. Assume for the moment that it's a good idea: we all want to help the FBI unlock the phone, so Apple makes the tool. Now
what? Can such a tool be constructed so it cannot be used on other
iPhones? In my opinion, and in Apple's, it cannot. The existence of this tool
threatens the security of all iPhone users when it is not
limited to this individual device. If the tool fell into the wrong
hands, it may be used by criminals or even the terrorists the FBI is
trying to stop.

Where does this lead?

This neutralizes any benefit of encryption, and not just on iPhones. For a moment, let's assume this tool can be safely created to work against only one device. The requests wouldn't stop at Apple's compliance with a single phone. The court order could lead to companies being required to defeat their own customers' security any time law enforcement requests it. This is a very dangerous precedent. Nick Weaver's analysis is frightening: imagine if device manufacturers had to do "the dirty work" of hacking into their own products at any time. Currently, law enforcement must do the often substantial work of breaking a device; if they can simply get a court order and require someone else to put in the effort, that removes any incentive to investigate carefully before pursuing a subject's data.

While the order itself does not create a technological backdoor, it creates one through legal precedent. Apple is right to appeal and to ask the courts to think harder about this order. Encryption is the only thing that provides any sort of confidentiality on the wild web, and we should not throw it away to decrypt one phone. I'm not sure where it is, but we need to draw the line somewhere between "never help the FBI catch terrorists" and "make it trivial to defeat your customers' security" -- a balance where law enforcement's hands are not tied and encryption still works for the good guys.