It’s About Security, not Privacy

TL;DR

The mainstream press has been devoting a significant amount of
coverage to the case of the FBI vs. Apple. The trade-off is often
characterized as Privacy vs. Security: the Privacy of the individual
vs. the Security of society. However, this framing is flawed. The real
trade-off is between Security and Security: between making the job of
the FBI a little easier and introducing significant security
vulnerabilities into the core fabric of our increasingly electronic
and computerized world.

A subtext of this trade-off is the reality that security enforced
by procedures is fragile at best and useless at worst. Read on for a
more detailed treatment of this statement.

It’s Not About Privacy

When we talk about privacy, we worry about many things. In particular
the revelations made by Edward Snowden have caused people to be
concerned about widespread government surveillance. Whether those
fears are justified is beyond what I want to discuss here. But I will
assert that the Apple vs. FBI case is not about surveillance, nor is
it really about Privacy. The demand the Government is making here is
to facilitate access to a single, specific phone when a Court Order
has been issued. Of course there has been
discussion about whether it is about this particular phone or about
any number of phones for which the Government has a subpoena or Court
Order. However, no matter how many phones may be affected, it is
about specific devices involved in specific crimes where specific
orders have been written. It is not about being able
to perform mass surveillance on unidentified phones not mentioned in
a Court Order. What the FBI is asking for would not facilitate
such mass surveillance.

Why is it about Security?

Review: What the FBI is asking for

In a recent Court Order obtained by the FBI, the FBI is demanding
that Apple write software designed to circumvent an
important security control in iOS, the operating system of the
iPhone. The particular iPhone in question belongs to San Bernardino
County and was in the possession of the San Bernardino shooter.
Although he destroyed his personal phone (and that of his wife), he
did not destroy this phone, his work phone. However, this phone is
locked, which also means that its internal memory is encrypted. The
PIN used to unlock the phone is part of the key encrypting it. The
FBI could try to “brute-force” the PIN by trying every possible
combination. However, there are two protections built into the phone
that make this difficult. The first is that after a certain number of
incorrect tries, the phone will stop accepting new attempts for a
period of time, ultimately requiring an hour between retries. The
second, which may or may not be enabled on this phone, is a feature
that erases the phone’s memory after a small number of incorrect
attempts to guess the PIN.

Apple cannot decrypt the phone without knowing the PIN, and the FBI
is not asking Apple to do so. Instead the FBI is asking that Apple
build a special version of iOS to be installed on this particular
phone which will disable the time delay penalty for incorrect PIN
guesses and also disable the automatic erase feature, if it is
enabled. The FBI would also like Apple to create a mechanism to
permit PIN guesses to be submitted via the phone’s USB port, so that
a computer connected to the phone can cycle through guesses instead
of a person entering each one by hand on the phone’s keypad. The FBI
has also stated in the Court Order that Apple can build protections
into this special version of iOS to ensure that it only runs on this
particular phone.[1]

It is worth mentioning that this does not ensure that the FBI will
actually guess the PIN successfully. If it is a simple PIN, like a
four-digit (numeric only) PIN, then they will certainly get
in. However, if it is a 16-character password including alphabetic
characters as well as digits, it may take a very long time to find
the correct PIN, even after removing the artificial delay between
incorrect guesses and adding an electronic port for submitting them.
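To put rough numbers on this, here is a back-of-the-envelope
calculation. The ~80 ms per guess figure is the key-derivation time
Apple has published for iOS passcode unwrapping; everything else is an
illustrative assumption, not an Apple specification.

```python
# Back-of-the-envelope brute-force estimates for the two scenarios
# described above. Illustrative assumptions, not Apple specifications.

four_digit_pins = 10 ** 4          # numeric PINs: 0000..9999
long_passwords = 62 ** 16          # 16 chars drawn from [A-Za-z0-9]

# Manual guessing with the escalating delay: ~1 hour per guess
# in the worst case.
manual_days = four_digit_pins / 24            # roughly 417 days

# Electronic guessing over USB, limited mainly by the ~80 ms
# key-derivation time per attempt.
guesses_per_second = 1 / 0.080                # ~12.5 guesses/sec
usb_minutes = four_digit_pins / guesses_per_second / 60   # ~13 minutes

# The same electronic attack against a 16-character password.
seconds_per_year = 3600 * 24 * 365
usb_years = long_passwords / guesses_per_second / seconds_per_year

print(f"4-digit PIN, 1 guess/hour: ~{manual_days:.0f} days")
print(f"4-digit PIN via USB:       ~{usb_minutes:.1f} minutes")
print(f"16-char password via USB:  ~{usb_years:.1e} years")
```

The contrast is the point: removing the protections makes a 4-digit
PIN trivial, while a long mixed password remains out of reach even
with an electronic guessing port.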

Why Only Apple Can Do This

One of the important security features of the iPhone (and Android
phones as well) is that the phone will only execute an operating
system (in this case iOS) if it is digitally signed by its
manufacturer, here Apple. In order to “sign” a software update, Apple
has to use a secret “private key” which can create the appropriate
digital signature. By only loading signed software updates, an iPhone
knows that it is running authentic software from Apple, and not, for
example, malicious software designed to steal information, make
phone calls to expensive telephone numbers, or perform any number of
other malicious activities.

Critical to this security is keeping this special signing key a
secret. If a malicious actor can steal this key, they can create
bogus iPhone software and distribute it to unsuspecting victims.
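To make the mechanism concrete, here is a toy of the underlying math.
This is unpadded “textbook” RSA with absurdly small primes, chosen
only so the sketch fits in a few lines; real systems (and presumably
Apple’s) use vetted libraries, keys thousands of bits long, and
padding schemes such as RSA-PSS. Nothing here reflects Apple’s actual
key or process.

```python
import hashlib
from math import gcd

# Toy "textbook" RSA signing. The primes are absurdly small and the
# scheme is unpadded -- purely illustrative, never use in practice.
p, q = 10007, 10009
n = p * q                                    # public modulus
e = 17                                       # public exponent
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
d = pow(e, -1, lam)                          # private (signing) exponent

def digest(data: bytes) -> int:
    # Hash the firmware image and reduce it into the toy modulus.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(firmware: bytes) -> int:
    # The manufacturer's side: requires the secret exponent d.
    return pow(digest(firmware), d, n)

def verify(firmware: bytes, signature: int) -> bool:
    # The phone's side: needs only the public values e and n.
    return pow(signature, e, n) == digest(firmware)

update = b"firmware image v1.0"
sig = sign(update)
print(verify(update, sig))            # the authentic image verifies
print(verify(b"tampered image", sig)) # a modified image is rejected
```

The asymmetry is what matters: the phone ships with only the public
values and can check any update, but producing a valid signature
requires the secret `d`, which is why that secret is so valuable.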

Keeping Secrets is HARD

Keeping secrets like the Apple signing key is very difficult. The
secret is itself very small, probably at most a few thousand bytes.
At this small size it can be exfiltrated via a thumb drive, an e-mail
message, or an upload to a social media account from a web browser.
You name it.

Worse, disclosure of the secret does not deprive the owner of access
to it. In other words, if the key is stolen, Apple would not
necessarily know it! And very valuable secrets are worth significant
investment on the part of a bad actor to steal.

How to Keep a Signing Key Secret

Fortunately there are technology solutions that help protect secret
signing keys. Most high value signing keys are not stored on a
computer, but within a specially designed Hardware Security Module
(HSM). An HSM stores the key (the key itself may well have been
created inside the HSM and never have left it).[2] When you want to
sign a document, like a software update, you submit it to the HSM,
which then creates the signature and returns it. To improve security,
the HSM will typically require the insertion of one or several
special “Crypto Ignition Keys” (CIKs). The CIKs themselves are
actually data storage devices which contain a key of their own. The
HSM will often hold only a part of the signing key; the rest is
delivered by a combination of the CIKs. By distributing the different
CIKs to different individuals, you can ensure that multiple people
are required to perform a signature.
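The CIK idea can be illustrated with the simplest form of key
splitting: XOR-based secret sharing, where every share is required to
reconstruct the key. (Real HSM and CIK schemes are more sophisticated;
Shamir sharing, for example, lets any k of n shares suffice. This is a
minimal sketch of the all-or-nothing principle only.)

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n_shares: int) -> list[bytes]:
    """XOR-split a key into n shares; ALL shares are needed to rebuild
    it. Any subset of fewer than n shares reveals nothing about the key,
    because each share alone is indistinguishable from random bytes."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n_shares - 1)]
    last = reduce(xor_bytes, shares, key)
    return shares + [last]

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR all the shares back together to recover the key."""
    return reduce(xor_bytes, shares)

signing_key = secrets.token_bytes(32)           # stand-in 256-bit key
ciks = split_key(signing_key, 3)                # one share per officer
assert combine_shares(ciks) == signing_key      # all three: key rebuilt
assert combine_shares(ciks[:2]) != signing_key  # two alone are useless
```

Distributing the shares to separate people is exactly the ceremony
described above: no single individual can sign anything alone.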

Secure Signing is Expensive

Requiring an HSM and the presence of several individuals, each with a
CIK, means that creating a signature is an expensive process. The
individuals involved in signing a software update of an iPhone are
probably senior people, who are likely very busy. They probably do
not carry their CIKs with them, but store them in a vault, which they
will have to visit to get them. Although this is a time consuming,
expensive process, it is likely tolerable to the individuals involved
because it is a very infrequent event. This ceremony needs to be
performed only when a new software release is ready to ship. So we
are talking about something like once or twice per year.

Secure Signing Doesn’t Scale

Secure signing doesn’t scale, and this is important. If you
understand human nature, you will know that if you require people to
perform a time-consuming function frequently, they will look for ways
to reduce the time required. One obvious approach is to carry the CIK
(or leave it in your desk) rather than put it in a vault. I know of
one situation where a person stored the CIK for an important signing
key in a coffee cup on a shelf in his office. That way, if a signing
was necessary and he wasn’t around, he could tell a co-worker where
to find it (even though he was supposed to be personally involved in
the signing operation).

If the FBI Wins, We All Lose

If the FBI wins, we will need secure signing at scale. Why? As
others have said, and the Government has admitted, there is more
than one iPhone in the queue to be decrypted. Although the FBI has
stated that the Court Order is about just one phone, the reality is
that it is about many. Once the FBI wins this order, the precedent is
set to require Apple to install similar software on other phones.

However, the FBI in the order before us has stated that Apple can
build the new software image to operate only on the one phone in
question. So whatever software Apple writes, and more importantly
signs, will work on only one phone. When a Court Order is submitted
for the next phone, Apple will have to modify the software and sign
yet another update for that phone. But signing is expensive and hard
when done the way Apple is likely doing it,[3] so over time, if Apple
continues to receive such orders, it will need a less expensive
process to sign software updates.

However, a less burdensome process will likely devolve from one where
several people are required to sign an update (with special-purpose
hardware) to one where any one person out of a group can sign an
update alone. These people will likely not be senior officers, but
lower-level employees. Over time, unless great care is taken, the
signing process will devolve to one where stealing the signing key
becomes feasible: bribe one of the people who have access to the key,
compromise their credentials, break into the physical location,
whatever is necessary. My point is that a key used frequently, and a
key used to sign updates answering Government orders will be used
frequently, is an insecure key.

When this key is compromised the security of every iPhone is
significantly weakened. This is why this is about security. Make no
mistake, if the Government forces Apple to unlock this one iPhone,
the SECURITY of every iPhone will be affected.

In some ways, Apple’s Signing key is a backdoor, in that it can be
used to subvert the security of the iPhone. At this point Apple
probably understands this as well. Fortunately there are ways to close
this particular backdoor. One simple way is to require that a phone
be unlocked in order to install a software update.

While requiring that a phone be unlocked prior to a software update
will protect against the situation where the FBI has possession of a
phone involved in a crime, it will not handle the case where the FBI
wishes to access a phone belonging to a criminal who still has the
phone. It isn’t too hard to imagine that the FBI might require Apple
to tailor a special software update just for that one phone which
would permit them to unlock it in the future (or just plain extract
data from it “on the fly”). Apple could then be required to deliver
this specially crafted software update to just this one device using
the normal software update mechanism (or a modification of it).

We can argue about whether or not the FBI should compel Apple to
invent such a “live” mechanism (though if they win this Court Order,
I believe that the door is open to such a requirement) but I’ll leave
that for another day. The point I’ll make here is that such a
mechanism would again require Apple to sign updates frequently,
putting their signing key at risk and making it a backdoor once again.

Conclusion

In general, requiring “exceptional” access to devices, be they
smartphones, tablets, or computers, means introducing mechanisms that
decrease the overall security of all such devices. In this post I
have mostly focused on the vulnerabilities of signing keys and how
exceptional access makes them easier to steal and abuse. However
this is but one way that security is weakened. In general exceptional
access requires more complex mechanisms to implement than security
systems that do not provide for it. Complex systems fail in complex
ways. Complex systems by their very nature are less secure because it
is harder to reason about their security properties. You can read
more about this in “Keys under Doormats.”

The state of systems security today is pretty abysmal. Not a day goes
by without news of yet another breach, at commercial and government
organizations alike. We should be working hard on improving security
for the vast majority of people, who are law-abiding and innocent,
rather than decreasing security for all in the hope of making the
investigation of a small number of cases a little easier.

Footnotes:

[1] Each phone has a unique serial number, which is known for this
phone. The modified iOS software can check this serial number and
refuse to run on any other phone.

[2] Of course hardware can fail, so most HSMs have a mechanism to
back up the stored signing key in a secure fashion.

[3] I am in fact not privy to exactly what Apple is doing to sign
software, but I assume that they are using a secure mechanism like
the one I described above.