Privacy and the New Math

Among the countless essays and posts I've read on the crypto fight between
Apple and the FBI, one with the title above, by T.Rob Wyatt on Medium, stood
out so much that I asked if he'd like to help me adapt it into an article for
Linux Journal. He said yes, and here it is.—Doc

In the Apple vs. FBI case, the real disputes are between math and architecture,
and between open and closed. Linux can play an important role in settling those
disputes, because it is on the right side of both.

Apple's case is for crypto, an application of math. The FBI's case is for a way
through the crypto. The term for that architectural hole is a "back
door".
Since the key to that door would be the FBI's alone, with no way for others to
tell how or when they'll use it (unless the FBI shares it), the FBI's side is the
closed one.

To unpack this, let's look at the case.

At a PR level, the FBI would like an outcome it says is consistent with our
moral outrage over the mass murders in San Bernardino, committed by terrorists
who were killed by police and who left behind an Apple iPhone 5c that the FBI
wants Apple's help opening. The phone's user was a dead mass murderer whose
rights we wouldn't care much about even if he were still alive, so there is
little public interest served in keeping its contents private.

The FBI would also like to solve what it calls the "Going Dark
Issue".
Specifically, the growing use of encryption on the Internet is "eroding law
enforcement's ability to quickly obtain valuable information that may be used to
identify and save victims, reveal evidence to convict perpetrators, or exonerate
the innocent."

Getting into the dead perp's iPhone and solving the "going dark
problem" both
require ways for the FBI to "see the light", we might say, through the crypto.
Put plainly, they require math to work one way for everybody using crypto and
another way for the FBI.

While the FBI contends that the country's safety
depends on "law enforcement's
lawful intercept and evidence collection needs", in fact, it also depends on the
math we call crypto to keep commerce and infrastructure up and
running. To serve
both these needs, math has to work differently for the FBI than for everyone
else. But math works the same for everyone, and taking action inconsistent with
this principle leads predictably to bad outcomes.

In order both to break security for the government's benefit and continue to use
it for infrastructure and commerce, the government must keep the tools and
methods that enable such breakage secret at all costs. But if you have a secret
that breaks digital security, you don't use digital security to secure it. You
use vaults, guns and worse. Once you have such a capability, keeping it secret
requires tipping the balance of power away from individuals and toward the
government.

The ability of individuals to keep an expressed thought secret is one of the
checks and balances that nudges the power differential toward homeostasis
somewhere below Citizens 0, Government 100. Breaking crypto in commercial
products eliminates the ability of citizens to keep their expressed thoughts
secret and in doing so eliminates an important constraint on government power
escalation. Because math works the same for everyone, eliminating one
individual's security from government intrusion eliminates it for everybody.

The phone is the most intimate personal data repository in widespread use on the
planet. If checks and balances fail to protect the phone, the power differential
from a practical standpoint is already at Citizens 0, Government 100. So this
isn't about breaking a phone. It's about breaking a system. Once it's
broken, it
stays broken.

Tangential to this is the argument that cracking this one phone doesn't
compromise all the others. That too is provably false, and quite easily
so.

See, a security model includes not just the crypto but all of the trust anchors
and controls in the system. The high-profile breaches in the news almost never
result from breaking the crypto, but from breaking one or more trust anchors in
the security model.

Resilience against brute-force attack is a critical control in the iPhone's
security model. This is because while the cryptography itself is effectively
impenetrable, human-chosen passwords are surprisingly easy to crack. According
to Apple's iOS Security Guide (dated September 2015):

Only 10 attempts to authenticate and retrieve an escrow record are allowed.
After several failed attempts, the record is locked and the user must call Apple
Support to be granted more attempts. After the 10th failed attempt, the HSM
cluster destroys the escrow record and the keychain is lost forever. This
provides protection against a brute-force attempt to retrieve the record, at the
expense of sacrificing the keychain data in response.

Designing the phone to wipe the data after some number of failed attempts
compensates for the human tendency to pick really bad passwords. Defeating that
control—which is what the government wants—breaks the security
model.
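The control itself is simple to model. Here is a minimal Python sketch (an illustration only, not Apple's implementation; the delay schedule shown is hypothetical) of a retry counter that escalates lockout delays and finally destroys the data after the tenth failure:

```python
# Toy model of the iPhone's passcode retry control (illustration only).
# The delay schedule below is hypothetical, not Apple's actual values.
DELAYS = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}  # seconds after Nth failure
WIPE_AT = 10

class PasscodeGuard:
    def __init__(self):
        self.failures = 0
        self.wiped = False

    def attempt(self, guess, secret):
        if self.wiped:
            return "wiped"
        if guess == secret:
            self.failures = 0
            return "unlocked"
        self.failures += 1
        if self.failures >= WIPE_AT:
            # In the real control, the encryption keys are destroyed,
            # making the data unrecoverable; here we just set a flag.
            self.wiped = True
            return "wiped"
        return "delay %ds" % DELAYS.get(self.failures, 0)
```

On the real device the "wipe" destroys an encryption key in hardware, so there is nothing left for forensics to recover; this is precisely the behavior the FBI asked Apple to disable.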

For this to be okay requires that math works differently for this one phone than
for all others—or for the math to work differently for government than for
everyone else. But, since math works the same for everyone, the government must
keep the hack secret at all costs, including escalation to a Citizens 0,
Government 100 power differential if necessary. It would, after all, be
"in the
interest of national security" to do so, and that always supersedes the
interest of any one individual.

The FBI's case also requires that we fully trust it not to mess up. Yet it
appears it already has in the San Bernardino case, says Apple:

One of the strongest suggestions we offered was that they pair the phone to a
previously joined network, which would allow them to back up the phone and get
the data they are now asking for. Unfortunately, we learned that while the
attacker's iPhone was in FBI custody the Apple ID password associated with the
phone was changed. Changing this password meant the phone could no longer access
iCloud services.

Authorities want Apple to create a modified version of iOS that disables an
auto-erase feature—triggered after 10 incorrect passcode
entries—and
removes the forced delays between passcode guesses. The FBI would then conduct a
brute-force passcode crack from a personal computer at high speeds to uncover
the passcode—which unlocks the device—and to examine all the data
there.
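Some back-of-envelope arithmetic shows why removing those delays matters. Apple's iOS Security Guide notes that each passcode attempt costs roughly 80 milliseconds of hardware key derivation, so even with the software delays stripped out, worst-case search time is governed by the size of the keyspace. Assuming that 80 ms floor:

```python
# Worst-case exhaustive-search time at ~80 ms per attempt, the hardware
# key-derivation cost Apple cites; software delays assumed removed.
PER_TRY = 0.080  # seconds per passcode attempt (assumed floor)

def worst_case_hours(keyspace):
    """Hours needed to try every passcode in the keyspace."""
    return keyspace * PER_TRY / 3600

for label, space in [("4-digit PIN", 10**4),
                     ("6-digit PIN", 10**6),
                     ("6 chars, lowercase+digits", 36**6)]:
    print("%s: %.1f hours" % (label, worst_case_hours(space)))
```

A four-digit PIN falls in under 15 minutes and a six-digit one in about a day, while a six-character lowercase-plus-digits passcode would still take on the order of five years. The math holds; it's the delays and the auto-wipe that the FBI needs removed.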

Apple calls this a back door. The FBI insists it is not. In Hollywood, a back
door gives an attacker direct login to a system, but in real life, the term refers
to an intentional weakness in the security model. Or, in the words of the Jargon
File, "a hole in the security of a system deliberately left in place by
designers or maintainers". Removing the auto-wipe triggered by too many failed
password attempts is a hole in the security model big enough to drive a
simulated truck through. Point goes to Apple on this one.

Corporations are much more susceptible to government coercion than a distributed
Open Source community, such as the ecosystem that has grown up around Linux. And
Apple itself may not be entirely clean and consistent on the matter of
safeguarding individual privacy. Stewart A. Baker,
a former official with the
Department of Homeland Security and the National Security Agency, wrote a blog
post in February titled "Deposing Tim Cook",
in which he suggests that Apple may
make compromises for the Chinese government that it won't for the US one.

In an NPR interview, Richard A. Clarke, the former White House
counter-terrorism advisor, went further:

I think the Justice Department and the FBI are on their own here. You know, the
secretary of defense has said how important encryption is when asked about this
case. The National Security Agency director and three past National Security
Agency directors, a former CIA director, a former Homeland Security secretary
have all said that they're much more sympathetic with Apple in this case. You
really have to understand that the FBI director is exaggerating the need for
this and is trying to build it up as an emotional case, organizing the families
of the victims and all of that. And it's Jim Comey and the attorney general is
letting him get away with it.

Whichever way this case is decided, it is clear that the US and many other
governments around the world would, given the chance, eliminate the right and
ability of their citizens to keep a secret. The government's ability to coerce
corporations casts
doubt on the integrity of the code those corporations produce. The "Open with a
capital O" in Open Source is itself a security control that resists attacks by
preserving the integrity of the code. If someone compromised open-source code,
it would be possible to find it and back out the changes.

In recent years, governments have made an enemy of personal privacy, regarding it
as a vulnerability within the state and a potential refuge for terrorism.
That's why many vendors of secure hardware and software have fled their home
countries and relocated to privacy-friendly jurisdictions.

When people stop worrying so much about the merits of a specific case and
consider that the FBI (and, if it succeeds, the whole government) wants to
destroy our underlying security models, geography won't matter because math
works the same everywhere on the planet. At that point, the resilience of the
security model and the supporting code will be the most important consideration.
We will have a migration toward privacy-friendly, open-source technology, and
Linux is the leading expression of that.

If the US government succeeds in its bid to break Apple's security model, its
next step is to prohibit Apple from fixing the vulnerability. After that comes
mandated back doors and a general prohibition on unbreakable information
systems. Those sanctions would be relatively easy to enforce on domestic
corporations but much more difficult against a worldwide development community.
The good news is that this is the easiest call to action ever: Just keep doing
what you do. Participation in the Linux community is the most important security
control in the whole open-source model.

It's interesting to think about how much and how easily we suspend our disbelief
when it comes to security. Consider, for example, the entire
Star Wars
franchise. When R2-D2 needs to do some research, the physical data ports are all
compatible. So are the protocols at all communication layers. Access is
unlogged. All the systems involved hand sensitive, confidential details to
anonymous queries, and neither the queries nor the command-and-control traffic
trigger alarms. Even my worst consulting clients are ten times better at
security than the Empire.

As unbelievable as the security was in Star Wars, George Lucas stopped well
short of asking us to believe that math works differently for the Empire than it
does for the rebels. The US government asks that of us and more. Even if we
are inclined to accept the government's proposition, it's one thing to put up
with that level of surrealism for an hour or so in a theater. It's something
else entirely when the future of privacy is at stake.