Give Up the Ghost: A Backdoor by Another Name

Government Communications Headquarters (GCHQ), the UK’s counterpart to the National Security Agency (NSA), has fired the latest shot in the crypto wars. In a post to Lawfare titled “Principles for a More Informed Exceptional Access Debate,” two of Britain’s top spooks introduced what they’re framing as a kinder, gentler approach to compromising the encryption that keeps us safe online. This new proposal from GCHQ—which we’ve heard rumors of for nearly a year—eschews one discredited method for breaking encryption (key escrow) and instead adopts a novel approach referred to as the “ghost.”

But let’s be clear: regardless of what they’re calling it, GCHQ’s “ghost” is still a mandated encryption backdoor with all the security and privacy risks that come with it.

Backdoors have a (well-deserved) horrible reputation in the security community. But that hasn’t dissuaded law enforcement officials around the world from demanding them for more than two decades. And while the Internet has become a more dangerous place for average users, making encryption more important than ever, this rhetoric has hardly changed.

What has changed is the legal landscape governing encryption and law enforcement, at least in the UK. 2016 saw the passage of the Investigatory Powers Act, which gives the UK the legal ability to order a company like Apple or Facebook to tamper with security features in their products—while simultaneously being prohibited from telling the public about it.

As far as is publicly known, the UK has not attempted to employ the provisions of the Investigatory Powers Act to compromise the security of the products we use. Yet. But GCHQ’s Lawfare piece previews the course the agency is likely to take. The authors lay out six “principles” for an informed debate, and they sound pretty uncontroversial.

Privacy and security protections are critical to public confidence. Therefore, we will only seek exceptional access to data where there’s a legitimate need, that access is the least intrusive way of proceeding and there is appropriate legal authorisation.

Investigative tradecraft has to evolve with technology.

Even when we have a legitimate need, we can’t expect 100 percent access 100 percent of the time.

Any exceptional access solution should not fundamentally change the trust relationship between a service provider and its users.

Transparency is essential.

So far so good. I absolutely agree that law enforcement should only act where there’s a legitimate need and only when authorized by a court, in a way that evolves with the tech, that doesn’t have unrealistic expectations, that doesn’t enable mass surveillance, that doesn’t undermine the public trust, and that is transparent.

But unfortunately, the authors fail to apply the principles so carefully laid out to the problem at hand. Instead, they’re proposing a way of undermining end-to-end encryption using a technique that the community has started calling the “ghost.” Here’s how the post describes it:

It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved – they’re usually involved in introducing the parties to a chat or call. You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication. This sort of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorise today in traditional voice intercept solutions and certainly doesn’t give any government power they shouldn’t have.

Applying this idea to WhatsApp, it would mean that—upon receiving a court order—the company would be required to convert a 1-on-1 conversation into a group chat, with the government as the third member of the chat. But that’s not all. In WhatsApp’s UX, users can verify the security of a conversation by comparing “security codes” within the app. So for the ghost to work, there would have to be a way of forcing both users’ clients to lie to them by showing a falsified security code, as well as suppress any notification that the conversation’s keys had changed. Put differently, if GCHQ’s proposal went into effect, consumers could never again trust the claims that our software makes about what it’s doing to protect us.
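To see why the ghost cannot simply be slipped in unnoticed, consider a toy fingerprint scheme. This is a hypothetical sketch, not WhatsApp’s actual security-code algorithm; it only illustrates the general property that a conversation’s code is derived from the keys of everyone in it, so adding a ghost key necessarily changes the code the clients would honestly display:

```python
import hashlib

def security_code(*public_keys: bytes) -> str:
    """Derive a short human-comparable code from the set of keys in a
    conversation. (Simplified illustration, not a real protocol.)"""
    digest = hashlib.sha256(b"".join(sorted(public_keys))).hexdigest()
    # Render the first 30 decimal digits in groups of five, roughly like
    # the numeric codes messaging apps show for manual comparison.
    digits = str(int(digest, 16))[:30]
    return " ".join(digits[i:i + 5] for i in range(0, 30, 5))

alice, bob, ghost = b"alice-key", b"bob-key", b"ghost-key"

honest = security_code(alice, bob)
tampered = security_code(alice, bob, ghost)

# Inserting the ghost's key changes the derived code, so a client that
# keeps showing the old code is lying to its user.
print(honest != tampered)
```

The point of the sketch is the final comparison: an honest client would surface the changed code (or a key-change notification), so hiding the ghost requires the client to suppress exactly those safeguards.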

The authors of the Lawfare piece go out of their way to claim that they are “not talking about weakening encryption or defeating the end-to-end nature of the service.”

Hogwash.


They’re talking about adding a “feature” that would require the user’s device to selectively lie about whether it’s even employing end-to-end encryption, or whether it’s leaking the conversation content to a third (secret) party. Is the security code displayed by your device a mathematical representation of the two keys involved, or is it a straight-up lie? Furthermore, what’s to guarantee that the method used by governments to insert the “ghost” key into a conversation without alerting the users won’t be exploited by bad actors?

Despite the GCHQ authors’ claim, the ghost will require vendors to disable the very features that give our communications systems their security guarantees in a way that fundamentally changes the trust relationship between a service provider and its users. Software and hardware companies will never be able to convincingly claim that they are being honest about what their applications and tools are doing, and users will have no good reason to believe them if they try.

And, as we’ve already seen, GCHQ will not be the only agency in the world demanding such extraordinary access to billions of users’ software. Australia was quick to follow the UK’s lead, and we can expect to see similar demands, from Brazil and the European Union to Russia and China. (Note that this proposal would be unconstitutional were it proposed in the United States, which has strong protections against governments forcing private actors to speak or lie on the government’s behalf.)

The “ghost” proposal violates the six “principles” in other ways, too. Instead of asking investigative tradecraft to evolve with technology, it’s asking technology to build investigative tradecraft in from the ground floor. Instead of targeted exceptional access, it’s asking companies to put a dormant wiretap in every single user’s pocket, just waiting to be activated.

We must reject GCHQ’s newest “ghost” proposal for what it is: a mandated encryption backdoor that weakens the security properties of encrypted messaging systems and fundamentally compromises user trust.

GCHQ needs to give up the ghost. It’s just another word for an encryption backdoor.
