Give Up the Ghost: A Backdoor by Another Name

Government Communications Headquarters (GCHQ), the UK’s counterpart to the National Security Agency (NSA), has fired the latest shot in the crypto wars. In a post to Lawfare titled Principles for a More Informed Exceptional Access Debate, two of Britain’s top spooks introduced what they’re framing as a kinder, gentler approach to compromising the encryption that keeps us safe online. This new proposal from GCHQ—which we’ve heard rumors of for nearly a year—eschews one discredited method for breaking encryption (key escrow) and instead adopts a novel approach referred to as the “ghost.”

But let’s be clear: regardless of what they’re calling it, GCHQ’s “ghost” is still a mandated encryption backdoor with all the security and privacy risks that come with it.

Backdoors have a (well-deserved) horrible reputation in the security community. But that hasn’t dissuaded law enforcement officials around the world from demanding them for more than two decades. And while the Internet has become a more dangerous place for average users, making encryption more important than ever, this rhetoric has hardly changed.

What has changed is the legal landscape governing encryption and law enforcement, at least in the UK. 2016 saw the passage of the Investigatory Powers Act, which gives the UK the legal ability to order a company like Apple or Facebook to tamper with security features in their products—while simultaneously being prohibited from telling the public about it.

As far as is publicly known, the UK has not attempted to employ the provisions of the Investigatory Powers Act to compromise the security of the products we use. Yet. But GCHQ’s Lawfare piece previews the course that the agency is likely to take. The authors lay out six “principles” for an informed debate, and they sound pretty noncontroversial.

Privacy and security protections are critical to public confidence. Therefore, we will only seek exceptional access to data where there’s a legitimate need, that access is the least intrusive way of proceeding and there is appropriate legal authorisation.

Investigative tradecraft has to evolve with technology.

Even when we have a legitimate need, we can’t expect 100 percent access 100 percent of the time.

Any exceptional access solution should not fundamentally change the trust relationship between a service provider and its users.

Transparency is essential.

So far so good. I absolutely agree that law enforcement should only act where there’s a legitimate need and only when authorized by a court, in a way that evolves with the tech, that doesn’t have unrealistic expectations, that doesn’t enable mass surveillance, that doesn’t undermine the public trust, and that is transparent.

But unfortunately, the authors fail to apply the principles so carefully laid out to the problem at hand. Instead, they’re proposing a way of undermining end-to-end encryption using a technique that the community has started calling the “ghost.” Here’s how the post describes it:

It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved – they’re usually involved in introducing the parties to a chat or call. You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication. This sort of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorise today in traditional voice intercept solutions and certainly doesn’t give any government power they shouldn’t have.

Applying this idea to WhatsApp, it would mean that—upon receiving a court order—the company would be required to convert a 1-on-1 conversation into a group chat, with the government as the third member of the chat. But that’s not all. In WhatsApp’s UX, users can verify the security of a conversation by comparing “security codes” within the app. So for the ghost to work, there would have to be a way of forcing both users’ clients to lie to them by showing a falsified security code, and to suppress any notification that the conversation’s keys had changed. Put differently, if GCHQ’s proposal went into effect, consumers could never again trust the claims that our software makes about what it’s doing to protect us.
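To see why the clients would have to lie, consider how security-code verification works in principle. The following is a minimal illustrative sketch, not WhatsApp’s actual algorithm: it derives a short comparable code from the public keys of everyone in the conversation, so adding any extra key—including a ghost’s—changes the code both users see. All key names here are hypothetical.

```python
import hashlib

def security_code(public_keys: list[bytes]) -> str:
    """Derive a short, human-comparable code from all participant keys.

    Sorting makes the code independent of key ordering, so both users
    compute the same value from the same set of keys.
    """
    digest = hashlib.sha256(b"".join(sorted(public_keys))).hexdigest()
    # Render as six groups of five digits, in the style of in-app
    # safety numbers (purely cosmetic).
    digits = str(int(digest, 16))[:30]
    return " ".join(digits[i:i + 5] for i in range(0, 30, 5))

# Hypothetical keys for illustration only.
alice, bob, ghost = b"alice-pubkey", b"bob-pubkey", b"ghost-pubkey"

code_before = security_code([alice, bob])
code_after = security_code([alice, bob, ghost])

# An honest client would display a different code once a ghost key
# joins the conversation -- so both users could detect the intrusion.
assert code_before != code_after
```

This is exactly the property the ghost proposal has to defeat: because any honest derivation of the code covers every key in the chat, the only way to hide an added participant is to make the client display a code it did not actually compute.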

The authors of the Lawfare piece go out of their way to claim that they are “not talking about weakening encryption or defeating the end-to-end nature of the service.”

Hogwash.

They’re talking about adding a “feature” that would require the user’s device to selectively lie about whether it’s even employing end-to-end encryption, or whether it’s leaking the conversation content to a third (secret) party. Is the security code displayed by your device a mathematical representation of the two keys involved, or is it a straight-up lie? Furthermore, what’s to guarantee that the method used by governments to insert the “ghost” key into a conversation without alerting the users won’t be exploited by bad actors?

Despite the GCHQ authors’ claim, the ghost will require vendors to disable the very features that give our communications systems their security guarantees in a way that fundamentally changes the trust relationship between a service provider and its users. Software and hardware companies will never be able to convincingly claim that they are being honest about what their applications and tools are doing, and users will have no good reason to believe them if they try.

And, as we’ve already seen, GCHQ will not be the only agency in the world demanding such extraordinary access to billions of users’ software. Australia was quick to follow the UK’s lead, and we can expect to see similar demands, from Brazil and the European Union to Russia and China. (Note that this proposal would be unconstitutional were it proposed in the United States, which has strong protections against governments forcing actors to speak or lie on its behalf.)

The “ghost” proposal violates the six “principles” in other ways, too. Instead of asking investigative tradecraft to evolve with technology, it’s asking technology to build investigative tradecraft in from the ground floor. Instead of targeted exceptional access, it’s asking companies to put a dormant wiretap in every single user’s pocket, just waiting to be activated.

We must reject GCHQ’s newest “ghost” proposal for what it is: a mandated encryption backdoor that weakens the security properties of encrypted messaging systems and fundamentally compromises user trust.

GCHQ needs to give up the ghost. It’s just another word for an encryption backdoor.
