Google Launches Key Transparency While a Trade-Off in WhatsApp Is Called a Backdoor

The Guardian ran a sensational story on Friday claiming that a backdoor had been discovered in WhatsApp, one that would enable intelligence agencies to snoop on encrypted messages. Gizmodo followed up saying it's no backdoor at all, but reasonable, intended behavior. So what's really going on here?

The lost phone, lost message dilemma

The issue in question is WhatsApp's answer to a common dilemma: what should an application do when someone's phone number changes (or they reinstall their app, or switch phones)?

Suppose Alice sends a message to Bob encrypted with Bob's key K1. Alice's message is stored encrypted at the server until Bob can connect and download it. This behavior is required for any app that allows asynchronous communications (meaning you can send a message to somebody while they are offline), which nearly all popular messaging apps support.

Unfortunately, Bob just dropped his phone in a lake. Later on, Bob gets a new phone and reinstalls WhatsApp. On this new phone, the app will create a new key K2. There are two possible behaviors here:

Fail safe: The server can delete the queued message, since it was encrypted with K1, which no longer exists. Bob will never see the message. If Alice has turned on key change notifications, she will be warned that Bob is using a new key. She will be told that her message was not delivered and given the option to re-send it. This is what Signal does.

Proceed: The server will tell Alice's phone that Bob has a new key K2, and ask it to re-encrypt the message for K2. Alice's phone will do this, and Bob will get the message. If Alice has turned on key change notifications, she will then be warned that Bob's key has changed. This is what WhatsApp does.

Note that the second behavior makes the service seem more reliable: it's one less way a message can fail to be delivered.

The issue here is that the second behavior opens a security hole: Bob need not have actually lost his phone for the server to act as if he has lost it. Acting maliciously, the server could pretend that Bob's new key is a key that the server controls. Then, it will tell Alice about this new key, but will not give Alice a chance to intervene and prevent the message from being sent. Her phone will automatically re-send the message, which the server can now read. Alice will be notified and can later attempt to verify the new fingerprint with Bob, but by then it will be too late.

By contrast, the first behavior of failing safe prevents this potential attack vector. In terms of reliability, however, it introduces a case in which messages can fail to be delivered.
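The two policies, and the attack that the "proceed" policy enables, can be sketched in a few lines. This is a toy model, not WhatsApp's or Signal's actual code: "encryption" is reduced to tagging a message with the recipient's key ID, and all of the names (Server, Phone, the policy strings) are hypothetical.

```python
# Toy model of the two delivery policies. "Encrypting" a message is modeled
# as tagging it with the recipient's key ID; a real implementation would of
# course use actual cryptography. All names here are illustrative.

class Phone:
    def __init__(self):
        self.known_keys = {}          # contact -> key ID this phone trusts


class Server:
    def __init__(self, policy):
        self.policy = policy          # "fail_safe" (Signal) or "proceed" (WhatsApp)
        self.keys = {}                # user -> key the server claims is current
        self.queue = []               # (recipient, key ID, message)

    def register_key(self, user, key_id):
        self.keys[user] = key_id

    def send(self, sender_phone, recipient, message):
        # The sender encrypts for whatever key it currently knows for Bob.
        key_id = sender_phone.known_keys[recipient]
        self.queue.append((recipient, key_id, message))

    def deliver(self, sender_phone, recipient):
        delivered, notices = [], []
        for entry in list(self.queue):
            rcpt, key_id, message = entry
            if rcpt != recipient:
                continue
            self.queue.remove(entry)
            current = self.keys[recipient]
            if key_id == current:
                delivered.append(message)
            elif self.policy == "fail_safe":
                # Fail safe: drop the queued message and warn the sender,
                # who can choose to re-send after verifying the new key.
                notices.append(f"{recipient} has a new key; message NOT delivered")
            else:
                # Proceed: silently re-encrypt for the new key and deliver,
                # warning the sender only after the fact.
                sender_phone.known_keys[recipient] = current
                delivered.append(message)
                notices.append(f"{recipient}'s key changed; message re-sent")
        return delivered, notices


# Alice queues a message while Bob is offline; the server then swaps Bob's
# key, exactly as a malicious server could do with a key it controls.
for policy in ("fail_safe", "proceed"):
    server = Server(policy)
    alice = Phone()
    alice.known_keys["bob"] = "K1"
    server.register_key("bob", "K1")
    server.send(alice, "bob", "meet at noon")
    server.register_key("bob", "K2")   # lost phone, or a server-controlled key
    delivered, notices = server.deliver(alice, "bob")
    print(policy, delivered, notices)
```

Under "fail_safe" nothing is delivered and Alice is warned first; under "proceed" the message goes out under the new key before Alice can intervene, which is exactly the window a malicious server would exploit.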

What to do if you use WhatsApp

If you are a high-risk user whose safety might be compromised by a single revealed message, you may want to consider alternative applications. As we mention in our Surveillance Self-Defense guides for Android and iOS, we don't currently recommend WhatsApp for secure communications.

But if your threat model can tolerate being notified after a potential security incident, WhatsApp still does a laudable job of keeping your communications secure. And thanks to WhatsApp's massive user base, using WhatsApp is not immediate evidence of secretive activity.

If you would like to turn on WhatsApp's key change notifications, go into Settings → Account → Security, and slide “Show security notifications” to the right.

In defense of security trade-offs

The difference between WhatsApp and Signal here is a case of sensible defaults. Signal was designed as a secure messaging tool first and foremost. Signal users are willing to tolerate lower reliability for more security. As anybody who's used Signal extensively can probably attest, these types of edge cases add up and overall the app can seem less reliable.

WhatsApp, on the other hand, was a massively popular tool before end-to-end encryption was added. The goal was to add encryption in a way that WhatsApp users wouldn't even know it was there (and the vast majority of them don't). If encryption can cause messages to not be delivered in new ways, the average WhatsApp user will see that as a disadvantage. WhatsApp is not competing with Signal in the marketplace, but it does compete with many apps that are not end-to-end encrypted by default and don't have to make these security trade-offs, like Hangouts, Allo, or Facebook Messenger. We applaud WhatsApp for giving end-to-end encryption to everyone, whether they know it's there or not.

Nevertheless, this is certainly a vulnerability of WhatsApp, and they should give users the choice to opt into more restrictive Signal-like defaults.

But it's inaccurate to the point of irresponsibility to call this behavior a backdoor.

This is a classic security trade-off. Every communication system must make security trade-offs. Perfect security does no good if the resulting tool is so difficult to use that it goes unused. Famously, PGP made few security trade-offs early on, and it appears to be going the way of the dodo as a result.

Ideally, users should be given as much control as possible. But WhatsApp has to set defaults, and their choice is defensible.

Detecting bad behavior more easily with Key Transparency

Coincidentally, Google just announced the launch of its new Key Transparency project. This project embraces a big security trade-off: given that most users will not verify their contacts' key fingerprints and catch attacks before they happen, the project provides a way to build guarantees into messaging protocols that a server's misbehavior will be permanently and publicly visible after the fact. For a messaging application, this means you can audit a log and see exactly which keys the service provider has ever published for your account and when.

This is a very powerful concept and provides additional checks on the situation above: Bob and anyone else with the appropriate permissions will know if his account has been abused to leak the messages that Alice sent to him, without having to verify fingerprints.

It's important to note that transparency does not prevent the server from attacking: it merely ensures that attacks will be visible after the fact to more people, more readily. For a few users, this is not enough, and they should continue to demand more restrictive settings to prevent attacks at the cost of making the tool more difficult to use. But transparency can be a big win as a remedy against mass surveillance of users who won't tolerate any reduction in user experience or reliability for the sake of security.

Adding key transparency will not prevent a user from being attacked, but it will catch a server that's carried out an attack.
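The core idea can be illustrated with a toy append-only key log. This sketch assumes a simple hash chain rather than the Merkle-tree structures a real Key Transparency deployment uses, and every name in it (KeyLog, publish, audit) is hypothetical. The point is that every key the server ever publishes for an account is chained into the log, so a server that swaps in its own key leaves a permanent, auditable record.

```python
# Toy append-only key log. A real Key Transparency system uses Merkle trees
# and signed tree heads; this hash chain only illustrates the auditability
# property: published keys cannot be silently altered or erased.
import hashlib

def entry_hash(prev_hash, account, key_id):
    data = f"{prev_hash}|{account}|{key_id}".encode()
    return hashlib.sha256(data).hexdigest()

class KeyLog:
    def __init__(self):
        self.entries = []             # (account, key ID, chained hash)

    def publish(self, account, key_id):
        prev = self.entries[-1][2] if self.entries else "genesis"
        self.entries.append((account, key_id, entry_hash(prev, account, key_id)))

    def audit(self, account):
        """Recompute the chain; return every key ever published for account."""
        prev, history = "genesis", []
        for acct, key_id, digest in self.entries:
            if entry_hash(prev, acct, key_id) != digest:
                raise ValueError("log has been tampered with")
            if acct == account:
                history.append(key_id)
            prev = digest
        return history

log = KeyLog()
log.publish("bob", "K1")
log.publish("bob", "K_server")   # a malicious substitution is now on the record
log.publish("bob", "K2")
print(log.audit("bob"))          # Bob sees every published key, K_server included
```

Note what this does and does not buy: the substituted key K_server is still published and the attack still happens, but Bob (or any auditor) can later detect it just by reading the log, with no fingerprint verification required.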

We are still a long way from building the perfect usable and secure messaging application, and WhatsApp, like all such applications, has to make tradeoffs. As the secure messaging community continues to work towards the ideal solution, we should not write off the current batch as being backdoored and insecure in their imperfect but earnest attempts.
