The security of the Facebook-owned WhatsApp messaging service may not be as strong as previously believed, with the reported discovery of a backdoor that potentially allows Facebook to see the contents of encrypted messages [updated with statement from WhatsApp].

WhatsApp has used end-to-end encryption on all communications between its users since April last year, with one-on-one messages encrypted by default since 2014. The app uses the Signal protocol from Open Whisper Systems to handle the encryption process, a protocol that Facebook's own Messenger app also employs.

Usually, unique security keys are traded between users to confirm the communications are secure before messages are sent. University of California cryptography and security researcher Tobias Boelter told The Guardian that WhatsApp is capable of forcing apps to generate new encryption keys for offline users.

Once new keys are created, the sender's app can be made to re-encrypt undelivered messages and resend them, allowing the messages to be read by whoever intercepts them.
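As reported, the mechanism amounts to this: the server hands out a new key for an offline recipient, and the sender's app silently re-encrypts and resends any undelivered messages under that key. A toy model of that flow, with entirely hypothetical names and a stand-in XOR "cipher" (the real Signal protocol is vastly more complex than this sketch), might look like:

```python
# Hypothetical illustration only: a toy model of the reported behavior, not
# WhatsApp's or Signal's actual code. "encrypt" here is a stand-in cipher.
def encrypt(key: int, plaintext: str) -> bytes:
    return bytes(b ^ key for b in plaintext.encode())

def decrypt(key: int, ciphertext: bytes) -> str:
    return bytes(b ^ key for b in ciphertext).decode()

class ToyServer:
    def __init__(self):
        self.keys = {}    # recipient -> current key (note: server-controlled)
        self.inbox = {}   # recipient -> delivered ciphertexts

    def current_key(self, recipient):
        return self.keys[recipient]

    def deliver(self, recipient, ciphertext):
        self.inbox.setdefault(recipient, []).append(ciphertext)

class SenderApp:
    def __init__(self, server):
        self.server = server
        self.pending = []  # (recipient, plaintext) awaiting delivery receipts

    def send(self, recipient, plaintext):
        ct = encrypt(self.server.current_key(recipient), plaintext)
        self.server.deliver(recipient, ct)
        self.pending.append((recipient, plaintext))

    def on_key_change(self, recipient, new_key):
        # The crux of the report: undelivered messages are re-encrypted
        # under the server-supplied key and resent automatically, with no
        # verification step blocking the resend.
        for r, pt in self.pending:
            if r == recipient:
                self.server.deliver(recipient, encrypt(new_key, pt))

server = ToyServer()
server.keys["bob"] = 0x2A
alice = SenderApp(server)
alice.send("bob", "meet at noon")

# The server (or whoever controls it) swaps Bob's key while he is "offline"...
server.keys["bob"] = 0x55
alice.on_key_change("bob", 0x55)

# ...and can now read the resent copy with the key it issued itself.
print(decrypt(0x55, server.inbox["bob"][-1]))  # "meet at noon"
```

The security notification discussed below is what would, if enabled, alert the sender that this key swap happened.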

Users are not necessarily aware of the change in security keys: the message sender is notified only if they have enabled encryption warnings in the app's settings, and message recipients are not warned of the changed key by the app at all.

WhatsApp responded to the allegations with the following statement:

"The Guardian posted a story this morning claiming that an intentional design decision in WhatsApp that prevents people from losing millions of messages is a 'backdoor' allowing governments to force WhatsApp to decrypt message streams. This claim is false.

"WhatsApp does not give governments a 'backdoor' into its systems and would fight any government request to create a backdoor. The design decision referenced in the Guardian story prevents millions of messages from being lost, and WhatsApp offers people security notifications to alert them to potential security risks. WhatsApp published a technical white paper on its encryption design, and has been transparent about the government requests it receives, publishing data about those requests in the Facebook Government Requests Report."

Boelter informed Facebook of the backdoor vulnerability in April 2016. Facebook replied that it was aware of the issue, that it was "expected behavior" for the app, and that it was not being worked on by the social network. The Guardian has verified that the backdoor continues to exist in the most recent releases of the app.

A WhatsApp spokesperson responded to the report by pointing to the security notifications option in the settings menu, suggesting the key-change behavior exists as a matter of convenience.

"We know the most common reasons this happens are because someone has switched phones or reinstalled WhatsApp," the representative claims. "This is because in many parts of the world, people frequently change devices and SIM cards. In these situations, we want to make sure people's messages are delivered, not lost in transit."

The potential backdoor is of grave concern to privacy advocates, due to the possibility of governments leveraging it to monitor communications between persons of interest. When asked if the backdoor had been used to access messages, and if it was done on the orders of a government agency, the WhatsApp spokesperson directed the publication to Facebook's Government Requests Report.

The co-director and founder of the Centre for Research into Information, Surveillance, and Privacy called the backdoor "a goldmine for security agencies" and a "huge betrayal of user trust." Open Rights Group executive director Jim Killock said that companies claiming to offer end-to-end encryption "should come clean if it is found to be compromised - whether through deliberately installed backdoors or security flaws."

Governments and security agencies have sought access to encrypted communications in messaging apps for quite a while, even as end-to-end encryption has become more of a reason to use certain apps than ever before.

Apple's iMessage uses end-to-end encryption to protect messages, preventing even Apple from reading their content. Apple has, however, acknowledged that it periodically uploads metadata for a message, including phone numbers, dates, and times, and law enforcement is able to subpoena the company for access to that information.

Facebook might as well be called Bigotbook now anyway. Good thing the real cool people are moving on from that shithole app, and Zuckerberg, the meek ass-kisser, can kiss my ass goodbye. Tolerating propaganda for money makes him about as cynical as Goebbels.

This is a great example of why it's hard to trust anybody other than Apple.

Having said that... even Apple has to cave eventually if the government pushes hard enough. At least with Apple, I trust that they will hold out as long as they can and do their best to convince governments to do the right thing.

The "can't prove a negative" problem will forever keep any 'bulletproof' encryption technology from being 100% trustworthy.

Can any company prove that their product can't be hacked?

Not exactly, but that isn't what most cryptographers try to prove. Instead, they prove that reversing a given cryptogram to the source plaintext would take X bytes of memory, Y computational operations, and Z energy to pass through all the required states. The discipline used to express the relative quality of cryptosystems and the attacks against them is called computational complexity theory.

Given unlimited resources, anything but a one-time pad can be cracked by brute force. The question isn't whether it's possible, only how hard it is.
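A tiny Python demonstration of why the one-time pad is the exception (the messages and key handling here are purely illustrative, not any production scheme): brute-forcing the key space yields every possible plaintext of the right length, so the ciphertext alone tells an attacker nothing.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

plaintext = b"ATTACK AT DAWN"
key = os.urandom(len(plaintext))   # truly one-time, uniformly random key
ciphertext = xor_bytes(plaintext, key)

# A brute-force attacker trying keys finds that EVERY same-length message
# is a "valid" decryption under some key -- here is the key that "decrypts"
# the ciphertext to a decoy message instead of the real one.
decoy = b"RETREAT NOW!!!"
decoy_key = xor_bytes(ciphertext, decoy)

assert xor_bytes(ciphertext, decoy_key) == decoy       # decoy is reachable
assert xor_bytes(ciphertext, key) == plaintext          # so is the truth
```

Because every plaintext is equally consistent with the ciphertext, brute force gives the attacker no way to tell which candidate is the real message.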

There is a fun bit of math related to the "unlimited resources" part. Barring some discovery that upends our understanding of physics, we know that an ideal conventional computer (perfect, molecular-scale transistors with no spontaneous flips and no heat) couldn't even count to 2^256 given all the energy in the solar system (including energy obtained from breaking down all the mass in the solar system). That's just counting out all of the possible keys. It doesn't include trying to use them. A universe-sized ideal conventional computer given another universe worth of energy wouldn't be able to test all of the possible 256 bit keys against a single message. And that's not even getting into the time it would take to try.
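For the curious, that counting argument can be roughed out numerically with Landauer's principle (the thermodynamic minimum of kT·ln 2 joules per irreversible bit operation). This back-of-envelope sketch is my own addition, not from the original comment; it compares the counting cost against the Sun's entire mass-energy, which dominates the solar system's.

```python
import math

BOLTZMANN = 1.380649e-23   # J/K
T_COSMIC = 2.7             # K; run the ideal computer at deep-space temperature

# Landauer's principle: minimum energy for one irreversible bit operation
# at temperature T.
e_per_op = BOLTZMANN * T_COSMIC * math.log(2)   # joules per bit flip

ops = 2**256               # just incrementing through every 256-bit value
total = e_per_op * ops     # joules to count to 2^256

# The Sun holds nearly all the solar system's mass; converting ALL of it
# to energy via E = mc^2 is a generous upper bound on available energy.
sun_mass_energy = 1.989e30 * (2.998e8) ** 2     # joules

print(f"counting to 2^256 costs ~{total:.1e} J")
print(f"the Sun, fully annihilated, yields ~{sun_mass_energy:.1e} J")
print(f"still short by a factor of ~{total / sun_mass_energy:.1e}")
```

Even with these absurdly charitable assumptions, the counting alone overshoots the solar system's total mass-energy by many orders of magnitude, which is the point of the comment above.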

Farcebook is squarely in the pocket of the NaziPubliKKKlanner American Fascist Party. Half of their so-called "community standards" department came out of the Bush maladministration, and their main objective has been to give the Reich-Wing Propaganda Office's paid trolls a protective blanket. Anybody who thinks they won't roll over at the first request for information is deluding themselves.

My, that sounds a lot like fake news.

Probably, but the result is pretty much the same as if it were true. It could be a flock of Nazi-trained butt monkeys with keyboards (the good ol' monkeys-with-typewriters trope, updated) for all I know; same result: making money from bigotry and propaganda.

Also, even if this were fake news, it doesn't matter: if enough moron Facebook users think it's possible, it will be passed on and gain steam. Facebook has made itself an instrument to be used by anyone skilled at propaganda and disinformation; seemingly, the extreme right (and the state actors supporting them) have become quite skilled at that game.

Right - but that is assuming the only option is brute force. Cryptographic methods often end up being cracked via other, previously unrecognized vulnerabilities, which, being initially unrecognized, are not factored into the estimated complexity.

Translation: "We at WhatsApp did not create this backdoor merely to allow spying. Its main purpose, in our heart of hearts, is to make sure your precious messages are not lost while you change out your SIM card 10 times in the middle of a chat. The spying backdoor is simply a byproduct of this wonderful design feature and therefore should not be considered a backdoor at all."