“World’s most secure smartphone” looks like snake oil, experts say

Encrypted phone concept a good one, but secrecy and FUD inspire skepticism.

Do you want a phone that secures all of your data and communications, and can't be hacked by even the savviest of criminals and governments? Of course you do. But if you're a realist, you'd probably say that while strong security can be achieved with discipline, perfect security doesn't exist.

Yet, perfect security was the promise of a company called QSAlpha when it recently sent me an e-mail titled "Un-hackable Superphone to be Unveiled via Kickstarter." QSAlpha is seeking $2.1 million to build a phone it dubs the Quasar IV. Pledges starting at $395 would reserve backers a phone estimated for an April 2014 delivery.

A draft of the Kickstarter page and an accompanying video shared with Ars calls it the "world's most secure smartphone," featuring "unprecedented security with a military-grade encryption." Those kinds of claims—coupled with a lack of technical detail—make security experts who reviewed the Kickstarter page suspicious.

The phone a ninja would use

QSAlpha says it started by asking the question, "If a ninja had a phone, what would it look like?"

"The essence of digital security is the ability to operate in stealth mode, moving about undetected, leaving no trace in the digital world, the same way that a ninja leaves no trace in the real world," CEO and founder Steve Chao said in the video.

QSAlpha describes Chao as "an internationally recognized pioneer in digital security, communications and augmented reality." He said he previously built a now-defunct mobile search engine, Cgogo, which was used by China Mobile.

His encryption technology is called "Quatrix." Besides encrypting phones, QSAlpha says it plans its own app store where developers can distribute applications signed by Quatrix.

A preview of the Kickstarter page was temporarily live but is offline as of this writing. QSAlpha was planning to start the crowdfunding campaign this week but said that, because of a backlog at Kickstarter, it has been delayed until September 12.

Based on Android 4.3 with various modifications to improve security, the Quasar IV uses a hardware-level encryption module. The contents of the phone are encrypted, and your communications with other users of Quasar IV phones are encrypted as well using public and private keys. Android phones can already be encrypted using a standard setting, but Chao says Quasar does it better.

"Instead of having a third-party hosting or giving out the public key, we have managed to create what we call a seed public key matrix that produces all possible keys up to 10^77 of keys for all the users out there," Chao told Ars in a phone interview. The chip contains both the public key matrix and the user's private key. When initiating communications with another user, the Quasar IV uses the recipient's identity to calculate their public key and encrypt the data. Once received, the user decrypts it with their private key.
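What Chao describes is essentially an identity-derived key scheme. Here is a minimal structural sketch of the claimed flow, with a plain hash standing in for whatever unpublished math Quatrix actually uses; the constant name is invented, and this illustrates the architecture only, not a secure scheme:

```python
# Toy model of the claimed Quatrix flow: every device ships with the same
# public "seed matrix," so a sender can compute any recipient's public key
# offline from their identity alone -- no key-server lookup. The SHA-256
# derivation below is a placeholder, NOT real cryptography.
import hashlib

SEED_PUBLIC_MATRIX = b"baked-into-every-chip"  # hypothetical shared constant

def recipient_public_key(identity: str) -> bytes:
    # Any party can derive this; only the matching private key (provisioned
    # onto the recipient's chip at manufacture) could decrypt.
    return hashlib.sha256(SEED_PUBLIC_MATRIX + identity.encode()).digest()

# Two senders independently compute the same key for the same identity:
assert recipient_public_key("bob@quasar") == recipient_public_key("bob@quasar")
```

The appeal of this design is that no third-party server ever has to be queried or trusted at send time; the open question is whether the underlying math holds up, which is exactly what QSAlpha has not published.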

All e-mails, text messages, and VoIP calls can be encrypted, he said. Quasar IV can still communicate with users of other phones and devices, but not in an encrypted manner.

If you don't use a Quasar phone, you're basically doomed, QSAlpha CTO Ben Vaughan said in one of the company's videos. "Every time you visit a website, every time you send an e-mail, and every time you make a phone call, you are exposing yourself to criminal activity," he said.

Experts: Proprietary encryption can't be trusted

While QSAlpha's general descriptions of its technology sound reasonable, claims of being "un-hackable" are bound to draw derision from security experts. After talking to Chao, Ars reached out to two experts to get neutral opinions on Quasar IV, speaking to Steve Thomas, who recently discovered a critical flaw in Cryptocat, and cryptographer Jean-Philippe Aumasson.

Thomas was blunt in his response. "The only thing you need to know about this phone is 'proprietary encryption.' If you aren't aware, this means it's probably broken. Don't trust me, just Google 'proprietary encryption,'" he told Ars in an e-mail. "They also 'absolutely guarantee the security of their personal data.' That sounds like something a cryptographer would never say."

According to the OWASP (Open Web Application Security Project) Guide to Cryptography, "Proprietary encryption algorithms are not to be trusted as they typically rely on ‘security through obscurity’ and not sound mathematics. These algorithms should be avoided if possible."

Aumasson agreed with Thomas, noting that claims like "unprecedented security" without detailed technical explanations do not inspire confidence. Aumasson wrote:

Overall, the tone and content of this [Kickstarter] page suggest that it hasn't involved credible security experts.

That said, the idea of a "crypto phone" with a hardware root of trust is good, and would bring better security compared to software-only solutions (things like Silent Circle).

However at this point "Quasar IV" does not provide sufficient technical details to rigorously assess its security, and the marketing tone and FUD on that page suggest that it's unlikely to be a reliable technology, in my opinion.

For example, they write: "Both algorithms [RSA and Diffie-Hellman] are on the verge of being 'cracked' (proven to be vulnerable to attack) by academic mathematicians, according to researchers who presented at the Black Hat security conference in Las Vegas in August."

This is plain wrong, and shows that the authors do not know what they are talking about.

Don't trust us—trust Fox News

We passed Thomas' and Aumasson's concerns to QSAlpha's spokesperson. The spokesperson told us that "The RSA and Diffie-Hellman example does not come from QSAlpha—it comes from a news story recently published on Fox News."

QSAlpha hasn't yet provided any further details in response to Thomas and Aumasson, although they tell us the company has a "security team that will be publishing technical papers on how Quatrix works." They also pointed to an ad they published in the New York Times last year with a message encrypted using their system. The fact that no one has decrypted this message is supposed to be proof of the Quasar IV's effectiveness.

When I interviewed Chao before this followup exchange, he explained that the company uses proprietary encryption schemes "because we have to secure the safety of all app developers and consumers on the Quasar device."

"We use a combination of ECC and our own algorithms to create those public and private seed key files," he said. "Once a file has been encrypted by AES-256, that key has been re-encrypted again using our own algorithm. It's a combination of several algorithms for trusted identity authentication and also file encryption. This is different from popular services out there such as Samsung Safe, which uses AES-256, Blackberry uses just ECC. We use a combination of such."

Chao said QSAlpha's approach of tying the private key to a chip that also calculates all possible public keys is superior to any system that relies on a third-party server to store and authenticate public keys. With Quatrix, "there's no way to get hold of a third-party server to forge other peoples' identities," he said. On Quasar, all authentication happens offline, he said.

Quatrix's encryption can't be broken by conventional computers, he claimed. "The only possibility is if you use a quantum computer and try to break it from different angles, there's a possibility there," he said. "Using conventional computers, I would say it's impossible. It's very difficult. We are looking for ways to protect the system from quantum computers, and we will likely release a paper [on that topic] at the end of this year."

Aumasson wasn't impressed by the use of ECC and AES-256 in the absence of more details, saying, "More often crypto fails due to a poor combination and usage of good building blocks, than because of 'weak' algorithms." This isn't the first time Aumasson has seen companies "trying to impress with either many algorithms, very long keys, or exotic concepts, and in general it does much worse than a minimalist approach based on a robust and scientific architecture."

“Largest phone manufacturer” on board

QSAlpha claimed on its draft Kickstarter that it's "secured a relationship with the world’s largest phone manufacturer to bring the product to the market."

"We have also fully validated every single component in the Quasar IV and teamed up with a reputable component procurement company to ensure timely delivery to the manufacturer for final assembly," the company wrote.

The specs of the device do sound appealing. They include a quad-core 2.3GHz Snapdragon 800 processor, 3GB of RAM, and 64 or 128GB of "encrypted local storage" along with 128GB of encrypted cloud storage. QSAlpha promises dual rear-facing 13MP cameras for "a new generation of augmented reality experiences," and a front-facing 8MP camera. It will support LTE, CDMA, and GSM. Gorilla Glass 3 will protect a 5" display with 1920x1080 resolution.

That's all good, but if you're looking for a phone that prevents any and all security threats, this probably isn't it—nor is such a thing likely to exist anytime soon.

UPDATE on Sept. 17: QSAlpha's crowdfunding project has gone live (now on Indiegogo rather than Kickstarter), with additional details on its encryption scheme.

Promoted Comments

Assuming a rather small (for PK crypto) 1024 bits per key, that's ~10^80 bits of storage space required. Clearly then, as no central key-servers are used, their chip doesn't use a lookup table. Instead, the public key must be derived from a person's unique identifier. This then means that there is no way to revoke a public-private key pair (other than blacklisting the user ID, but that can't be done offline).
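For the curious, the arithmetic behind that estimate checks out (the 1024-bit key size is the commenter's assumption, not a published Quatrix parameter):

```python
# 10^77 keys at 1024 bits each is on the order of 10^80 bits of storage --
# far beyond any conceivable lookup table, hence the derivation argument.
import math

num_keys = 10**77
bits_per_key = 1024  # assumed key size, small by public-key standards
total_bits = num_keys * bits_per_key
print(round(math.log10(total_bits), 2))  # -> 80.01
```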

Let's assume then that IDs are like phone numbers, and you don't care about old (i.e., revoked) numbers because all you want is other people to be able to contact you. That is, the keys are not being used for authentication. So authentication must be handled by a secondary layer of cryptography. A secondary layer which doesn't appear to exist and thus has to be done by the users (a la the woefully underused digital signatures for e-mail).

Going back to the point about deriving public keys from user IDs, though: this means that the private key must also be derived from the user ID, since the public and private keys must have a known relationship (e.g., the two prime factors of some incredibly large semiprime). Consequently, if this derivation algorithm can be reverse-engineered, all communications ever sent using their system are no longer secure.

Tell me why I should trust a proprietary, untested key derivation algorithm that has none of the features of a well-designed, practical public-key crypto system?

And tell me why I should trust my security to a company that doesn't understand, or even see, these issues?

What's next? "Obviously Al Gore is right on Climate Change, so sell your beachfront property now..." Both of these broad and unsubstantiated generalizations have no business on Ars. Both Al Gore and Fox News present technical issues couched in political terms, so I wouldn't believe any article that cites either as a source without further independent research.

Jon, please keep the politics out of Ars commentary, and focus on the sound and thoughtful analysis of technical issues, and debunking of suspicious claims like "unbreakable encryption" for which Ars is well known.

I think it would behoove you to have a moment of self-examination.

If a non-political comment about the reliability of Fox News' tech reporting sets off your political indignity alarm to the point where you feel compelled to write a rant about said writer's political bias, then your worldview might just be overly politicized.

If a ninja had a phone it wouldn't have a backlit screen, for one thing.

A phone would be a terrible accessory for a ninja. Imagine you're sneaking through the rafters of the evil overlord's lair... and suddenly your phone rings... and it's your mom... wanting to know why you haven't called in weeks... and did you know that Tina, from high school, is single again... you should really call her........

Unless you screw up and your random pad isn't really random. Or your method of archiving/distributing the one-time pad is insecure (you're not going to memorize it, after all). Or you didn't realize that the frameworks you use automatically archive all working files to /tmp, so there's a copy of the unencrypted file there.

That last one is kind of a stretch, but hopefully you get my point. There's any number of errors you can make in cryptography that for any other type of program would never be noticed or, if noticed, would be just mildly embarrassing.
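The pad-reuse failure mode mentioned above is worth spelling out, since it is the canonical example of perfect crypto broken by a usage error. A minimal sketch (toy code, not a real implementation):

```python
# A one-time pad is unbreakable *only* if the pad is truly random and used
# once. Reusing it lets an eavesdropper XOR two ciphertexts together and
# cancel the pad entirely, leaking the XOR of the two plaintexts.
import secrets

def xor(data: bytes, pad: bytes) -> bytes:
    return bytes(d ^ p for d, p in zip(data, pad))

pad = secrets.token_bytes(16)
c1 = xor(b"attack at dawn!!", pad)   # first message, encrypted correctly
c2 = xor(b"retreat at noon!", pad)   # second message reuses the same pad

# The attacker never sees the pad, yet recovers plaintext1 XOR plaintext2,
# which crib-dragging techniques can then unravel into both messages:
leak = xor(c1, c2)
assert leak == xor(b"attack at dawn!!", b"retreat at noon!")
```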

Usually when the sales team at my company made such bold, sweeping claims with no actual backing data, it meant the company didn't even really have the product they were supposedly selling. Just saying...

Even if they were able to deliver on this claim, it is the wrong way to go about it. It sounds OK on paper, and people are obviously upset about PRISM, but no one is going to drop extra money for a feature that only works with a specific brand/model of phone. Also, the specs are way too solid for $400, which reeks of another Chinese cheapo (no Play Store, etc).

It is not a bad idea - just needs to be implemented by a company such as Google or Apple to have enough reach to make it viable.

By definition, there is no such thing as an "unhackable" phone. Making something secure means putting up barriers around components that can be attacked, and since there is no such thing as complete software security, the phone must ultimately be hackable.

d) if the 'seed' is based on a user's identity, doesn't that mean the range of possible keys is many many orders of magnitude less than 10^77?

e) is the hardware on this phone secure?

f) who do the makers believe will lack the ability to hack this phone? do they include nation-state actors such as the NSA, FAPSI (Russia) etc?

g) what exactly is being encrypted? and is all the software down to firmware level signed exclusively by the makers?

h) they claim you can't be followed, but - as recent Ars Technica stories show - all cellphones leave records of when and where they are registered to cell towers, and may be individually tracked due to the radio emissions they continually make

i) using the word 'ninja' in marketing your vaporware hardly imbues your company with credibility


I had a secure smartphone once. It was called a PalmPilot. It couldn't take calls, it couldn't access wifi. But, as long as I didn't sync it, it couldn't be compromised. Unfortunately, that meant losing all my notes, addresses and calendar appointments every time the AAA batteries ran out. The price of freedom from NSA snooping and crackers I guess.

ID-based encryption (https://en.wikipedia.org/wiki/ID-based_encryption):

"... Identity-based systems allow any party to generate a public key from a known identity value such as an ASCII string. A trusted third party, called the Private Key Generator (PKG), generates the corresponding private keys. To operate, the PKG first publishes a master public key, and retains the corresponding master private key (referred to as master key). Given the master public key, any party can compute a public key corresponding to the identity ID by combining the master public key with the identity value. To obtain a corresponding private key, the party authorized to use the identity ID contacts the PKG, which uses the master private key to generate the private key for identity ID.

As a result, parties may encrypt messages (or verify signatures) with no prior distribution of keys between individual participants. This is extremely useful in cases where pre-distribution of authenticated keys is inconvenient or infeasible due to technical restraints. However, to decrypt or sign messages, the authorized user must obtain the appropriate private key from the PKG. A caveat of this approach is that the PKG must be highly trusted, as it is capable of generating any user's private key and may therefore decrypt (or sign) messages without authorization. ..."

So if I understand it correctly, the Private Key Generator (PKG) is going to be a vaporware company exposed to US legislation, and it will rely on obscurity and FUD to build the most secure smartphone in the world. What should I say? Thanks, but no thanks?
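The caveat at the end of that excerpt is the crux: a PKG is a built-in key escrow. A toy illustration of the trust problem, with HMAC standing in for the real pairing-based math (the names and scheme here are illustrative, not Quatrix's actual design):

```python
# Whoever holds the PKG's master secret can mint *any* user's private key,
# so the PKG (or anyone who compromises or compels it) can decrypt and sign
# on behalf of every user. HMAC is a stand-in for real IBE math.
import hashlib
import hmac
import secrets

master_secret = secrets.token_bytes(32)  # held by the PKG / phone vendor

def pkg_extract(identity: str) -> bytes:
    """Derive the private key for any identity from the master secret."""
    return hmac.new(master_secret, identity.encode(), hashlib.sha256).digest()

alice_key = pkg_extract("alice@example.com")   # issued to Alice
shadow_key = pkg_extract("alice@example.com")  # ...or minted by anyone with the master secret
assert alice_key == shadow_key  # built-in escrow: the PKG can impersonate Alice
```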

Presumably there's more to pay later, though? Otherwise that's ridiculously cheap for the specs they promise. It's one thing for a massive company like Google to sell phones at or near the cost of the parts, but I can't see that working for a start-up.

Having access to the "source" isn't that big a deal (though debug symbols are useful); what you want to validate is the machine code, and optimising compilers often produce something quite different from the "source". In fact, FIPS certification had to be changed to deal with open source projects, as until then they only accepted compiled binaries. Also, you are assuming somebody who knows what they are doing has actually checked it before you have used it.

Assuming a rather small (for PK crypto) 1024 bits per key, that's ~10^80 bits of storage space required. Clearly then, as no central key-servers are used, their chip doesn't use a lookup table. Instead, the public key must be derived from a person's unique identifier. This then means that there is no way to revoke a public-private key pair (other than blacklisting the user ID, but that can't be done offline).

As "zero-knowledgeness" says above, it sounds like ID-based encryption, so yes, the basic system would have no revocation method. But really, such systems should use sealed and tamper-resistant hardware. I'm not sure that's ever going to be viable for a phone that could be lost on a train with only a screen lock to protect it, though.

Quote:

Let's assume then that IDs are like phone numbers, and you don't care about old (i.e., revoked) numbers because all you want is other people to be able to contact you. That is, the keys are not being used for authentication. So authentication must be handled by a secondary layer of cryptography. A secondary layer which doesn't appear to exist and thus has to be done by the users (a la the woefully underused digital signatures for e-mail).

No, the point of ID-based encryption is that the sender doesn't need to authenticate keys anymore: joe.blogs@madeup.com is both your address and your public key, and the problem of authentication is moved between Mr. Blogs and his ID authority.

Quote:

Going back to the point about deriving public keys from user IDs, though: this means that the private key must also be derived from the user ID, since the public and private keys must have a known relationship (e.g., the two prime factors of some incredibly large semiprime). Consequently, if this derivation algorithm can be reverse-engineered, all communications ever sent using their system are no longer secure.

Tell me why I should trust a proprietary, untested key derivation algorithm that has none of the features of a well-designed, practical public-key crypto system?

And tell me why I should trust my security to a company that doesn't understand, or even see, these issues?

There are several public-key-based ID schemes, and reverse engineering is not an issue for them (nor should it be for any secure crypto). The question is whether these Muppets have done it correctly. Probably not...

d) if the 'seed' is based on a user's identity, doesn't that mean the range of possible keys is many many orders of magnitude less than 10^77?

e) is the hardware on this phone secure?

f) who do the makers believe will lack the ability to hack this phone? do they include nation-state actors such as the NSA, FAPSI (Russia) etc?

g) what exactly is being encrypted? and is all the software down to firmware level signed exclusively by the makers?

h) they claim you can't be followed, but - as recent Ars Technica stories show - all cellphones leave records of when and where they are registered to cell towers, and may be individually tracked due to the radio emissions they continually make

i) using the word 'ninja' in marketing your vaporware hardly imbues your company with credibility

a) Long. If you could calculate a trillion (10^12) keys a second, it would still take 10^65 seconds, which is about 3 * 10^57 years, vastly longer than the age of the universe (~1.4*10^10 years).

b) That is exactly the issue the experts are worried about

c) Yes. The public/private key exchange is expensive, but once you have a symmetric AES key, the bulk encryption can be done in a specialized hardware chip, which is very fast. If you have to do it on the CPU it is slower, but still, AES is designed to be fast.

d) Really depends on the ID generation algorithm

e, f, g, h) Anybody's guess. They claim it is all right, but we have only their word for it.
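The numbers in (a) are easy to verify, assuming, as the commenter does, a rate of 10^12 keys tried per second:

```python
# Exhausting a 10^77 keyspace at a trillion keys per second:
keys = 10**77
rate = 10**12                         # keys tried per second (assumed)
seconds = keys // rate                # 10^65 seconds
years = seconds // (365 * 24 * 3600)  # roughly 3.2 * 10^57 years
print(len(str(years)))                # -> 58 (a 58-digit number of years)
```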

Additionally, this pretty much is already possible with open-source (or nearly) software without any funny business.

1) Get a Nexus or other rootable phone. Reflash it if you don't trust Google much. Root it (for step 5 below).
2) Set it up with an account not related to your personal account.
3) Encrypt the storage on the phone (a default feature in Android).
4) Install all the apps from The Guardian Project, and also TextSecure and PGP, and use them.
5) After rooting, you can make all traffic go through Tor/Orbot.
6) Turn off all location sharing and data syncing with Google, if you don't trust them at all.

Of course, the carriers still know where you are, but that is inevitable with a cell phone.