Posted
by
samzenpus
on Wednesday September 09, 2015 @05:50PM
from the in-the-future dept.

Tokolosh writes: An article in Scientific American discusses the actions needed to address the looming advent of quantum computing and its ability to crack current encryption schemes. Interesting tidbits from the article: "'I'm genuinely worried we're not going to be ready in time,' says Michele Mosca, co-founder of the Institute for Quantum Computing (IQC) at the University of Waterloo..." and "Intelligence agencies have also taken notice. On August 11, the US National Security Agency (NSA) revealed its intention to transition to quantum-resistant protocols when it released security recommendations to its vendors and clients." Another concern is "intercept now, decrypt later", which presumably refers to the giant facility in Utah. In related news, an anonymous reader points out that the NSA has updated a page on its website, announcing plans to shift the encryption of government and military data from current cryptographic schemes to new ones that can resist an attack by quantum computers.

One-Time Pads, implemented and used correctly, are provably secure. That can never change, not even given infinitely-fast computers (which Quantum Computers aren't), because the proof demonstrates that the ciphertext gives you NO information about the plaintext. No amount of computation can extract information from an empty set.
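As a sketch of why the proof works: XOR with a truly random, message-length, never-reused pad is all there is to it. A minimal Python illustration (the function names are mine, not from any particular library):

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    # The pad must be truly random, at least as long as the message,
    # and never reused -- those are the conditions the proof requires.
    assert len(pad) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, pad))

# Decryption is the exact same XOR operation.
otp_decrypt = otp_encrypt

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))  # shared in advance over a secure channel
ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message

# Perfect secrecy, informally: for ANY candidate plaintext of the same
# length, there exists a pad producing this exact ciphertext, so the
# ciphertext alone tells you nothing about which message was sent.
```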

However, one-time pads are also pointless. Oh, there are some very isolated contexts in which they can be usefully applied, but they're useless for nearly everything we use cryptography for today. The one-time pad scheme requires securely distributing the pad, which must be as large as the message to be sent. If you have some channel you can use to distribute the pad securely, why not just use that channel to send the message?

Symmetric cryptography (e.g. AES) improves on the one-time pad by reducing the size of the key material from as large as the message (possibly many gigabytes) to something very small. Say, 16 bytes. So you give up provable perfect secrecy in exchange for only needing a way to securely distribute 16 bytes.

Asymmetric cryptography (e.g. RSA) improves on symmetric cryptography by eliminating the need for every pair of potential communications endpoints to securely exchange symmetric keys. Instead, every potential recipient can publish its "public" key to every potential sender. This distribution of public keys does need to be secure, but the security requirement is weaker. Symmetric keys need to be kept secret; public keys do not. Instead we only need to ensure their integrity: that the potential sender got the actual public key of the potential recipient.
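A toy numeric sketch of the idea, using textbook RSA with absurdly small primes (illustration only, not remotely secure -- real keys are thousands of bits and require proper padding):

```python
# Textbook RSA key generation with tiny primes.
p, q = 61, 53
n = p * q                       # public modulus: 3233
phi = (p - 1) * (q - 1)         # 3120
e = 17                          # public exponent, coprime with phi
d = pow(e, -1, phi)             # private exponent: modular inverse of e mod phi

msg = 42
ciphertext = pow(msg, e, n)         # anyone holding the public key (n, e) can encrypt
decrypted = pow(ciphertext, d, n)   # only the private-key holder can decrypt
assert decrypted == msg
```

The security rests entirely on the fact that recovering d from (n, e) requires knowing phi, which in turn requires factoring n.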

Asymmetric cryptography can be used to further reduce the scope of this problem by using its ability to digitally sign certificates, proving the legitimacy of a given public key assuming (a) the recipient of the certificate has securely received the public signing key and (b) the private signing key is not compromised and is only used to sign legitimate public keys. Thus, the key distribution problem is reduced to a "bootstrapping" problem; we just have to get Certificate Authority key(s) securely. In practice we do this by distributing the bootstrapping keys in system software.

However, asymmetric cryptography has a lot of issues compared to symmetric cryptography. One of the largest is that the public/private key pairs must have some particular mathematical relationship with each other and with every message encrypted or signed. Thus, asymmetric cryptography is deeply dependent on the existence of "one-way" mathematical operations: operations that can be efficiently computed in the forward direction but are intractable in the reverse direction. We don't actually know that any such operations exist, though we have a bunch that we know how to compute efficiently in one direction but not the other. These one-way operations tend to be touchy, though; small errors in constructing messages and performing computations can compromise the security (for example, consider the critical importance of correct padding of RSA plaintexts before encryption; do it wrong and you can potentially hand the adversary your private key).

There are also lots of practical issues with asymmetric cryptography. It's relatively slow and expensive (some techniques more than others), and that opens it up to more side channel attacks and other practical attacks. Then there are issues with the CA system; it's awesome that we can reduce the key distribution problem from one that requires secure pairwise exchanges between billions of devices to broad distribution of a few hundred bootstrapping keys... but that means those bootstrapping keys are incredibly important and every link in the bootstrapping chain becomes an extraordinarily tempting target for extra-cryptographic compromise (e.g. "rubber hose cryptanalysis").

"However, one-time pads are also pointless. Oh, there are some very isolated contexts in which they can be usefully applied, but they're useless for nearly everything we use cryptography for today. The one-time pad scheme requires securely distributing the pad, which must be as large as the message to be sent. If you have some channel you can use to distribute the pad securely, why not just use that channel to send the message?"

One time pads are definitely not as practical as the other methods. However, the issue of pad exchange is only as much of a problem as it is for symmetric keys. Once the key or pad is beyond the realm of easy memorization, it doesn't really matter whether it's 128 bits or many gigs; they both fit easily on a micro SD card.

No, they're not equally difficult. There are many cases where it's relatively easy to establish a small secure channel but not a large one. Using asymmetric crypto is an obvious one, and one that we use all the time. Another is pre-shared keys, such as keys embedded in devices at factories, or keys entered into different devices, possibly directly or by using a secure hash to stretch a shorter password. Or consider the case of a device with a small amount of secure storage and a large amount of unsecured or untrusted storage: the small secure area can hold a symmetric key, but never a message-sized pad.
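The password-stretching idea mentioned above can be sketched with Python's standard-library PBKDF2; the iteration count and key length here are illustrative, not a recommendation:

```python
import hashlib
import secrets

password = b"correct horse battery staple"
salt = secrets.token_bytes(16)   # stored alongside the key; need not be secret

# Derive a 256-bit symmetric key from a human-memorable password.
# The high iteration count deliberately slows down offline guessing.
key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations=600_000, dklen=32)
assert len(key) == 32

# Any device given the same password and salt reproduces the same key,
# so only the short password needs to travel over the "small" channel.
key2 = hashlib.pbkdf2_hmac("sha256", password, salt, iterations=600_000, dklen=32)
assert key == key2
```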

... "radioactive" and "nuclear" were the popular buzzwords. In general, people knew next to nothing about these things, and thus they made highly interesting topics for speculative fiction. Today, the new buzzword is "quantum." The world didn't end in the 1950s and the world won't end now. Technology will grow, people will learn it, and we'll move on with the times. Nothing to see here.

1) No one is predicting the end of the world
2) People are predicting the end of traditional cryptography

This is important, because it will take a very long time to upgrade consumer-level COTS hardware to encryption that will withstand a quantum attack. Until then, everything is effectively in the clear. Even if we did it all in one week, that's a week's worth of everything in the clear.

I know people will post about how TLS is already broken, and garbage about the NSA already being a

It would stand to reason that out of this will also come quantum encryption, which is not crackable by quantum computing.

Yin and Yang are restored.

Yes, but the problem is the "record now" and "decrypt later" concept. To be secure, you have to know how long the data you are passing can be expected to remain obscured. How long does it take to decrypt it by doing a brute force - try every possible key - approach? If the data you are protecting goes stale in a year, you need to be assured that a persistent attacker won't decrypt your transmission in that time. For a lot of data being passed around, the stale dates are like 30 years in the future, which is a serious problem.

If advances in quantum computing happen and we get the huge jump in processing power they expect, what's currently a brute force time of years can become days or hours. This makes the recorded stuff from 5 years ago very valuable to the spooks who can now decrypt it overnight. And scares the daylights out of the folks who need that data to stay obscured for 30 years.

So, yes, future stuff will be harder to brute force because the same advances in computing power that make brute forcing possible faster will make encryption faster too, but having a treasure trove of easy to decrypt stuff recorded is what is feared.

It doesn't even need to be (and probably won't be) a generalized speed-up in rate of computation. Most algorithms used for cryptography NOW are susceptible to a speed-up in factorization. And that's one thing quantum computers are practically guaranteed to speed up. It's less clear that they will provide much general speed-up... and there's some evidence that that's unlikely until algorithms are redesigned, in which case you'd get better results on cheaper hardware if you just designed it for parallel processing.

I still like to take Grover's algorithm as an example of a somewhat mind-bending quantum computing application. Its typical use case is the search problem, i.e. searching for a particular value inside some form of storage like a database, and it can do so in O(sqrt(N)), which is quite simply impossible in classical computing, since you have to visit every entry at least once (hence O(N)) to perform a full unstructured search. Now, Grover's algorithm is probabilistic in nature, so you may need to repeat it to determine the correct value, but value verification for such problems is generally simple (since the problems are generally at most NP-complete, and NP-completeness implies a polynomial-time verification algorithm given a candidate solution), and the expected number of repetitions is constant.

Of course, you can note that parallelization legitimately can compete with Grover's algorithm for very large datasets and very large thread counts, but on a purely theoretical level I still find it fascinating.

Instead of imagining loading an explicit data set (which would indeed be pretty pointless), imagine an algorithm (suitable for quantum execution) outputting the data set. E.g. using it to speed up AES cracking (hence the need for 256-bit AES to get just ~128-bit security) or chess-playing algorithms (at least, if suitable quantum-compatible chess heuristics can be found).
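For intuition, Grover's amplitude amplification can be simulated classically with a plain state vector (which takes exponential memory, hence toy sizes only). A minimal sketch, assuming a 4-qubit search space:

```python
import math

def grover_search(n_qubits: int, marked: int) -> list:
    """Simulate Grover's algorithm; return the final probability of each item."""
    N = 2 ** n_qubits
    amps = [1 / math.sqrt(N)] * N                 # uniform superposition
    iterations = math.floor(math.pi / 4 * math.sqrt(N))  # optimal ~ (pi/4)*sqrt(N)
    for _ in range(iterations):
        amps[marked] = -amps[marked]              # oracle: flip sign of the marked item
        mean = sum(amps) / N                      # diffusion: inversion about the mean
        amps = [2 * mean - a for a in amps]
    return [a * a for a in amps]

probs = grover_search(4, marked=11)               # N = 16, only 3 iterations
assert probs[11] > 0.9    # the marked item dominates after O(sqrt(N)) steps
```

Classically, finding index 11 among 16 unstructured entries needs on average 8 probes; the simulation shows the O(sqrt(N)) iteration count that motivates doubling symmetric key sizes.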

It is my sincerest belief that any people who think that any data that is relevant today genuinely needs to stay obscured for 30 years have sticks up their asses that need to be removed as soon as possible in the best interests of themselves and everyone they should ever meet.

It is my sincerest belief that any people who think that any data that is relevant today genuinely needs to stay obscured for 30 years have sticks up their asses that need to be removed as soon as possible in the best interests of themselves and everyone they should ever meet.

Then I think you are burying your head in the sand. I can see legitimate reasons why things will need to remain obscured for more than a lifetime, especially in specific cases where national security and defense secrets are involved. How old are the LGM-30 Minuteman missiles? The last one was purchased in 1970 or so, which makes them 40+ years old, and I'm pretty sure you don't want to publish their specifications on the internet for all to see even now, given we still depend on them for defense purposes.

Or: they've expressed ideas or committed acts which are currently legal, but may be illegal in 30 years. You never know what a government 30 years from now is going to make illegal: PARTICULARLY if they are handed a cheap, quick, and convenient way to prosecute it.

If advances in quantum computing happen and we get the huge jump in processing power they expect, what's currently a brute force time of years can become days or hours. This makes the recorded stuff from 5 years ago very valuable to the spooks who can now decrypt it overnight. And scares the daylights out of the folks who need that data to stay obscured for 30 years.

Problem with this is there has to be a rational basis to support beliefs. Responding out of fear alone without any capability to characterize risks associated with (in)action is simply a waste of time.

There is value in preparing for unforeseeable events. Having more cipher suites at the ready to easily allow you to jump ship if a key exchange, cipher, hash were compromised is useful.

Schemes such as layering complete systems such that if ECC is compromised your message is still protected by RSA... or if AE

Given sufficiently powerful quantum computers, RSA is not secure at all. RSA depends on the fact that it's effectively impossible to factor the product of two large primes, but there's a quantum computing algorithm (Shor's) to factor such numbers efficiently. It isn't a baseless fear. I don't know about ECC.
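A toy sketch of why factoring breaks RSA: once the public modulus n is factored, the private exponent falls out immediately. Trial division stands in here for Shor's algorithm, which only works at this toy scale:

```python
def factor(n: int):
    """Trial division -- a stand-in for Shor's algorithm at toy sizes."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no nontrivial factor found")

n, e = 3233, 17                  # attacker sees only the public key (n, e)
p, q = factor(n)                 # the step a quantum computer makes efficient
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, reconstructed from the factors
assert pow(pow(42, e, n), d, n) == 42   # the recovered d decrypts messages
```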

AES-256 is not crackable by brute force using quantum computers. I don't think any advance in computers will do it in. It's possible there is some way to break it, but it seems unlikely.

Modern key lengths are generally not vulnerable to brute force. The old DES had a 56-bit key (there were eight more bits, but those were parity bits derived from the 56), and that was, eventually, subject to brute force. I wouldn't trust a 64-bit key. A 128-bit key cannot be brute-forced using just the resources in this solar system. A quantum computer can, perhaps, halve the effective size of the key, so I'd recommend a 256-bit key to be safe. That cannot be brute-forced using any computer we can plausibly imagine.
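The back-of-envelope arithmetic behind those claims, assuming a (generous, hypothetical) classical guess rate of 10^18 keys per second:

```python
# Years to search half a keyspace at 10**18 guesses per second.
# Grover's algorithm roughly halves the effective key length, so a
# 128-bit key behaves like a 64-bit one against a quantum attacker.
GUESSES_PER_SEC = 10 ** 18
SECONDS_PER_YEAR = 365 * 24 * 3600

def years_to_break(key_bits: int) -> float:
    return (2 ** (key_bits - 1)) / GUESSES_PER_SEC / SECONDS_PER_YEAR

assert years_to_break(56) < 1           # DES: gone almost instantly at this rate
assert years_to_break(128) > 1e12       # far beyond the age of the universe
assert years_to_break(128 // 2) < 1     # ...but Grover-halved 128-bit ~ 64-bit
assert years_to_break(256 // 2) > 1e12  # 256-bit keys survive the halving
```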

This is exactly the sort of situation where the NSA could be the most useful/helpful to us - but no one in tech will trust them to provide actually secure encryption protocols because of their elliptic curve shenanigans.

I'll admit that I *would* need to check back through my records to be explicit. I'm not even sure it wasn't DES. This isn't something I track carefully, and so I don't tend to remember details, but only highlights. This http://www.rt.com/usa/rsa-nsa-... [rt.com] could be the story I'm not-quite remembering, or it could have been one of the others.

So I don't know which story I'm remembering, but a simple google search for "NSA cryptographic key weakening" turns up a bunch. In only some of them does the NSA appear

I suppose it was bound to come to this, but even if they intercepted petabytes of data, how are they going to decrypt it uber fast when storage media is slow even by today's computers' standards?

It would be an incredibly fast process, but first you have to find the needle in the hay field and then splice it open, and whilst the latter would be solved by quantum computers, the former is still in the works.

RSA factorization using today's quantum registers is worse than useless; as of last year, the largest number factored was 56,153.
Quantum decoherence sets in faster as the number of particles increases, and defeating RSA would require huge quantum registers.
The only question: is a quantum machine that can perform useful computation even possible?

> RSA factorization using today's quantum registers is worse than useless; as of last year, the largest number factored was 56,153. Quantum decoherence sets in faster as the number of particles increases, and defeating RSA would require huge quantum registers. The only question: is a quantum machine that can perform useful computation even possible?

The trouble here is the gulf between what is claimed: Quantum Effect Computers
And what is delivered: Atomic-Level Noise Computers

So if the key size is large enough that noise times the number of atoms in the planet won't solve it in any short interval of time, then even one of these pseudo-quantum computers the size of the planet won't break the keys.

It's looking very likely that there are much simpler methods of implementing quantum computing algorithms using optical means and the NSA might be further along the path of that research than academia.

They have shown that the exact same room-temperature nuclear magnetic resonance (NMR) experiment used to factor 143 can actually factor an entire class of numbers, although this was not known until now. Because this computation, which is based on a minimization algorithm involving 4 qubits, does not require prior knowledge of the answer, it outperforms all implementations of Shor's algorithm to date, which do require prior knowledge of the answer. Expanding on this method, the researchers also theoretically show how the same minimization algorithm can be used to factor even larger numbers, such as 291,311, with only 6 qubits.

On top of this, in the same paper the researchers demonstrated the first quantum factorization of a "triprime," which is the product of three prime numbers. Here, the researchers used a 3-qubit factorization method to factor the triprime 175, which has the factors 5, 5, and 7.

The previous record was 143, and they did 56,153. And it works on *classes* of numbers, and moves into interesting new triprime territory.
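Those specific factorizations are easy to check classically; a quick trial-division sketch:

```python
def prime_factors(n: int) -> list:
    """Full prime factorization by trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

assert prime_factors(143) == [11, 13]       # the earlier NMR factoring record
assert prime_factors(56153) == [233, 241]   # the 4-qubit minimization result
assert prime_factors(291311) == [523, 557]  # the theoretical 6-qubit target
assert prime_factors(175) == [5, 5, 7]      # the factored "triprime"
```

Verifying a claimed factorization takes microseconds classically; it's only finding the factors of cryptographically sized numbers that's hard.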

That leads me to believe your comment is dildos. This technique vastly improves on previous methods, and the research is ongoing. Quantum computing is really just beginning (okay, maybe it's 20 years old, or 50), but the progress made in 2 years is quite remarkable.

I'm currently assuming that no existing hardware will be safe in 10 years. If I'm wrong, no harm done.

My objection is to your assertion that acting as if no current cryptography will be safe in 10 years has no cost. If you really act that way, it will cost you quite a lot. You will be unable to take actions you would otherwise be able to take, because, for instance, you have to assume anything you transmit over the Internet, even if you encrypt it, can be intercepted and later read.

Separately, you're also wrong that there is cause to act in such a way, because quantum an

Right now we have machines with a few qubits, analogous to a 1960 IC. It wouldn't surprise me too much if, in six years, we had machines with 2300 qubits. Maybe it'll be called the Intel Q4004. :)

In six years, assuming anyone is still willing to waste their time and money, there will very likely be "topological" quantum computers with 2300 qubits, and they will be just as useless as desktop computers at cracking RSA. Real machines with 2300 entangled qubits would be able to perform operations that would not be even remotely possible in the lifetimes of countless trillions of universes, even if every atom in every universe were a transistor operating at a trillion trillion trillion THz. It's complete bullshit.

As you probably know, for decades after, transistor counts doubled every TWO years. If the qubit count doubles every two years, that's going to be a problem for cryptography.

Moore's law is a reflection of market forces. Doubling was enabled by halving costs, which was driven by market pressure to reduce costs, which let people afford more capability for the same money, which fueled a never-ending feedback loop.

There is no analogue to this for QC, and BTW, the number of entangled qubits is NOT doubling every year.

We don't know if that's possible, but we didn't know the 386 was possible in 1970 either.

Nonsense. That was then, and mostly continues today to be, an engineering problem. Nobody has any idea how to scale out QC without being drowned out by noise.

I don't think it's quite that simple. While my intuition tells me that quantum error correction can't work once the number of states becomes too large, when I tried to prove that mathematically, the results showed that I was wrong. (That is, they showed that the *particular* argument I was attempting to use was wrong, not that QEC can definitely work.)

I'm also doubtful that quantum mechanics is really linear at that sort of scale - historically, linear theories have always proved to be only approximations.

It is probably possible. Importantly, it's something that is waiting for an engineering solution- the physics checks out, Shor's algo checks out. When you are that close to losing an encryption scheme, it's past time to move on.

I think one of the big problems I have with all this is the idea that "encryption in general" is weak to Quantum Computing. In fact, we really don't know that. We know that a few public-key things, factoring being the big one, aren't really that hard if you have a quantum computer.

My take is that such a machine may well be impossible in any practical sense. Even if it is possible, judging from the speed (or rather, extreme slowness) of the progress made in Quantum Computing, it may well take several hundred years to get there. And when we are there, we can simply make the numbers larger. Currently, RSA is recommended to use 4096 bits for new designs; that is a factor of 256 larger than what can be "cracked" by quantum computers. Remember that people have been working on this

Quantum computers capable of cracking the larger key sizes that we have now will never exist, and thus this concern is pointless. People who think otherwise aren't aware of the physics involved, and of how the only people left researching this stuff don't believe it will ever work the way people want; they just keep going because it is their bread and butter now. Gotta feed the family.

And we'll never fly heavier-than-air machines and we'll never split the atom and we'll never reach the Moon and and and... The thing with predicting the future is that more often than not you put your foot into your mouth (which is in itself a prediction of the future and therefore...).

And we'll never find the Philosopher's Stone.
And astrology won't tell you your future.
And bloodletting isn't going to cure your plague.
And we'll never discover the Fountain of Youth.
And she's not a witch.
And your child isn't a changeling.
And the Northwest Passage doesn't exist. (Maybe it will one day if we keep turning up the planet's thermostat, but it certainly didn't when we were looking for it.)

I completely agree. I know some people that worked in that area 20 years ago, and they knew back then that Quantum Computing is extremely unlikely to ever scale without exceptionally huge penalties and hence is useless as actual computing machinery. The physics makes larger machines exponentially more difficult to build. Digital computers can sub-divide problems into smaller problems. That makes them scale well, but not even they scale linearly with effort; just look at the absence of increased CPU speeds over the past decade.

Also commercially viable flying cars, actually intelligent machines, a home-helper robot for everybody, a viable self-sufficient Lunar or Martian colony, faster-than-light travel, and other complete drivel that those who misuse science and technology as a surrogate for religion are willing to believe in.

The algorithms at risk to quantum computing attacks (RSA, etc.) are essentially used just for key exchange. Unless you have an offline channel, you need these to communicate your one-time pad. Besides which, when using a one-time pad, the parties have to store it in at least two places before use, greatly increasing the time that these precious bits are at risk of being leaked or stolen.

Once key exchange has been accomplished, modern protocols rely on block or stream ciphers, which are not known to be vulnerable to quantum attack.

Unless it has to happen in the real universe. In fantasy-land, this is certainly true, you just need a fairy to wave her wand over it.

Seriously, for short-enough messages, even the venerable Enigma is completely secure. (I remember something like 4000 bits of encrypted message per key-setting being required in order for a break becoming feasible, given unlimited computing power.)

But nitwits will repeat any such bullshit, because they mistake science for being about opinions and beliefs.

They are not talking about breaking AES or Twofish encryption. They are worried about breaking the key agreement. Currently, when a communication channel is set up, the two parties agree on a key for encrypting the communication. This is normally done by Diffie-Hellman (D-H) key agreement, or one party could select a key and then give it to the other party using the other party's RSA public key. Both RSA and D-H are based on the difficulty of solving math problems that quantum computing should be able to easily solve.
Your AES-encrypted file on your hard disk is safe. What the NSA is doing is storing your conversations and the key agreement. Years from now they might crack the key agreement and then decrypt your communication.
Things like Elliptic-curve Diffie-Hellman are secure. So your BlackBerry communications will still be safe; not sure who else widely uses EC (your ZigBee electric meter in the USA and UK).
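The D-H key agreement under discussion can be sketched with textbook parameters (p = 23, g = 5 are toy values; real deployments use 2048-bit-plus groups or elliptic curves):

```python
import secrets

# Textbook finite-field Diffie-Hellman with toy parameters.
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1    # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1    # Bob's secret exponent

A = pow(g, a, p)                    # sent in the clear
B = pow(g, b, p)                    # sent in the clear

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob   # both arrive at the same symmetric key

# An eavesdropper records only (p, g, A, B); recovering a from A = g^a mod p
# is the discrete-log problem -- exactly what Shor's algorithm also solves,
# which is why recorded key agreements are the "decrypt later" target.
```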

Thank you for the links and for correcting me. Looks like I have some reading to do this morning and some looking into how secure Elliptic Curves over GF(2^n) are. The paper describes an attack on curves over GF(P) while most implementations I've seen in production devices use GF(2^n).

Both RSA and D-H are based on the difficulty of solving math problems that quantum computing should be able to easily solve....
Things like Elliptic-curve Diffie-Hellman are secure.

My impression, after studying some of the actual math, is that nothing is proved secure. So far, mathematicians haven't found any revolutionary shortcuts to such problems -- we are only safe until someone finds them. For example, as the sibling post mentions, there are EC versions of Shor's algorithm. Besides, there isn't anything magical about EC itself, it's like a different number system from a mathematician's point of view (field structure).

Every decade has its futuristic buzzword.
In the '60s it was "flying cars".
In the '80s it was "artificial intelligence".
Now it's "quantum computing".
Don't you find it relevant that the people claiming quantum computing is just around the corner and will break any known encryption are the same people who are trying to build quantum computers (and are seeking funds to do it)?

I'm doing mostly security work these days, and really the situation is very bad.

Military and government computers are most vulnerable to various non-algorithmic vulnerabilities in the hardware/firmware, which get little scrutiny and nary an update. Some of these are likely backdoors that the NSA itself probably paid (carrot & stick) to have installed. Meanwhile they have buildings full of people who are paid to do nothing but find breaks in our infrastructure, and tell nobody about them, courtesy of t

The government wants to keep their data safely encrypted? But I thought they were saying that only bad people with bad things to hide need to use encryption? Are they admitting to being bad guys doing bad things?

Already "they" are decrypting everything, or so you should assume. If "they" had working effective quantum computers they wouldn't tell us. And all the stuff they do and say now would be the smokescreen hiding the quantum computers. In WW2 Churchill had to let the German subs intercept the convoys despite the secretly decrypted Enigma messages telling him where the subs were. And the WW2 decryption remained secret for decades. It's just the same now. IF(!) "they" have effectively working quantum computers t

Actually, convoys could be, and were, diverted based on Enigma intercepts. Patrol-plane pilots were told to patrol in a certain area on a certain day, and not to ask questions. The rule was not to do anything that couldn't have happened in the absence of Enigma. The German Navy did indeed suspect that the Allies were reading their communications, and went to an Enigma with an extra rotor. That caused some problems, but the Allies cracked it also.