Cryptographic primitives usually assert some security level given as number of operations to mount an attack. Hash functions, for example, give different security levels for collision attacks, preimage attacks and second preimage attacks. From these, "safe" key sizes are derived for different primitives.

There are many different recommendations for safe key sizes and many different means of estimating future capabilities in performing computation. For example, www.keylength.com collects many of these recommendations in one place.

What I'm looking for, however, is the number of simple operations that can obviously be seen as out of reach for all of humanity for the foreseeable future - or rather, the lowest such value that is still believable.

It is very obvious that 2^256 simple operations is something that will never be reached. It is also very obvious that 2^64 simple operations can be reached as it already has been. Many of the recommendations seem to calculate 2^128 as a number that would be safe for 30 years or more. So the value I am looking for is likely between 2^128 and 2^256. I am guessing 2^160 or 2^192 might be safely out of reach.

But I want concrete arguments that can be easily reasoned about. I'd love to see arguments that are based on simple laws of physics or relations to concrete constants about the universe. For example, Landauer's principle could be used.

Note: the actual simple operations used are not relevant here - they might be operations on a quantum computer, or hash invocations, or whatever.

Of course this depends on how long a single "simple operation" takes, and how many of them you can do in parallel. An application of a SHA-256 hash takes more than a simple addition (one that fits in your register size).
–
Paŭlo EbermannAug 11 '11 at 18:25

Well, obviously. But I'm looking for a count of operations that will not ever be reached, regardless of how simple it is. So counting values in a loop is good enough, or something even simpler. For example, in Landauer's principle, the unit of calculation is the change of a single bit.
–
NakedibleAug 11 '11 at 19:05

2 Answers
2

As a starting point, we will consider that each elementary operation implies a minimal expense of energy; Landauer's principle sets that limit at 0.0178 eV, which is 2.85*10^-21 J. On the other hand, the total mass of the Solar system, if converted in its entirety to energy, would yield about 1.8*10^47 J (actually that's what you would get from the mass of the Sun, according to this page, but the Sun takes the lion's share of the total mass of the Solar system). This implies a hard limit of about 6.32*10^67 elementary computations, which is about 2^225.2. (I think this computation was already presented by Schneier in "Applied Cryptography".)
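For what it's worth, the arithmetic above can be rechecked in a few lines of Python; the constants are textbook values (the Landauer limit is kT*ln(2) at roughly room temperature, and the Sun's mass-energy comes from E = mc^2), not figures specific to this answer:

```python
import math

# Landauer limit: minimum energy to erase one bit, E = k*T*ln(2)
k = 1.380649e-23        # Boltzmann constant, J/K
T = 300                 # approximately room temperature, K
e_per_op = k * T * math.log(2)   # ~2.87e-21 J per irreversible bit operation

# Mass-energy of the Sun (which dominates the Solar system's mass)
m_sun = 2.0e30          # kg
c = 3.0e8               # m/s
e_total = m_sun * c**2  # ~1.8e47 J

ops = e_total / e_per_op
print(f"{ops:.2e} operations, about 2^{math.log2(ops):.1f}")
```

With these rounded constants the result lands right around 2^225, matching the figure above.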

Of course this is a quite extreme scenario and, in particular, we have no idea about how we could convert mass to energy -- nuclear fission and fusion convert only a tiny proportion of the available mass into energy.

Let's look at a more mundane perspective. It seems fair to assume that, with existing technology, each elementary operation must somehow imply the switching of at least one logic gate. The switching energy of a single CMOS gate is about C*V^2, where C is the gate load capacitance and V is the voltage at which the gate operates. As of 2011, a very high-end gate will be able to run with a voltage of 0.5 V and a load capacitance of a few femtofarads ("femto" meaning "10^-15"). This leads to a minimal energy consumption per operation of no less than, say, 10^-15 J. The current total world energy consumption is around 500 EJ (5*10^20 J) per year (or so says this article). Assuming that the total energy production of the Earth is diverted to a single computation for ten years, we get a limit of 5*10^36, which is close to 2^122.
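As a sanity check on those figures (a sketch using exactly the round numbers from the paragraph above):

```python
import math

e_per_op = 1e-15       # J: rough CMOS switching energy per gate, ~C*V^2
yearly_energy = 5e20   # J: ~500 EJ of world energy production per year
years = 10             # one decade-long computation

ops = yearly_energy * years / e_per_op
print(f"about 2^{math.log2(ops):.1f} operations")   # ~2^122
```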

Then you have to take into account technological advances. Given the current trend on ecological concerns and peak oil, the total energy production should not increase much in the years to come (say no more than a factor of 2 until year 2040 -- already an ecologist's nightmare). On the other hand, there is technological progress in the design of integrated circuits. Moore's law states that you can fit twice as many transistors on a given chip surface every two years. A very optimistic view is that this doubling of the number of transistors can be done at constant energy consumption, which would translate to halving the energy cost of an elementary operation every two years. This would lead to a grand total of 2^138 in year 2040 -- and this is for a single ten-year-long computation which mobilizes all the resources of the entire planet.
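The 2040 extrapolation follows the same pattern; the two factors below (energy production doubling once by 2040, energy per operation halving every two years) are the assumptions stated in the paragraph above, so the result is only as good as those guesses:

```python
import math

base_ops = 5e36             # the ten-year computation at 2011 efficiency (~2^122)
energy_growth = 2           # total energy production doubles by 2040
years_ahead = 2040 - 2011   # 29 years of efficiency gains
efficiency_gain = 2 ** (years_ahead / 2)   # energy per op halves every 2 years

ops_2040 = base_ops * energy_growth * efficiency_gain
print(f"about 2^{math.log2(ops_2040):.1f} operations")
```

The script lands in the 2^137-2^138 range, consistent with the rough 2^138 quoted above.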

So the usual wisdom of "128 bits are more than enough for the next few decades" is not off (it all depends on what you would consider to be "safely" out of reach, but my own paranoia level is quite serene with 128 bits "only").

A note on quantum computers: a QC can do quite a lot in a single "operation". The usual presentation is that the QC performs "several computations simultaneously, which we filter out at the end". This assertion is wrong in many particulars, but it still contains a bit of truth: a QC should be able to attack n-bit symmetric cryptography (e.g. symmetric encryption with an n-bit key) in 2^(n/2) elementary quantum operations. Hence the classic trick: to account for quantum computers (if they ever exist), double the key length. Hence AES with a 256-bit key, SHA-512... (the 256-bit key of AES was not designed to protect against hypothetical quantum computers, but that's how 256-bit keys get justified nowadays).
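A quick way to see the "double the key length" rule (this is just the Grover square-root speedup restated, not anything specific to AES):

```python
# Grover's algorithm searches 2^n candidates in roughly 2^(n/2) quantum
# operations, so an n-bit key offers only about n/2 bits of quantum security.
def quantum_security_bits(key_bits: int) -> float:
    return key_bits / 2

print(quantum_security_bits(128))  # 64.0  -- why 128-bit keys look thin here
print(quantum_security_bits(256))  # 128.0 -- doubling restores the margin
```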

I really only wanted to say, wow. This is a beautiful answer, well done Thomas. :-)
–
ChristofferAug 11 '11 at 19:15

Great answer! Just the kind of reasoning I was looking for. However, I'm not looking for "safely" out of reach, I'm looking for "not achievable by humanity by any means". Assuming no faster than light travel, limiting resources to our solar system seems reasonable - however, I think a figure far less than 2^225 could be reasoned that would still be reasonably out of reach of all humanity even if the fate of the entire species would depend on it.
–
NakedibleAug 11 '11 at 19:26

Mightily impressive answer, wish I could upvote more than once. But even assuming hypercomputing is out of reach, don't forget reversible computing--if you can perform 2^31 of your operations reversibly, you might be able to reach that 2^256 operations on a classical computer, using a black hole for negentropy and throwing in the whole solar system piece by piece.
–
user502Aug 12 '11 at 15:43

I (personally) don't think reversible computing will ever appear, so just like FTL, I don't consider it.
–
NakedibleAug 14 '11 at 15:38

1

That interpretation sure exists, and is even widespread; some people want bigger keys because they assume that key bits are somehow consumed gradually by attacks. This assumption, however, is not supported by facts. In reality, breaks tend to be all-or-nothing. In some specific cases, bigger keys can be weaker (for instance, AES-256 turned out to be substantially weaker than AES-128 against the rather esoteric, and fortunately not applicable in practice, related-key attacks).
–
Thomas PorninFeb 3 at 12:08

Note: the actual simple operations used are not relevant here - they might be operations on a quantum computer, or hash invocations, or whatever.

Well, a quantum computer is the reason no one can tell you the "amount of simple operations that can be obviously seen as out of reach for all humanity for the foreseeable future". By definition a quantum computer performs the opposite of "actual simple operations"; it allows one to bypass some large portion of "simple operation" space through quantum sleight-of-hand. Once the computer that bypasses portions of that simple operation space exists, then your question about "how big does the space need to be" becomes unpredictably irrelevant.

That's the theory, anyway. We haven't reached the level of future where quantum computers can do what we think they should be able to do. Although I'm comfortable saying that such a quantum computer does and does not exist in a box somewhere.

Quantum algorithms may reduce the amount of simple operations required to achieve a certain goal - such as cracking a cryptographic primitive - but I am not interested in that. A quantum operation is still an operation and quantum computers are still just computers.
–
NakedibleAug 11 '11 at 19:03

1

I very nearly upvoted this for the final sentence alone.
–
Michael KjörlingMay 2 '13 at 15:03