+1. Big key lengths are often used for "offline" asymmetric crypto (like PGP), but for "online" key exchanges a 2048-bit key offering 30-year security is sufficient for most applications, and doesn't annoy the user with a 2-minute wait during the SSL handshake.
– SecurityMatt, Dec 13 '12 at 13:55


Keep in mind that asymmetric cyphers are usually used only to protect symmetric session keys, so this increase in asymmetric cypher decryption time is not that dramatic in practice.
– Alexander Shcheblikin, Dec 13 '12 at 14:28


But when you bear in mind that browsers are already resorting to dirty tricks like DNS prefetching and just-in-time compiling JavaScript, you'll realize that the cost of going from 4096-bit keys to 65536-bit keys matters. Also, since the best known attacks on 2048-bit RSA are more work than brute-forcing an AES-256 key, there's no cryptographic benefit to doing it either.
– SecurityMatt, Dec 13 '12 at 18:23

@SecurityMatt Any source for that claim? The claim I've heard is that breaking 2048-bit RSA is about as hard as breaking a 112-bit symmetric algorithm, not harder than breaking 256-bit encryption.
– CodesInChaos, Jan 26 '14 at 14:03

I dug out my copy of Applied Cryptography to answer this. Concerning symmetric crypto, 256 bits is plenty and probably will be for a long, long time. Schneier explains:

Longer key lengths are better, but only up to a point. AES will have 128-bit, 192-bit, and 256-bit key lengths. This is far longer than needed for the foreseeable future. In fact, we cannot even imagine a world where 256-bit brute force searches are possible. It requires some fundamental breakthroughs in physics and our understanding of the universe.

One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)

Given that k = 1.38 × 10^−16 erg/K, and that the ambient temperature of the universe is 3.2 Kelvin, an ideal computer running at 3.2 K would consume 4.4 × 10^−16 ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.

Now, the annual energy output of our sun is about 1.21 × 10^41 ergs. This is enough to power about 2.7 × 10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2^192. Of course, it wouldn't have the energy left over to perform any useful calculations with this counter.

But that's just one star, and a measly one at that. A typical supernova releases something like 10^51 ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.

These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.

The boldness is my own addition.

Remark: Note that this example assumes a 'perfect' encryption algorithm. If you can exploit weaknesses in the algorithm, the key space shrinks and you effectively end up with fewer bits of key.

It also assumes that the key generation is perfect - yielding 1 bit of entropy per bit of key. This is often difficult to achieve in a computational setting. An imperfect generation mechanism might yield 170 bits of entropy for a 256 bit key. In this case, if the key generation mechanism is known, the size of the brute-force space is reduced to 170 bits.
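
Schneier's arithmetic above is easy to check; a quick sketch, using the constants exactly as quoted:

```python
import math

k = 1.38e-16      # Boltzmann constant, erg/K (as quoted)
T = 3.2           # cosmic background temperature, K
per_bit = k * T   # minimum energy per bit change: ~4.4e-16 erg

sun_year = 1.21e41                               # annual solar output, ergs
bits = math.log2(sun_year / per_bit)             # ~187.5 -> a 187-bit counter
bits_dyson = math.log2(32 * sun_year / per_bit)  # ~192.5 -> count up to 2^192
print(round(bits), round(bits_dyson))
```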

Isn't that just the energy limit for irreversible computation? If you compute with reversible gates, you can theoretically use much less energy.
– tzs, Dec 17 '12 at 21:49


@tzs the physics involved represents the bare minimum energy needed just to represent the information in the most efficient sense that quantum mechanics allows, well beyond even theoretical technology, and that's before counting the energy for computing, gates, and all that.
– tylerl, Dec 18 '12 at 2:58


@opensourcechris correct, on average you would have to cycle through half of them, so 2^255 keys need to be checked, which is still around 10^12 times more than the energy released in a supernova.
– lynks, Dec 18 '12 at 22:31


@tylerl That minimum energy requirement is per irreversible operation. Since quantum computers do more with a single operation, you need to double the number of bits. Then there is the issue of reversible computing. If you build a computer where each operation is reversible, you can go beyond those limits. But it's completely unclear if reversible computers can be built and if they can be applied to cryptography problems like this.
– CodesInChaos, Sep 9 '13 at 9:06

Currently, brute-forcing 128 bits is nowhere near feasible. Hypothetically, a 129-bit AES key would take twice as long to brute-force as a 128-bit key. This means larger keys of 192 bits and 256 bits would take much, much longer to attack. It would take so incredibly long to brute-force one of these keys that the sun would stop burning before the key was found.

There are 2^256 (about 1.2 × 10^77) possible keys. That's a big freaking number. Assuming the key is random, divide that by 2 and you have how many keys it will take on average to brute-force AES-256.

In a sense we do have the really big cipher keys you are talking about. The whole point of a symmetric key is to make brute force infeasible. In the future, if attacking a 256-bit key becomes possible, then key sizes will surely increase, but that is quite a ways down the road.
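
To put rough numbers on "the sun would stop burning": assume a (hypothetical, generous) attacker testing one trillion keys per second against just AES-128:

```python
rate = 10**12                 # keys per second (hypothetical attacker)
avg_tries = 2**127            # half the AES-128 keyspace, the average case
seconds_per_year = 31_557_600
years = avg_tries / rate / seconds_per_year
print(f"{years:.1e} years")   # on the order of 10^18 years; the sun has ~5e9 left
```

And that is only AES-128; AES-256 multiplies the figure by another factor of 2^128.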

The reason RSA keys are much larger than AES keys is that they belong to two completely different types of encryption. This means a person would not attack an RSA key the same way they would attack an AES key.

Attacking symmetric keys is conceptually easy:

1. Start with the bitstring 000...
2. Decrypt the ciphertext with that bitstring.
3. If you can read the result, you succeeded.
4. If you cannot read it, increment the bitstring and repeat.
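
As a toy illustration of that loop, here is a hypothetical 24-bit XOR "cipher" with a known-plaintext check; real ciphers and 128-bit keys make the very same loop astronomically long:

```python
import hashlib

def toy_cipher(key: int, data: bytes) -> bytes:
    # XOR with a key-derived stream; encryption and decryption are identical.
    stream = hashlib.sha256(key.to_bytes(3, "big")).digest()
    return bytes(d ^ s for d, s in zip(data, stream))

plaintext = b"ATTACK AT DAWN"
ciphertext = toy_cipher(0x00ABCD, plaintext)   # secret key: only 24 bits

# Brute force: start at 000...0 and increment until the output "reads".
for candidate in range(2**24):
    if toy_cipher(candidate, ciphertext) == plaintext:
        print(f"key found: {candidate:#08x}")  # 0x00abcd
        break
```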

Attacking an RSA key is different, because RSA encryption/decryption works with big semi-prime numbers; the process is mathy. With RSA, you don't have to try every possible bitstring. You try far fewer than 2^1024 or 2^2048 bitstrings, but it's still not feasible to brute-force. This is why RSA and AES keys differ in size.[1]
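
For contrast, a textbook-sized sketch of that "mathy" attack: factor the modulus and rebuild the private key (the numbers here are far too small for real use):

```python
def factor(n: int):
    # Trial division: exploiting structure, which brute force doesn't have.
    f = 3
    while n % f:
        f += 2
    return f, n // f

n, e = 3233, 17                 # textbook toy public key (n = 61 * 53)
p, q = factor(n)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)             # recovered private exponent (Python 3.8+)
message = 42
assert pow(pow(message, e, n), d, n) == message  # decrypts correctly
```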

To sum up everything and answer your question in one sentence: we don't need ridiculously big symmetric keys because we already have ridiculously big symmetric keys. 256-bit encryption sounds wimpy compared to something like a 2048-bit RSA key, but the algorithms are different and can't really be compared 'bit to bit' like that. In the future, if there is a need for longer keys, new algorithms will be developed to handle them. And if we ever wanted to go bigger on current hardware, it's simply a time trade-off: a bigger key means longer decryption time, which means slower communication. This is especially important for a symmetric cipher, since your internet browser will establish and then use a symmetric key to send information.

Processing time, pure and simple. Everything in security is a balancing act between the need for security (keeping the bad people out) and usability (letting the good people in). Encryption is a computationally expensive operation, even with dedicated hardware for doing the calculations.

It simply isn't worth going beyond a certain level of security for most purposes, because the trade-offs make the system exponentially harder to use while offering almost no tangible benefit (the difference between a billion years and a hundred billion years isn't that significant in practical terms).

Also, as for RSA vs AES, that's the nature of symmetric versus asymmetric cryptography. Put simply, with symmetric cryptography (where there is one shared key), there is nothing to start guessing from, so attacking it is very hard. With asymmetric cryptography such as RSA, you are disclosing a piece of information (the public key) that is related to the decryption key (the private key). While the relationship is "very hard"™ to exploit, it is far weaker than having no information to work from. Because of this, larger key sizes are necessary to make recovering the private key from the public key harder, while trying to limit how much harder the math problems of encryption and decryption become.

In a way, algorithms using such "insanely large" keys already exist: they're called one-time pads. Nobody really uses them in practice, though, since they require a key the length of the message you wish to encrypt, and the key material can never be reused (unless you want the ciphertext to become trivially breakable). Given that the purpose of encryption is to convert a large secret (the plaintext) into a small secret (the key), OTPs are only useful in very specific and highly specialized scenarios. You might as well transmit the plaintext securely, because you will need just as secure a transmission channel for the key material.

In all fairness, OTPs do have one specific use case. That is, when you need provable security in an environment where you have access to a secure communications channel at one point but need to transmit a message securely at some later time.

The OTP is provably secure because, properly used and with properly generated key material, any decrypted plaintext is equally likely, and no part of the key material gives (or is supposed to give) any insight into other parts of the key material or how to decrypt other parts of the message. That's easy in theory, but awfully difficult to pull off in practice. You are looking at short, high-grade military or possibly diplomatic secrets, at most.
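
The whole scheme is just XOR with a truly random, message-length, never-reused key; a minimal sketch:

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse: the same function encrypts and decrypts.
    if len(key) != len(data):
        raise ValueError("key must be exactly as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"MEET AT THE SAFE HOUSE"
pad = secrets.token_bytes(len(message))  # use once, then destroy
ciphertext = otp(message, pad)
assert otp(ciphertext, pad) == message
```

Reusing `pad` for a second message lets an eavesdropper XOR the two ciphertexts together and cancel the key out entirely, which is why the key material can never be reused.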

For most people, 128- to 256-bit symmetric or 2048- to 4096-bit (assuming something like RSA) asymmetric keys are plenty, for reasons already described by Rell3oT, Alexander Shcheblikin, lynks, and others. Anyone wanting to attack a 256-bit-equivalent key is going to attack the cryptosystem, not the cryptography, anyway. (Obligatory XKCD link.) PRNG attacks have broken otherwise properly implemented and theoretically secure cryptosystems before, and one would be a fool to think that has happened for the last time.

Adding more evidence to the "because it slows things down unnecessarily" answers, it seems like AES execution time doesn't grow as fast as RSA when key length goes up (and RC6 grows even more slowly), but it's still a 16% execution time increase to double key length, according to http://www.ibimapublishing.com/journals/CIBIMA/volume8/v8n8.html .

Processing time was already mentioned. Even in that respect, the time required to generate an RSA key deserves separate mention, since it is MUCH more costly for longer keys: you need to find prime numbers of roughly half the size of the desired RSA key.
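
A minimal sketch of why keygen gets expensive with size (a pure-Python Miller-Rabin test; real libraries are far more optimized, but the trend is the same):

```python
import random
import time

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    # Miller-Rabin probabilistic primality test.
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13, 17, 19):
        if n % small == 0:
            return n == small
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        x = pow(random.randrange(2, n - 1), d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits: int) -> int:
    # Primes near 2^bits thin out as ~1/(bits * ln 2), so more candidates
    # must be tested, and each modular exponentiation is costlier too.
    while True:
        candidate = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(candidate):
            return candidate

for bits in (256, 512, 1024):  # a 2048-bit RSA key needs two 1024-bit primes
    start = time.perf_counter()
    random_prime(bits)
    print(bits, f"{time.perf_counter() - start:.2f}s")
```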

Another topic is space, i.e., the amount of data generated. Asymmetric and symmetric ciphers operate on blocks. Even a signature over a single byte needs as many bits as the RSA key has, i.e., 256 bytes for a 2048-bit key. The same applies to the block sizes of the symmetric algorithms, which also grow with the key length. While this may not seem like an argument at first sight, severely restricted devices such as smart cards (still the most secure container for the private key) are supposed to handle this, and there are many applications which need to store a lot of signatures, certificates, cryptograms, etc. This is actually one of the reasons for elliptic curve cryptography, since it crams more security into fewer bits.

@MichaelKjörling: I actually can't see the benefit of a block-size argument if the whole algorithm family is changed. Within one algorithm family, larger key sizes mean larger block sizes, though it is not a strict increase either. The original question addressed key sizes and did not suggest changing algorithms too, so what is your point?
– guidot, Aug 12 '14 at 10:25

OK, well, even within the exact same algorithm (and among the examples already cited), we've still got AES, which can use several different key sizes but always the same block size. Or Blowfish, which has a variable key size and a fixed block size.
– Michael Kjörling, Aug 12 '14 at 11:11

The OP asked: "So in other words what is the consequences of choosing a cipher key that is too large...?" A 256-bit key is plenty strong, as the comments here show; however, a very secure key (which is a good thing) will simply cause malicious persons to look for a weakness elsewhere in the system.