On the same PC, barely. If you encrypt it on one PC, then archive it and encrypt it once more, it makes more sense.
–
Andrew Smith Aug 3 '12 at 11:41

@AndrewSmith I don't understand. What difference does it make whether I use one computer or two?
–
Surfer on the fall Aug 3 '12 at 12:45

For example, you encrypt the data on the PC once. You copy it to the archive and encrypt it again. The data on your PC will be gone in 3 years, the data in the archive in the next 20. Assuming your information travels over the internet and gets recorded, you should assume that at least one of the ciphers is decodable (the unknown factor). You hide the two private keys in two secure physical locations, so in case of a well-organized robbery the data stays secure, since it's less likely that multiple keys can be recovered; this way the information is even better protected.
–
Andrew Smith Aug 3 '12 at 13:13

4

There are a few scenarios where multiple encryption makes sense. For example, one layer for each hop of Tor.
–
CodesInChaos Aug 3 '12 at 13:40

3 Answers
3

Personally, I would avoid multiple encryption protocols most of the time. It adds significant implementation complexity without making your data any more secure in the real world, unless the encryption protocol you are using turns out to be broken or becomes computationally feasible to attack at some later date.

Granted, I will disagree with others who claim that doing so gives you a larger attack surface and increases your vulnerabilities. While the attack surface technically does increase (you can attack the Blowfish layer; you can attack the AES layer), since an attacker must successfully break both, your security has not decreased. This assumes your message is multiply encrypted in a nested fashion (anything else doesn't make sense) with independent keys/passphrases, as in multiply_encrypted_file = Blowfish(AES(file)). If an attacker gets hold of multiply_encrypted_file, it is in no way weaker than getting hold of encrypted_file = AES(file), though you should beware of exposing yourself to known-plaintext attacks, which could weaken security if you used the same key/passphrase at every level and the file has a guessable header/structure after the first level of decryption. Even if attackers find an exploitable flaw in the Blowfish layer, they can only reverse that layer, and then they are left facing an AES-encrypted file.

However, I do use multiple layers of encryption on an almost daily basis when there is a legitimate reason for it and it provides extra security. For example, I often need to connect to work computers from my home, but for security the work computers are on a private intranet, firewalled off from the outside world.

To connect, I first create a VPN tunnel over the public internet to a public-facing VPN server that verifies my identity and acts as a gateway to the intranet. All network traffic between my house and work is then encrypted with the IPsec protocol on its way to the VPN server, which decrypts it and forwards it to the local machine as if it originated on the local intranet. I may then want to connect to something at work using ssh or https. This provides a layer of encryption on the local intranet at work, so that my coworkers cannot, say, eavesdrop on my network connections. To someone at my ISP capturing packets, however, the data has been multiply encrypted: VPN_encryption(ssh_encryption(actual_data_to_be_transferred)). Again, I'm not using the ssh layer to make my data more secure against my ISP eavesdropping, but in no way does it make it easier for my ISP to eavesdrop.

EDIT:
Some keep arguing that the implementation would be much tougher than for standard single encryption, but not necessarily. To demonstrate, I implemented Blowfish/AES chaining in python using pycrypto:

I don't see how the multiply-encrypted implementation has more of an attack surface, or is in any way weaker, than a singly-encrypted one. To the outside world the implementation is still the same: enter a password to decrypt a stored file.

Thanks, but another doubt: you don't use ssh to make your data more secure, but technically it is more secure, since the IPsec and ssh keys are different. Is this wrong?
–
Surfer on the fall Aug 3 '12 at 16:07

Our concerns weren't with the attack surface of the algorithms, but rather the attack surface of the implementations.
–
Polynomial Aug 3 '12 at 17:17

@Polynomial - I have written a basic implementation/example for encrypting and decrypting Blowfish_CBC and AES_CBC based on a single passphrase, above. Again, I don't think multiple encryption really gives much practical gain, but I see no way that it makes you more vulnerable. Please explain the larger attack surface of the above implementation. (Assume an application asks for a remembered passphrase and encrypts/decrypts a stored file by calling these functions; that is exactly the same attack surface as if the app asked for one passphrase to decrypt AES.)
–
dr jimbob Aug 3 '12 at 21:08

One could argue that if there is no apparent benefit, why do it? The code might be simple to implement, but one must always account for human error. Introducing more code means the potential for more bugs, and if there is no benefit, don't do it.
–
Terry Chia Aug 4 '12 at 4:19

@TerryChia - There is an obvious benefit if you fear that some encryption algorithm in use now will break (e.g., in the 1970s DES was the predominant encryption standard approved by NIST; by the mid-1990s it was shown to be insecure). Even though the data I typically encrypt doesn't need this level of security, that doesn't mean others will not need it (say, information that must stay unbreakable for two hundred years). The chaining code is as simple to implement as key-strengthening a password hash, and I remain unconvinced that it introduces a larger attack surface.
–
dr jimbob Aug 6 '12 at 5:08

AES128-CBC is strong. If you're implementing it properly, and using strong random keys and unique random IVs, you're very safe. The US government (NSA) certifies AES for use in securing top-secret documents. I somewhat doubt that your security requirements are anywhere near theirs, so you should consider AES more than strong enough alone. If you're really paranoid, move up to 256-bit AES.

Chaining algorithms only provides more provable security if you use independent keys for each. Using the same key for all ciphers means an attacker still only has to bruteforce one key, though at 128 bits it has been predicted that we may never possess the equipment to do so.
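The same-key point can be seen with a deliberately weak toy cipher (a standard-library-only sketch of my own, not a real construction): double-encrypting with one key leaves the brute-force search space at 2^8 guesses, exactly as for a single layer, rather than 2^16.

```python
import hashlib

def toy_encrypt(key: int, data: bytes) -> bytes:
    # Toy "cipher" with a 1-byte key: add a keyed byte stream mod 256.
    # Deliberately breakable; only for illustrating key-space size.
    stream = hashlib.sha256(bytes([key])).digest()
    return bytes((b + stream[i % 32]) % 256 for i, b in enumerate(data))

def toy_decrypt(key: int, data: bytes) -> bytes:
    stream = hashlib.sha256(bytes([key])).digest()
    return bytes((b - stream[i % 32]) % 256 for i, b in enumerate(data))

plaintext = b"attack at dawn"
k = 0x5A
# "Double encryption" with one and the same key:
ciphertext = toy_encrypt(k, toy_encrypt(k, plaintext))

# Known-plaintext brute force: at most 256 guesses, not 256 * 256,
# because one guess unlocks both layers at once.
candidates = [g for g in range(256)
              if toy_decrypt(g, toy_decrypt(g, ciphertext)) == plaintext]
```

With independent keys an attacker would have to search both key spaces (or mount a meet-in-the-middle attack), which is what makes the independence requirement matter.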

Chaining multiple algorithms makes some sense in ridiculously high-security high-paranoia long term storage situations like TrueCrypt volumes, where the deployment environment is completely unknown. However, unless you're likely to be storing military secrets that'll be shipped into a hostile country, and happen to be a professional cryptographer, I'd stick with just using AES.

Yes, the attack surface is still larger with independent keys: essentially, the more pieces you add to a scheme, the larger the attack surface. Regarding AES, Bruce Schneier noted that the security margin of AES-128 would be higher than that of AES-256 against that particular attack, which is not effective against the full-round version of AES. The best current key-recovery attack on AES still takes 2^126.1 operations for 128-bit and 2^254.4 operations for 256-bit keys, which is completely infeasible by any foreseeable standard.
–
Polynomial Aug 3 '12 at 9:08

1

@Polynomial, what about a situation where you need to encrypt data that will be stored in a way that is accessible to everyone (the ciphertext, that is)? How can I ensure confidentiality for at least 20 years?
–
Matrix Aug 3 '12 at 9:10

1

@Matrix Even with ridiculously optimistic estimates of the increase in computing power that might appear over the next 100 years, it would still take several orders of magnitude longer to crack a 256-bit key than the amount of time the entire universe has existed. You cannot guarantee that AES will remain unbroken for that long, but are you really in a situation where you couldn't just re-encrypt it with whatever new standards are available in 10 years' time?
–
Polynomial Aug 3 '12 at 9:12

1

@Polynomial, absolutely, the data could be re-encrypted with newer algorithms over time, but nothing prevents a possible attacker from backing up the ciphertext as it is now and trying to break it 20 years from now.
–
Matrix Aug 3 '12 at 9:38

1

@AndrewSmith Which is why, if it is to be done, it should be implemented by a qualified cryptographer.
–
Polynomial Aug 3 '12 at 11:40

It is extremely rare that the algorithm itself is a vulnerability. Even weak algorithms such as RC4 are often the strongest part of the system in which they are used. Vulnerabilities usually hide in how you apply encryption (padding, IV randomness, integrity checks...) and in key management. These seemingly peripheral activities are quite hard to do right, and have repeatedly proven to be the number one source of weakness in any encryption system. By encrypting three times, you have tripled the opportunities for such nasty weaknesses.
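To illustrate how one of these seemingly peripheral pieces can sink a system, here is a stand-alone sketch of IV/keystream reuse (a toy stream construction of my own, purely for demonstration): with a static IV, identical messages become visible as identical ciphertexts, and XORing two ciphertexts reveals the XOR of the two plaintexts, all without touching the key.

```python
import hashlib

def bad_encrypt(key: bytes, iv: bytes, msg: bytes) -> bytes:
    # Toy deterministic stream "cipher": keystream = SHA-256(key || iv),
    # repeated as needed. Illustrative only; the flaw shown is real.
    stream = hashlib.sha256(key + iv).digest() * (len(msg) // 32 + 1)
    return bytes(m ^ s for m, s in zip(msg, stream))

key = b"0123456789abcdef"
static_iv = b"\x00" * 16  # the bug: the IV never changes

c1 = bad_encrypt(key, static_iv, b"transfer $100")
c2 = bad_encrypt(key, static_iv, b"transfer $100")
c3 = bad_encrypt(key, static_iv, b"transfer $999")

# An eavesdropper who never learns the key still learns:
repeated = (c1 == c2)                          # the same message was sent twice
leak = bytes(a ^ b for a, b in zip(c1, c3))    # equals p1 XOR p3: plaintext leaks
```

No amount of extra encryption layers fixes a mistake like this if the same mistake is made at each layer.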

Very good answer! I have a doubt. Let's say I get a first ciphertext by encrypting some text with AES, then I encrypt that first ciphertext with Blowfish and get a second ciphertext. If I had implementation failures, an attacker could decrypt, exactly as with AES-only encryption. Why do you say I have doubled the opportunities for failure? Also, implementation flaws can be exploited only if the attacker has the direct ciphertext.
–
Surfer on the fall Aug 3 '12 at 12:38

1

Why would double encryption be more breakable than single encryption? I'm sure you're right, I'm a newbie and you're cryptographers, but I wish you would explain this to me a little. Thank you again for your free help.
–
Surfer on the fall Aug 3 '12 at 12:41

1

It's not about the crypto itself. It's about the implementation of the crypto. Layman's explanation: if you write 100 lines of code to implement one algorithm, you've got 100 potential bugs. If you write 200 lines of code for 2 algorithms, and another 25 to chain them together, you've got 225 potential bugs. The potential for the crypto to fail (i.e. be discovered broken) is minuscule in comparison to the potential for your code to be buggy.
–
Polynomial Aug 3 '12 at 14:27

1

I still don't understand this reasoning. You're not going to implement the algorithms yourself in any of these options; you would use a crypto library. So the options are: 1) Ciphertext = Enc_Alg1(Key, Plaintext), 2) Ciphertext = Enc_Alg2(Key, Plaintext), or 3) Ciphertext = Enc_Alg2(Key2, Enc_Alg1(Key1, Plaintext)). If 3) is in any way less secure than the others, there must be a failure either in the implementation of Alg1 or in Alg1 itself, because in that case the attacker could simply take your ciphertext, encrypt it with Alg2, and start attacking from there.
–
resistor Aug 6 '12 at 11:39

1

Also, you say it is about the implementation and not the algorithm itself, but wouldn't chaining the algorithms also help against implementation errors in either of them? In the scenario above, if there is an implementation error in either Alg1 or Alg2 and you happened to choose that one, your encryption fails. But if you chain them, there would have to be an implementation error in both of them, or else you would be fine. Again, if the chaining itself causes a problem, I think that means the first algorithm or implementation has already failed. What am I missing?
–
resistor Aug 6 '12 at 11:40