Multiple encryption with several different block ciphers or stream ciphers has been discussed in the literature before, along with the benefits and risks of such efforts. A symmetric cipher's design may rest on hidden assumptions or contain weaknesses, or it may even have been deliberately sabotaged by its design team; on the other hand, implementing multiple symmetric cipher algorithms gives the code more ways to go wrong and introduces potential security problems in the implementation itself.

Focusing on multiple symmetric ciphers strikes me as a misplaced concern, since the hardness assumption for symmetric ciphers seems to rest largely on the difficulty of searching the keyspace. While it's possible that a break exists in one modern symmetric cipher that doesn't apply to another, that is starting to seem less likely.

Public key cryptography, however, relies on many different hardness assumptions, so it seems strange to me that there hasn't been an effort to come up with protocols for multiple encryption using algorithms with different hardness assumptions. The RSA problem is broken by factoring the modulus, elliptic curve cryptography is broken by solving discrete log, lattice-based cryptography is broken by solving the shortest vector problem, and so on. And it seems all hidden subgroup problems over finite Abelian groups are broken on quantum computers (discrete log and factoring), and some groups that are 'close' to Abelian are also broken on quantum computers (factoring in quaternion groups, for example).

If we were to build a protocol or implementation that relied on multiple cryptographic primitives or algorithms for key exchange and signing, which would be appropriate for such a scheme? Obviously some algorithms are more compute-intensive than others, and some users would prefer performance over security, so we might want to rank them by performance and by a guess at their hardness.

I have in mind creating certificates that are grounded in multiple hardness assumptions, so if one link in the chain is broken, the whole scheme doesn't fall apart.

I would think the best option would be large elliptic curves that take too many qubits to crack, more than the expected state of the art within the certificate TTL? 1279 bits, for example, should require something like 7000+ physical qubits.
– Richie Frame, Jun 14 '14 at 0:46

Yes, but elliptic curve cryptography is broken by breaking discrete log, and breaking the hidden subgroup problem for finite Abelian groups breaks both discrete log and factoring. The threats aren't just quantum computers, but advances in any field of mathematics that break individual hardness assumptions. Hence, a protocol that uses multiple cryptographic algorithms with independent hardness assumptions. Large ECC would be a good candidate for one algorithm whose hardness assumption is discrete log, but we need more in case someone breaks it.
– dezakin, Jun 14 '14 at 1:56


Combining public key encryption schemes is easy: just do independent key exchanges and then hash the shared secrets together to produce a combined shared secret. Encrypt the actual message with symmetric crypto as usual.
– CodesInChaos♦, Jun 17 '14 at 10:56
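As a minimal sketch of what that combination could look like, assuming the two shared secrets have already been negotiated by independent key exchanges (say, ECDH and a second, unrelated mechanism); the function name and the use of plain SHA-256 as the combiner are illustrative choices on my part, not part of the comment:

```python
import hashlib

def combine_shared_secrets(secret_a: bytes, secret_b: bytes,
                           transcript: bytes = b"") -> bytes:
    """Hash two independently negotiated secrets into one symmetric key.

    The result stays secret as long as at least one of the two key
    exchanges remains unbroken; mixing in the public transcript
    (public keys, ciphertexts) binds the key to this session.
    """
    h = hashlib.sha256()
    h.update(secret_a)
    h.update(secret_b)
    h.update(transcript)
    return h.digest()

# Placeholder secrets; in practice these would come from the two
# independent key-exchange protocols.
key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32, b"session transcript")
```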

2 Answers

For the signatures, hash-based signatures provide a nice solution to your problem: you can use so-called hash combiners to instantiate the signature scheme. These are functions that construct a hash function from two or more hash functions and preserve certain properties as long as at least one of the underlying hash functions has that property. For example, the concatenation combiner
$C_{H_1,H_2}^{\|}(M) = (H_1(M)\|H_2(M))$

preserves collision resistance and second preimage resistance.
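As an illustration, here is a minimal sketch of the concatenation combiner in Python; SWIFFT isn't available in the standard library, so SHA-256 and SHA3-256 stand in purely to show the construction:

```python
import hashlib

def concat_combiner(message: bytes) -> bytes:
    """C_{H1,H2}(M) = H1(M) || H2(M).

    Finding a collision for the combined function requires a
    simultaneous collision in both H1 and H2, so collision resistance
    survives as long as either underlying hash remains collision resistant.
    """
    h1 = hashlib.sha256(message).digest()
    h2 = hashlib.sha3_256(message).digest()
    return h1 + h2

digest = concat_combiner(b"message to be signed")  # 64-byte combined digest
```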

If you use hash functions based on different design principles, e.g. SWIFFT as one with security guarantees stemming from the hardness of lattice problems and a classically constructed one like SHA-3, you can achieve good redundancy.

The big advantage is that you do not have to implement two different signature schemes but only one: you would have to implement two different hash functions for message compression anyway, but with multiple signature schemes you would additionally have to implement two "fixed-length" signature schemes.

I haven't heard of much work on this, but really, choose any two schemes based on assumptions that are independent of each other. Two candidates come to mind: ECC and lattice-based methods. These are based on assumptions that have nothing known in common (dlog vs. LWE).

Additionally, one of the reasons many people are interested in lattice-based methods is that there is currently no known quantum speedup for breaking the underlying problems. Therefore, even if someone gave you a quantum computer, you could not break these methods with current knowledge.
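To make the combination concrete, here is a hedged sketch of a hybrid signature at the protocol level: sign the same message under both schemes and accept only if both signatures verify. The ecc_* and lattice_* callables are hypothetical placeholders for whichever concrete schemes are chosen; they are not real library APIs.

```python
from typing import Callable, Tuple

def hybrid_sign(message: bytes,
                ecc_sign: Callable[[bytes], bytes],
                lattice_sign: Callable[[bytes], bytes]) -> Tuple[bytes, bytes]:
    """Sign the same message under both schemes and ship both signatures."""
    return ecc_sign(message), lattice_sign(message)

def hybrid_verify(message: bytes,
                  signatures: Tuple[bytes, bytes],
                  ecc_verify: Callable[[bytes, bytes], bool],
                  lattice_verify: Callable[[bytes, bytes], bool]) -> bool:
    """Accept only if *both* component signatures verify, so a forgery
    requires breaking both underlying hardness assumptions."""
    ecc_sig, lattice_sig = signatures
    return ecc_verify(message, ecc_sig) and lattice_verify(message, lattice_sig)
```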

I wouldn't recommend lattice-based methods for long-term security at this point in time; they haven't been studied in as much detail as other schemes. It wouldn't be shocking if someone discovered a new attack 10 years from now that breaks them significantly faster than we currently know how to.
– D.W., Jun 18 '14 at 0:54


I guess it depends on who you ask. Progress on the underlying assumptions has been practically nil since Ajtai's paper. Additionally, lattices have been investigated a lot by mathematicians because of their use in number theory. However, the OP was asking for good combinations, and they are pretty much the only ones independent of dlog and of a possible quantum computer.
– Edvard Fagerholm, Jun 18 '14 at 8:01

@RickyDemer: Thanks, I don't remember having seen this one before. It seems like lattices are getting all the attention.
– Edvard Fagerholm, Jun 19 '14 at 10:23

I don't see how using lattice methods would hurt in the context I'm asking about. They're mature enough that they aren't obviously reducible to another hardness assumption that is used in public key cryptography. So far I can see: Okamoto–Uchiyama, factorization in Z; ECC, discrete log; NTRU, shortest vector problem; McEliece, coding theory; Hidden Field Equations (speculative); Supersingular Isogeny Key Exchange. Also, variations on the above algorithms that work in different algebraic structures (quaternions, Weyl algebras, etc.). Trying to see what else can be added.
– dezakin, Jun 19 '14 at 21:06