Séminaires 2010-2011

16 juin 2011 - Peter Schwabe (National Taiwan University) - How to use the negation map in the Pollard rho method. Abstract: Pollard's rho method is the best known algorithm for solving the discrete logarithm problem (DLP) in generic prime-order groups. The algorithm requires a pseudo-random iteration function. Most implementations sacrifice some randomness for efficiency and use so-called additive walks. For elliptic-curve groups, another textbook optimization is to exploit efficiently computable endomorphisms, most importantly negation. This optimization gives a theoretical factor-sqrt(2) speedup, but when combined with additive walks it leads to the problem of so-called fruitless cycles. In July 2009, Bos, Kaihara, Kleinjung, Lenstra, and Montgomery announced breaking a DLP on a 112-bit elliptic curve; this is still the largest elliptic-curve DLP for which a solution has been publicly announced. Their computation did not use the negation map; the reason given by the authors is that dealing with fruitless cycles would have been too expensive in the SIMD environment they chose for the computation. In my talk I will explain the parallel version of Pollard's rho algorithm with the above-mentioned textbook optimizations. I will then describe why the theoretical speedups are not trivially achievable in practice. Finally, I will present results of joint work with Daniel J. Bernstein and Tanja Lange showing how the theoretical sqrt(2) speedup from the negation map can be achieved in practice in exactly the SIMD computing environment that was used to solve the 112-bit elliptic-curve DLP.
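The additive (r-adding) walk at the heart of the method fits in a few lines. The toy sketch below recovers a discrete log in a small prime-order subgroup of (Z/107Z)*; all parameters (p, q, g, the secret, r = 8) are our own illustrative choices, and it shows only the basic walk, not the negation-map or SIMD techniques of the talk.

```python
import random

p, q = 107, 53            # toy setting: <g> is the order-53 subgroup of (Z/107Z)*
g = 4                     # generator of the subgroup of squares
secret = 37
h = pow(g, secret, p)     # DLP instance: recover `secret` from (g, h)

random.seed(1)
r = 8
# precomputed walk steps M_j = g^a_j * h^b_j; the next step depends only on x mod r
steps = []
for _ in range(r):
    aj, bj = random.randrange(q), random.randrange(q)
    steps.append((pow(g, aj, p) * pow(h, bj, p) % p, aj, bj))

def rho():
    while True:                               # restart on a degenerate collision
        a, b = random.randrange(q), random.randrange(q)
        x = pow(g, a, p) * pow(h, b, p) % p
        seen = {x: (a, b)}
        while True:
            m, aj, bj = steps[x % r]          # additive walk: multiply by a fixed step
            x = x * m % p
            a, b = (a + aj) % q, (b + bj) % q
            if x in seen:
                a2, b2 = seen[x]
                if (b - b2) % q == 0:
                    break                     # b = b2: no information, restart
                # g^a h^b = g^a2 h^b2  =>  a + k*b = a2 + k*b2 (mod q)
                return (a2 - a) * pow(b - b2, -1, q) % q
            seen[x] = (a, b)

k = rho()
```

A real implementation would use distinguished points and parallel walks instead of a hash table, which is where the fruitless-cycle issue arises.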

9 juin 2011 - Jean-Gabriel Kammerer (DGA - IRMAR) - Deterministic parameterizations of plane curves. Abstract: Much attention has been focused recently on the problem of computing points on a given elliptic curve over a finite field in deterministic polynomial time. This problem arises very naturally in many cryptographic protocols, when one wants to hash messages into the group of points of an elliptic curve. Since hashing into finite fields is easy, the main step is to encode a field element into a point of the curve. The difficulty is to deterministically find a field element x such that some polynomial in x is a square. In this presentation, we will quickly review the encodings proposed by Icart in 2009, and then in 2010 by Kammerer, Lercier and Renault on the one hand and by Farashahi on the other. We will then explain how the geometry of the nine flex tangents to a cubic closely relates to these parameterizations.
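Icart's encoding admits a compact sketch. Over F_p with p ≡ 2 (mod 3), cubing is a bijection, so every element has a unique, efficiently computable cube root, and the map is deterministic. The tiny parameters below (p = 11 and the curve coefficients) are our own choices for illustration.

```python
p = 11                      # toy prime with p % 3 == 2, so cube roots are unique
a, b = 1, 6                 # E: y^2 = x^3 + x + 6 over F_11

def cube_root(z):
    # for p = 2 (mod 3), z -> z^((2p-1)/3) inverts cubing
    return pow(z, (2 * p - 1) // 3, p)

def icart(u):
    """Icart's deterministic encoding of u in F_p* to a point of E(F_p)."""
    v = (3 * a - u**4) * pow(6 * u, -1, p) % p
    x = (cube_root((v * v - b - pow(u, 6, p) * pow(27, -1, p)) % p)
         + u * u * pow(3, -1, p)) % p
    y = (u * x + v) % p
    return x, y

# every nonzero u deterministically lands on the curve
for u in range(1, p):
    x, y = icart(u)
    assert y * y % p == (x**3 + a * x + b) % p
```

The assertion in the loop is exactly the "polynomial in x is a square" condition resolved by construction: y = ux + v is built so that x^3 + ax + b is its square.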

6 mai 2011 (horaire exceptionnel : 11h) - Céline Chevalier (LSV - ENS Cachan) - Composition of password-based protocols. Abstract: In this talk, we investigate in the symbolic model the composition of protocols that share a common secret. This situation arises when users employ the same password for different services. More precisely, we study whether resistance against guessing attacks is preserved under composition when the same password is used. As in [DKR08], we model guessing attacks using a common definition based on static equivalence in a cryptographic process calculus close to the applied pi calculus. Resistance against guessing attacks composes in the presence of a passive attacker. However, composition does not preserve resistance against guessing attacks in the presence of an active attacker. We therefore propose a simple syntactic criterion under which we show this composition to hold. Finally, we present a protocol transformation that ensures this syntactic criterion and preserves resistance against guessing attacks.

5 mai 2011 - 14h30 : Benoît Gérard (post-doc, UCL) - Shannon entropy: a generic tool for analysing attacks. Abstract: Cryptanalysis can be viewed as a communication problem. Indeed, a plaintext X is encrypted using a key K to obtain a ciphertext Y. The attacker's goal is to recover information on X knowing Y, or to recover information on K knowing (X,Y). The maximum information the attacker can extract from the observations is quantified by the mutual information between these variables: I(K;(X,Y)) for key recovery, I(X;Y) otherwise. Information theory has been used, for instance, to prove the unconditional security of the one-time pad; in that case it is easy to show that I(X;Y)=0. Shannon entropy metrics may also be used for a first analysis of a cryptanalysis. In 2009, entropy was used both to unify the analysis of side-channel attacks and to analyse multiple linear cryptanalysis, leading to promising results. This presentation focuses on block-cipher cryptanalysis, for which many interesting results can be obtained. In that case, entropy is not directly computable but can be tightly approximated using a simple bound on mutual information. First, we will introduce Shannon entropy and mutual information. Then, we will focus on the mutual-information decomposition bound that is the cornerstone of the derivation of an estimate for the entropy. Finally, we will present and discuss the results obtained by applying this bound to some examples of cryptanalysis.
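The one-time-pad claim I(X;Y) = 0 can be checked directly on a 1-bit example by computing the mutual information from the joint distribution (a sketch on a toy distribution of our own making):

```python
from itertools import product
from math import log2

# joint distribution of (X, Y) for a 1-bit one-time pad: Y = X xor K,
# with X and K independent and uniform on {0, 1}
pxy = {}
for x, k in product((0, 1), repeat=2):
    y = x ^ k
    pxy[(x, y)] = pxy.get((x, y), 0) + 0.25

# marginals of X and Y
px = {x: sum(v for (xx, _), v in pxy.items() if xx == x) for x in (0, 1)}
py = {y: sum(v for (_, yy), v in pxy.items() if yy == y) for y in (0, 1)}

# I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x)p(y)) )
mi = sum(v * log2(v / (px[x] * py[y])) for (x, y), v in pxy.items())
# mi is 0: the ciphertext carries no information about the plaintext
```

Every term vanishes because p(x,y) = p(x)p(y) for all pairs, i.e. X and Y are independent.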

4 mai 2011 - 10h30 : Anja Becker (UVSQ) - Improved Generic Algorithms for Hard Knapsacks (joint work with Jean-Sébastien Coron and Antoine Joux). Abstract: At Eurocrypt 2010, Howgrave-Graham and Joux described an algorithm for solving hard knapsacks of density close to 1 in time O^~(2^{0.337n}) and memory O^~(2^{0.256n}), thereby improving on a 30-year-old algorithm by Shamir and Schroeppel. Our new technique brings the running time down to O^~(2^{0.291n}). The knapsack instance is divided into two halves with possible overlap, as in the Howgrave-Graham--Joux algorithm, but the set of possible coefficients is extended from {0,1} to {-1,0,+1}. This means that a coefficient -1 in the first half can be compensated by a coefficient +1 in the second half, resulting in a coefficient 0 of the golden solution. To reveal the golden solution, we therefore search for one decomposition (out of many) of the solution by solving two knapsacks. Adding (a few) -1 coefficients brings an additional degree of freedom that makes it possible to decrease the running time further. To explain the idea, we will look at a practical example.
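The baseline that all of these algorithms refine is the classic meet-in-the-middle approach: split the instance in two, tabulate all subset sums of one half, and look up complements for the other, giving time 2^{n/2} instead of 2^n. The sketch below (toy instance of our own, not the {-1,0,+1} technique of the talk) illustrates the "solve two knapsacks" idea in its simplest form.

```python
from itertools import combinations

# toy knapsack instance; the target is a known subset sum by construction
weights = [41, 34, 19, 27, 8, 53, 12, 6]
target = 41 + 19 + 8 + 12          # = 80

half = len(weights) // 2
L, R = weights[:half], weights[half:]

def subset_sums(items):
    """Map each achievable sum of `items` to one index set realizing it."""
    sums = {}
    for r in range(len(items) + 1):
        for c in combinations(range(len(items)), r):
            sums.setdefault(sum(items[i] for i in c), c)
    return sums

left = subset_sums(L)              # 2^(n/2) table for the first half
solution = None
for s, c in subset_sums(R).items():      # meet in the middle on the second half
    if target - s in left:
        solution = [L[i] for i in left[target - s]] + [R[i] for i in c]
        break
assert sum(solution) == target
```

The Howgrave-Graham--Joux and Becker--Coron--Joux improvements come from allowing the two halves to overlap and to use richer coefficient sets, so that the golden solution has many decompositions and only a fraction of each table must be built.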

- 15h : Gaëtan Leurent (Université du Luxembourg) - Analysis of some SHA-3 candidates. Abstract: Hash functions are essential primitives in modern cryptography, used in many protocols and standards. Due to attacks on the most widely used hash functions, the SHA-3 competition was launched by NIST in 2008 to select a new hash function for standardisation. This ongoing effort is focusing the attention of many people in the symmetric-crypto community, from designers to cryptanalysts. In this talk we will describe two recent attacks on second-round SHA-3 candidates. The first attack is based on a cancellation property of some generalized Feistel schemes, and leads to a pseudo-preimage attack on the full compression function of SHAvite-3-512. The same ideas can be used to attack reduced versions of Lesamnta, and also to target some generalized Feistel schemes in a block-cipher setting. The second attack is a near-collision attack on the full compression function of Blue Midnight Wish (BMW). We describe some tools for the analysis of systems of additions and XORs used in the attack, which can also be useful for the analysis of other ARX designs.

10 mars 2011 - Naomi Benger (UVSQ) - Tools for Pairing-Based Cryptography (PBC). Abstract: PBC is fundamentally different from most other public-key cryptography. For RSA, or for schemes based on the discrete logarithm problem (whether in a finite field or in the group of points of an elliptic curve), an efficient implementation can be written that works well for any level of security. Of course, an implementation specially tailored for, and hard-wired to, a particular level of security will perform somewhat better, but not spectacularly so. For PBC, an efficient implementation at the 80-bit level of security will be completely different from an implementation at the 128-bit level, and very little code will be reusable between the two. This makes the development and maintenance of good-quality pairing code difficult, and there is a compelling case for a cryptographic compiler that can generate good-quality code for each case. I will present some of the tools developed to contribute to the construction of such a compiler.

12 janvier 2011 - Damien Robert (INRIA - Bordeaux) - Computing isogenies and applications in cryptography. Abstract: Elliptic curves provide an efficient public-key cryptosystem. An important tool in the study of elliptic curves is Vélu's formulas, which make it possible to compute explicit isogenies between them. In this talk, I will explain how to generalize Vélu's formulas to abelian varieties (joint work with Romain Cosset and David Lubicz). I will then describe AVIsogenies, a fully open-source implementation of isogeny computation between Jacobians of genus-2 curves. The interest for cryptography is that genus-2 curves allow one to work over fields of half the size, for the same security as with elliptic curves. The talk will be in French, but the slides will be in English.
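For the elliptic case, Vélu's formulas fit in a few lines for a small example. The sketch below (toy parameters of our own: p = 13, a 3-isogeny with kernel generated by (0, 2) on y^2 = x^3 + 4) computes the image curve and the x-coordinate map; it illustrates only the elliptic-curve case, not the genus-2 generalization of the talk.

```python
p = 13
a, b = 0, 4                 # E: y^2 = x^3 + 4 over F_13
xq, yq = 0, 2               # kernel generator: (0, 2) has order 3 on this curve

# Velu's sums, taken over one representative per pair {Q, -Q} in the kernel
t = (6 * xq * xq + 2 * a) % p
u = (4 * yq * yq) % p
w = (u + xq * t) % p

a2 = (a - 5 * t) % p        # image curve: y^2 = x^3 + a2*x + b2
b2 = (b - 7 * w) % p

def velu_x(x):
    """x-coordinate of the image of a point (x, y) under the 3-isogeny."""
    d = (x - xq) % p
    return (x + t * pow(d, -1, p) + u * pow(d, -2, p)) % p
```

For instance, (2, 5) lies on E, and its image x-coordinate velu_x(2) satisfies the image-curve equation over F_13 (its y-value x^3 + a2*x + b2 is a square mod p).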