Abstract: There exist fast variants of the gcd algorithm, all based on principles due to Knuth and Schönhage. On inputs of size n, these algorithms use a divide-and-conquer approach, perform FFT multiplications of cost μ(n), and stop the recursion at a depth slightly smaller than lg n. A rough estimate of the worst-case complexity of these fast versions provides the bound O(μ(n) log n); even this worst-case estimate is partly based on heuristics and has not actually been proven. Here, we provide a precise probabilistic analysis of some of these fast variants, and we prove that their average bit-complexity on random inputs of size n is Θ(μ(n) log n), with a precise remainder term and estimates of the constant in the Θ-term. Our analysis applies whenever the cost μ(n) is of order Ω(n log n), and is valid both for the FFT multiplication algorithm of Schönhage–Strassen and for the algorithm introduced quite recently by Fürer [Fürer, M., 2007. Faster Integer Multiplication. In: Proceedings of STOC'07, pp. 57-66]. We view such a fast algorithm as a sequence of what we call interrupted algorithms, and we obtain two main results about the (plain) Euclid Algorithm which are of independent interest. We precisely describe the evolution of the distribution of numbers during the execution of the (plain) Euclid Algorithm, and we exhibit an (unexpected) density ψ which plays a central rôle, since it always appears at the beginning of each recursive call. This strong regularity phenomenon shows that the interrupted algorithms are locally "similar" to the total algorithm, and it ultimately leads to a precise evaluation of the average bit-complexity of these fast algorithms. This work uses various tools, and is based on a precise study of generalised transfer operators related to the dynamical system underlying the Euclid Algorithm.
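As context for the objects analysed above, the (plain) Euclid Algorithm and the notion of an interrupted run can be sketched in Python. The function names, the quotient bookkeeping, and the `stop_bits` stopping rule are illustrative choices, not taken from the paper; a real fast gcd chains such interrupted runs inside a divide-and-conquer scheme with FFT multiplications.

```python
def euclid(u, v):
    """Plain Euclid Algorithm: return gcd(u, v) together with the
    sequence of quotients produced by the successive divisions."""
    quotients = []
    while v != 0:
        q, r = divmod(u, v)   # one division step: u = q*v + r
        quotients.append(q)
        u, v = v, r
    return u, quotients


def interrupted_euclid(u, v, stop_bits):
    """Hypothetical 'interrupted' variant: perform division steps only
    while the smaller operand still exceeds stop_bits bits, then return
    the current pair (whose gcd equals the gcd of the original pair)."""
    while v != 0 and v.bit_length() > stop_bits:
        u, v = v, u % v
    return u, v
```

For instance, `euclid(252, 198)` returns the gcd 18 with quotients [1, 3, 1, 2], while `interrupted_euclid(252, 198, 5)` stops early at the pair (36, 18), which still has gcd 18.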