Assume that in the future there were a functioning 1024-qubit quantum supercomputer that could run Shor's algorithm or Grover's algorithm to crack encryption very quickly. I'm interested in how the number of qubits translates into a performance improvement over a regular 2-bit computer.

For example, if I used Shor's algorithm on a 4-qubit quantum supercomputer, would it take half the time to factor 1024-bit RSA that a regular 2-bit supercomputer would? And if we extrapolate upwards from an 8-qubit supercomputer through to 512 qubits, 1024 qubits, and even 2048 qubits, what sort of factorization speed increase would you get from adding more qubits? I originally thought quantum computers could only have 4 qubits, but it seems these days you can keep adding qubits up to whatever number is technically feasible. Does this mean that if I had a 1024-qubit supercomputer I could factor 1024-bit RSA in a split second? At what speed could it check possible factorizations?

Thanks CodesInChaos and Uwe. @e-sushi I don't think question 1 has been answered. What I want to know is: using Shor's algorithm on a 4-qubit quantum supercomputer, would this take half the time to factor 1024-bit RSA as it would on a regular 2-bit supercomputer? Then if we extrapolate upwards from an 8-qubit supercomputer through to 1024 qubits and even 2048 qubits, what sort of factorization speed increase do you get from adding more qubits? I originally thought quantum computers could have only 4 qubits, but it seems these days you can keep adding qubits up to the amount you want within technical reason.
–
user7827 Jul 30 '13 at 21:52

@user7827: With the current error rates of quantum computers and the currently known techniques for error correction (which give a polynomial blowup), you need a quantum computer with about $10^9$ qubits to factor a 1024-bit RSA modulus. (Source: a colleague of mine who recently visited the workshop "From Quantum Matter to Quantum Information".)
–
j.p. Jul 31 '13 at 8:19

1 Answer
1

Adding more qubits does not increase the computation speed. A quantum computer with 4 qubits does not factorize faster than one with 2. The qubits are the "memory" of the quantum computer: more qubits mean you can factor bigger numbers. If I remember correctly, you need a superposition of $\Theta(N^2)$ terms, which means $\Theta(\log(N^2)) = \Theta(\log N)$ qubits, to factor $N$.
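To make that scaling concrete, here is a minimal sketch (the function name is my own, and it counts only the logical qubits of the period-finding register, ignoring ancilla qubits and error-correction overhead):

```python
def qubits_needed(n_bits):
    # The period-finding register holds a superposition over Theta(N^2)
    # values, where N is an n_bits-bit modulus, so it needs
    # Theta(log2(N^2)) = 2 * n_bits logical qubits.
    return 2 * n_bits

print(qubits_needed(1024))  # 2048 logical qubits for a 1024-bit RSA modulus
```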
The running time of Shor's algorithm is $O((\log N)^3)$ to factorize $N$. What is important to remember is that Shor's algorithm can only factorize (by solving the order-finding problem); it does not speed up arbitrary computations. See Wikipedia's entry on Shor's algorithm.

As for Grover's algorithm, it provides a quadratic advantage over classical computers for "black-box" queries. So a quantum computer could perform a brute-force attack in $O(\sqrt{N})$ trials whereas a classical computer would need $O(N)$ trials. Again, increasing the number of qubits does not lower the running time, but increases the "memory" of the quantum computer. In order to run Grover's algorithm to brute-force a key, you need a superposition of all keys, which requires $\log_2 K$ qubits where $K$ is the number of possible keys.

Shor's algorithm consists of two parts, one classical and one quantum. The quantum part does the following: given $a$ and $N$ such that $\gcd(a,N)=1$, find $r$ such that $a^x \equiv a^{x+r} \pmod N$ for all $x$. This is done using the quantum Fourier transform. The classical part uses this $r$ to factorize $N$, i.e., it reduces the problem of factoring to the order-finding problem.
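The classical post-processing step is easy to write down. Here is a minimal sketch (the function name is my own; it assumes the period $r$ has already been produced by the quantum subroutine):

```python
import math

def factor_from_period(N, a, r):
    # Classical part of Shor's algorithm: given the order r of a modulo N
    # (from the quantum period-finding subroutine), try to split N.
    if r % 2 != 0:
        return None            # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None            # a^(r/2) = -1 (mod N): retry with another a
    # N divides (y - 1)(y + 1) but neither factor alone, so the gcds
    # yield non-trivial divisors of N.
    p = math.gcd(y - 1, N)
    if 1 < p < N:
        return p, N // p
    q = math.gcd(y + 1, N)
    if 1 < q < N:
        return q, N // q
    return None

# Toy example: the order of 7 modulo 15 is 4, and the post-processing
# recovers the factors 3 and 5.
print(factor_from_period(15, 7, 4))  # (3, 5)
```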
–
Philippe Lamontagne Aug 23 '13 at 19:05