There is an ongoing debate on the leading mathematics blog Gödel's Lost Letter between Gil Kalai and Aram Harrow, with the former arguing that building a quantum computer may not be possible due to noise propagation, and the latter arguing to the contrary.

I am wondering whether there is any argument showing that building a quantum computer is possible, by virtue of showing that quantum computation is already evident in the physical world.

So the question is:

(A) Are there any known examples of physical interactions where macro-level state transitions can be determined to be in correspondence only with an underlying quantum computation? That is, in the same way that Shor's algorithm is exponentially faster than any known classical factoring algorithm, are there known physical processes (for example, perturbation stabilization in a very large particle cluster) that could be shown, assuming P ≠ NP, to be efficiently solvable only by a quantum computation?

Some additional, admittedly highly speculative, questions would then be:

(B) Is the speed-of-light barrier possibly a natural computational limit of our particular universe, so that for the computational complexity class of quantum mechanics, operating on an underlying relational, network-like spacetime structure, this is the maximum speed at which the computational rules can move a particle/wave representation through a network region of lowest energy/complexity (i.e. a vacuum)?

(C) Is quantum mechanics an actual necessity for the universe to follow classical physical laws at the macro level? The informal argument being that in many-to-many particle interactions at the quantum level, only the capability of each particle to compute in parallel an infinite (or quasi-infinite) number of paths is what allows the universe to resolve a real-time solution at the macro level.

Requesting references to research along these lines, or any arguments to support or contradict these speculations.

This book seems relevant: Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos, by Seth Lloyd. I would assume my speculations are covered there, but I have not yet read it.
– Halfdan Faber, Mar 10 '12 at 4:51

Lloyd's book has lots of interesting material, particularly related to Shannon entropy from a computational point of view. He doesn't seem to cover my speculations, though.
– Halfdan Faber, Mar 10 '12 at 21:16

4 Answers

The answer to this question is a surprising no, and this is not because we don't have enough quantum systems. We have plenty. The problem is that if a natural system with a large number of particles is implementing a computation that requires exponential resources, we can't do the computation to check whether quantum mechanics is accurate. Quantum mechanics might be failing all the time in highly excited, highly entangled nuclear states, but we wouldn't know it, because we can't compute the exact energy levels; we can only rely on experiment.

First, for A: every quantum system with a large number of particles and strong interactions is implementing a nontrivial quantum computation, but we can't check whether it is doing so correctly. For example, if you excite a uranium nucleus to a very highly excited state, so that it can emit x-rays, neutrons, and protons, and look at the radiated spectrum of stuff, the amplitudes for emission are the highly complicated products of a 250-particle system with impossible-to-calculate entanglements. These calculations simply can't be done by any classical computer, so we just wouldn't know whether quantum mechanics is failing. But yes, a uranium nucleus in a 700 MeV excited state is performing an impossibly complex quantum computation that we couldn't reproduce even with a computer the size of the universe.
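To get a feel for the numbers behind "impossibly complex", here is a back-of-the-envelope sketch (an editorial illustration, not part of the original answer: it treats each of N nucleons as a single two-state system, which is a drastic oversimplification of real nuclear degrees of freedom, and the storage comparison is only a rough order-of-magnitude guess):

```python
import math

# The number of amplitudes in a generic entangled state of N two-state
# particles grows as 2^N, so brute-force classical simulation becomes
# hopeless long before N reaches 250.
for n in (10, 50, 100, 250):
    dim = 2 ** n  # dimension of the joint Hilbert space under this toy model
    print(f"N = {n:3d} two-state particles -> ~10^{int(math.log10(dim))} amplitudes")

# For comparison, all digital storage on Earth is very roughly of order
# 10^23 bytes, so even N ~ 100 is already far out of reach for an explicit
# state-vector simulation.
```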

For B: your question is nonsensical as stated, but the speed of light does limit the information-transfer speed in a computer. This is not much of a limitation in principle, because it just says that a computation step which moves data from point A to point B will take a time proportional to the distance between A and B. This has no bearing on the computational complexity, because you can do the motion in polynomial time in the size of your memory, even if it is inefficiently laid out in a straight line. This is a red herring. The words "this is the maximum speed a massless particle can compute a resolved path for when traveling through a vacuous quantum field" are meaningless.

For C: the answer here is no--- you can just have classical mechanics, which does not require infinite sums to resolve the answer. The idea that quantum mechanics is required for reproducing classical definite answers is strange, because it is actually mysterious how this happens. In order to produce definite results from the quantum-superposition muddle, you need to assume that we are splitting into separate entities in a many-worlds picture, or else to put in definite-making laws by hand which do the same thing effectively. If nature is fundamentally classical, this is not going to matter.

Comments on the Linked Discussion

The argument Gil Kalai makes is interesting, but it is phrased poorly. Christopher Moore made the point cogently in the first of the comments here: http://rjlipton.wordpress.com/2012/01/30/perpetual-motion-of-the-21st-century/, and I do not want to repeat too much of it. When you propose that quantum computation will fail, you are proposing that quantum mechanics is incorrect, and that the failure occurs for highly entangled, large physical systems.

The argument against quantum mechanics from the implausibility of a physical system doing an exponential computation is completely different from other arguments against quantum mechanics. The philosophical principle is that nature can't be that much more computationally rich than we are, because this would introduce a mystical element of in-principle uncomputability into large quantum systems in nature. This principle is new as far as the literature is concerned, but it is not due to Gil Kalai. I first heard it from the CS student Abram Connely a decade ago; it was his personal beef with quantum mechanics. I found it a persuasive and interesting point, and I tried to give it an exposition in my answer here: Consequences of the new theorem in QM?. The precise formulation Kalai gives is interesting, but formulated in a sub-optimal way.

In order to believe that quantum computation is impossible, you absolutely require a new law of physics which replaces quantum mechanics, or at least a principle that determines how quantum mechanics fails. The statement that the failure is fundamental, because the universe can't be that complicated, requires you to at least try to specify how the universe can be simplified.

It is incorrect to argue that simple implementation noise makes quantum computation infeasible without proposing a law of nature that forbids quantum-computing entanglements. The reason is that you can just remove the noise by cooling down the system and making the parts more precise. There is no in-principle limit on quantum computer size, even without error correction. Quantum error correction is central to implementation in practice, but in principle you can just imagine a perfect computer and come closer and closer to it in a colder and colder system, with no limit except how much you are willing to spend.

A failure of quantum mechanics that only affects mutual entanglements of a large number of quantum particles could easily have escaped detection, but when proposing modifications to quantum mechanics, one must check that they do not lead to things that would not have escaped detection: failure of energy conservation, failure of few-particle coherence, irreversible information loss in few-body systems, friction in atomic motion, and all sorts of other things.

In order to check these things, it is insufficient to formulate the computational-failure principle in terms of an abstract computing device. One must show how this principle modifies real atomic-scale wavefunction dynamics. The idea that this is a nonlinearity in the Schrödinger equation is just bad, so if you are proposing such a modification, it should be because the Schrödinger equation is an emergent description of a fundamentally classical system.

These ideas are due to 't Hooft, who is also skeptical of quantum computation, mostly for the same reason Einstein was skeptical of quantum mechanics. 't Hooft has made several attempts at a model for how to replace quantum mechanics with a system that would not be capable of exponential computation, and if one is proposing fundamental decoherence (which is what Gil Kalai is doing), one should do so in the context of at least a speculation about the underlying substrate.

Thanks, Ron. +1. Agree that B is nonsensical as stated; I will rewrite it shortly to more accurately convey my intended meaning, though I think this will not change your answer. With respect to A, I take your answer to mean: yes, the universe implements quantum computations at the quantum level only, but at the macro level we can't verify this (i.e. it is effectively undecidable in our world), and also (B) it has no actual impact on macro-level physical evolution.
– Halfdan Faber, Mar 24 '12 at 14:39


@GrigoriStrassmann: Yes--- and this is why I am personally wary of quantum computation--- by definition we haven't been able to verify when a physical system is doing more computation than can fit in a classical computer the size of the universe. Shor's algorithm is a brilliant exception, because we can trivially check whether the answer is a factor of the big number. So we need to run Shor's algorithm on a 10,000-digit number at least once before we can be sure QM is the final theory. I have 60% confidence this will work, but maybe 't Hooft, Connely, and Kalai are right.
– Ron Maimon, Mar 24 '12 at 17:06

I would agree that Shor's algorithm is brilliant, but since factoring is believed to belong to the NP-intermediate class of problems, the behavior is as expected for NP problems: solutions are hard to compute but easy to verify. So assuming a proof of P ≠ NP, wouldn't a working quantum computer factoring very large numbers in polynomial time be a definitive "yes" to question A?
– Halfdan Faber, Mar 25 '12 at 21:43

@GrigoriStrassmann: Yes, sort of, up to bad wording. To make it rigorous, you technically have to prove that factoring is not polynomially solvable (it is not known to be NP-complete, so proving P ≠ NP is not enough). For physics purposes it is obvious that, even if there is a complicated polynomial-time factoring algorithm, the only way nature could be factoring using Shor's algorithm (which is nothing more than a search as far as the factoring is concerned) is by trying out exponentially many multiplications at once, so you wouldn't need a proof of anything--- QM would just be shown to be exponential.
– Ron Maimon, Mar 26 '12 at 4:26
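(The asymmetry these comments lean on, that finding factors may be hard while auditing a claimed factorization is trivial, can be made concrete in a few lines. A minimal sketch as an editorial aside; the number and factors below are made-up illustrative values, not output from any real device:)

```python
def is_valid_factorisation(n, factors):
    """Check a claimed factorisation of n: every factor must be a non-trivial
    divisor and the product of all factors must equal n.  This check is cheap
    (polynomial in the number of digits), which is what makes factoring a good
    benchmark: finding factors may be hard, verifying a claimed answer is easy."""
    if any(f <= 1 or f >= n for f in factors):
        return False
    product = 1
    for f in factors:
        product *= f
    return product == n

# Hypothetical example values, chosen only to illustrate the check:
n = 1_000_036_000_099
claimed = [1_000_003, 1_000_033]
print(is_valid_factorisation(n, claimed))  # True: the product matches n
```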

I just wanted to comment, but it was getting too long. I wanted to say something about (A).

Spin flips are obviously in natural correspondence with quantum computations, and they occur all the time. Yet I would not dare to argue that they are "only in correspondence with quantum computations", for you could make them "correspond" to absolutely anything you want. In fact, you can also say that they correspond to classical coin tosses. A more quantum (perhaps not very meaningful) example: you "could" still always say the universe is an analog quantum computer which is simulating itself.

Maybe more relevant: why would any argument of this nature be a hint that quantum computers can be constructed? Assume that the argument you propose is valid in a strong sense and, in addition, that quantum computers cannot be constructed. Since classical computers can be constructed, you could then perfectly well argue that we "should" consider the universe to be a classical computer, using an $\epsilon$-far line of reasoning. Even if you do not believe in quantum computation, this does not look like a useful picture of reality for a modern physicist. Where do you put all the quantum effects that have been experimentally demonstrated?

The experiments of John Bush on pilot-wave hydrodynamics have evidently demonstrated that macro-scale replication of quantum effects is entirely possible; the Copenhagen interpretation may be in trouble, and the answer to your question may end up being "yes" after all, to the surprise of almost all of us.

That being said, until more experiments are done, the view expressed in this answer is unlikely to be very popular.

The following is exceedingly speculative, and some of the arguments are anthropomorphic, so read at your own tolerance level. It relates essentially to interpreting the physical world in terms of information theory, and possibly quantum measurement theory, rather than directly in terms of quantum mechanics.

If we consider space or spacetime as a statistical construction from finite information acquired over time, and there exists a lower discrete limit to time such as the Planck time, then there must in fact be such a speed limit (perhaps c, or some multiple of c) which arises naturally, since the observer cannot perceive objects traveling faster than the finite rate at which he or she can calculate the metric relationships between spacetime points. Traveling faster than this limit would be like trying to have your cake and eat it too: you would not be able to observe a faster-than-light object, because you would not have the time to perceive the space backdrop from the information received.

There might be very interesting loopholes to this idea which could allow FTL in certain circumstances, particularly if space can be created at a rate faster than the speed of light, as perhaps occurred in the early universe. One could also argue that FTL is possible in this scenario, just not directly observable. If c is the actual speed limit, one experimental effect one might expect is a mixing of the x, y, and z coordinates at speeds close to c, so that there should be a y/z contraction as well as the Lorentz x contraction.

Perhaps more interesting than setting a simple speed limit, however, is that the types of such statistically determined background spaces which could realistically be measured and determined by an observer might have deeper connections with gravity at large open scales and with O(N) groups at small closed scales. The Euclidean space we generally observe at intermediate scales between these two extremes has very simple and unique symmetry properties (rotation, translation, and inversion invariance), which one might expect to emerge naturally from any statistical construction over all possible spaces, much as Feynman's many paths merge toward the least-action principle.

At very large scales, however, there are most definitely dimensional (and likely topological) constraints on the perception of such a statistical space that would require asymmetries (and thus curvatures) to be introduced. We can, for example, approximate an observer who can collect only finite amounts of information about his space over time as a random walker who can observe one space point on a space lattice per unit time. It is a well-known fact that on an infinite lattice of more than two dimensions, the observer would only return to (observe) any one point or transition a finite number of times despite an infinite time for observation, and would thus be unable to statistically determine the metric of such a space! For generally finite (and thus closed and small) spaces this is not a problem, however, and it is perhaps the reason why we get interesting gauge groups like SU(3), etc., at small scales, whereas we perceive a simple and limited 2D projection of a 3D Euclidean space at larger scales and require curvatures/gravity at the largest scales.
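The lattice fact being appealed to here is Pólya's recurrence theorem: a simple random walk on $\mathbb{Z}^d$ returns to its starting point with probability 1 for $d \le 2$, but with probability strictly less than 1 for $d \ge 3$. A minimal Monte Carlo sketch of the qualitative difference (an editorial illustration; finite walk lengths only suggest the asymptotic behaviour, they do not prove it):

```python
import random

def returned_to_origin(dim, steps):
    """Simulate one simple random walk on the integer lattice Z^dim and report
    whether it revisits the origin within `steps` steps."""
    pos = [0] * dim
    for _ in range(steps):
        axis = random.randrange(dim)          # pick a random axis
        pos[axis] += random.choice((-1, 1))   # step +1 or -1 along it
        if all(x == 0 for x in pos):
            return True
    return False

# Estimate the return probability within a fixed horizon in 2D vs 3D.
trials, steps = 1000, 5000
for dim in (2, 3):
    hits = sum(returned_to_origin(dim, steps) for _ in range(trials))
    print(f"d = {dim}: returned within {steps} steps in {hits / trials:.1%} of walks")
```

With these parameters the 2D fraction is already high and keeps climbing toward 1 as the horizon grows, while the 3D fraction stalls well below 1, in line with the transience the answer relies on.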

It is perhaps also a telling anthropomorphism that we perceive open 2D spaces in two different ways: as a 2D screen-like projection in front of us, or as a linear horizon-like projection times a radial distance upon the surface of a gravitating body. The latter is much less direct, and the linearity (or lack thereof) appears to be limited and controlled by gravity, reaching an absolute on the surface of a black hole, where the observer and his reality are completely flattened. If there were a continuous/smooth connection between the two, this could form a new duality.

