Can quantum measurements beat classical computers?

Boson sampling hardware is ready to build now.

One of the two devices used to demonstrate that boson sampling works in the lab.

James C. Gates

We've covered a lot of the progress that has gone into creating effective quantum computers. Although there has been a good deal of progress in building individual components, we haven't been able to put them together into a complete device. That has left us building small, proof-of-concept systems that are handily outperformed by existing classical systems. Without a large enough system, we can't clearly demonstrate the sort of accelerated performance we predict we'll get from quantum hardware.

Two groups of researchers have now figured out a way to test whether quantum systems really can outperform a classical computer. Their systems, which take advantage of a phenomenon called boson sampling, can't be used to compute any algorithms, so they're not as useful as quantum computers might be. But they can be used to confirm that we're on the right track when it comes to quantum computers.

A quantum computer isn't like our existing computers, where electrons flow through a series of switches. Instead, a carefully prepared quantum system is allowed to evolve, and it is then measured. The system provides us with an answer only because we can map each of the possible states it can end up in to a different answer. Because quantum systems evolve very quickly, it should be possible for these systems to arrive at an answer much faster than a typical computer.

Emphasis on the "should be." Since we haven't been able to build a large system, all the problems we've solved have been so simple that a regular computer could solve them while the quantum mechanics are still setting everything up. What the new papers provide is a test case, a system that we can scale now. It won't provide any general computation abilities, but it will be able to give us a clear yes-or-no answer on whether quantum systems can outperform a classical computer.

Both devices are remarkably similar. Single photons (photons are one example of a boson) enter a system where they run into partial mirrors called beamsplitters, which have a 50/50 probability of reflecting the photons or letting them pass through. At random, the beamsplitters send the photons down one of a series of possible paths—in the demonstration devices, the photons should encounter three beamsplitters before they exit the device and are detected. Because of the quantum nature of the photon, even if it's the only one in the device, it can still create an interference pattern with itself, which makes calculating the probability of its emerging from each of the exit points rather complex.

Now, make the system bigger, with more potential paths. Then start increasing the number of photons that are sent in at the same time, so that they not only interfere with themselves, but with each other. The difficulty of calculating the probabilities of where photons come out of the device goes up very rapidly—so rapidly that it quickly becomes faster to just measure where the photons come out enough times to figure out the probabilities.

(In fact, the calculations are a problem that falls into the category of #P-complete, a class of counting problems believed to be at least as hard as the notoriously intractable NP-complete problems.)
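The hardness mentioned above can be made concrete. For identical photons passing through a linear-optical network, the probability of each detection pattern is proportional to the squared magnitude of a matrix permanent, and the best known exact classical algorithm for the permanent (Ryser's formula) takes on the order of 2^n·n operations for n photons. The sketch below is illustrative only (the 50/50 beamsplitter matrix is the standard textbook form, not taken from either paper); it also reproduces the famous Hong-Ou-Mandel effect, where two identical photons never exit separate ports of a 50/50 beamsplitter:

```python
import math

def permanent(a):
    """Matrix permanent via Ryser's formula: O(2^n * n) work, the best
    known exact classical algorithm -- the exponential cost that makes
    boson sampling hard to simulate classically."""
    n = len(a)
    total = 0.0
    for subset in range(1, 1 << n):  # every nonempty subset of columns
        prod = 1.0
        for row in a:
            prod *= sum(row[j] for j in range(n) if subset >> j & 1)
        # Ryser sign: (-1)^(n - |subset|)
        total += prod if (n - bin(subset).count("1")) % 2 == 0 else -prod
    return total

# Standard 50/50 beamsplitter unitary (an assumption for this demo,
# not the interferometers used in the papers).
r = 1 / math.sqrt(2)
bs = [[r, r],
      [r, -r]]

# Two identical photons, one per input port. The amplitude for detecting
# one photon at each output port is perm(bs), so the coincidence
# probability is its squared magnitude.
p_coincidence = abs(permanent(bs)) ** 2
print(p_coincidence)  # Hong-Ou-Mandel: identical photons bunch, so ~0
```

Scaling this up means taking permanents of larger and larger submatrices of the interferometer's unitary, which is exactly where classical simulation becomes infeasible while the photons themselves just keep interfering.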

In these papers, the authors keep things rather small. One device used three identical photons sent in at once through any of six possible entry sites, with the paths etched into a silica-on-silicon waveguide; the other had three entry points and six exits. In this configuration, calculating the exit probabilities was rather simple, and doing so was faster than performing the quantum measurement.

But the key thing is that the calculations and the actual measurements agreed. Technical issues like the loss of photons in the equipment or the tendency of photon sources to sometimes send more than one into the device at a time weren't problematic enough that the calculations disagreed significantly with the real-world measurements.

Matthew Broome, from Australia's University of Queensland and lead author of one of the two papers, told Ars that, although their device didn't outperform a classical computer, one that's able to handle roughly 20-30 photons should be able to beat a computer handily. These results show that there is nothing to stop us from building something of that size tomorrow. "The device is the only technologically feasible proposed quantum device that could demonstrate a superior computing advantage over a classical computer," Broome said.

Broome didn't see any obvious calculations that could be done with his device (and, from his perspective, that wasn't the point of the research anyway). But Ian Walmsley, an author of the other paper, suggested it might be possible to use it to simulate other, less conveniently behaved quantum systems. Beyond that, he said, "We'll have to see what our theorist colleagues come up with."

"There is extremely strong evidence to support that Boson Sampling cannot be simulated efficiently on a classical computer, so demonstrating the Boson Sampling algorithm in the lab with a real quantum computer is strong evidence to show that one can indeed harness a computational advantage with quantum physics," Broome said. And that would provide a clear indication that all the effort put into quantum computing could ultimately pay off.

26 Reader Comments

"There is extremely strong evidence to support that Boson Sampling cannot be simulated efficiently on a classical computer, so demonstrating the Boson Sampling algorithm in the lab with a real quantum computer is strong evidence to show that one can indeed harness a computational advantage with quantum physics," Broome said. And that would provide a clear indication that all the effort put into quantum computing could ultimately pay off.

Or the NSA decrypting my messages and the black helicopters hovering overhead would be proof.

Is it just me? I don't find this particularly impressive. An experiment is performed which shows it has potential to obtain results faster than calculation based on first principles. I can make a 'water computer' that performs fluid dynamic simulation much faster than a classical computer can. But there's not much merit in doing so, is there?

Is it just me? I don't find this particularly impressive. An experiment is performed which shows it has potential to obtain results faster than calculation based on first principles. I can make a 'water computer' that performs fluid dynamic simulation much faster than a classical computer can. But there's not much merit in doing so, is there?

I think the point is that scaling up the fluid dynamics simulation requires polynomially more computing power, while the complexity of simulating boson sampling is exponential in the number of bosons.

Maybe I'm dumb, but I try, and I am just really confused about the whole quantum computer thing. This article just confirms exactly why I am confused.

There are these things called analog computers. You set up a physical system that undergoes the same (mathematical) transformations as your target system, run it, and measure the result. There are many times in which doing so would be theoretically faster than a digital computer, because of the complexity of the math involved. But... the world is not full of analog computers, because the general purpose-ness of the general purpose computer wins out. As MobiusPizza points out, this is kind of a no duh.

So what we have in the referenced article is an analog computer. An analog computer that "calculates" the states of... the very same system as the analog computer. Who'd have thunk it?

So if quantum computers are just new analog computers, then maybe one day they will be great for... something. But if they are general purpose, algorithmic computers, where is the theory of them that says they will be great at anything new? Good old fashioned digital computers rely basically on (a) formal logic and (b) arithmetic. That's the whole Turing point. And now people are trying to create the same structures out of quantum stuff. Where is the new space of algorithms? Where is the new underlying math from which to build them? Why should I believe in them as anything but a shiny new form of analog computer, suited, as all analog computers are, to very narrow, mathematically defined problems?

Maybe I'm dumb, but I try, and I am just really confused about the whole quantum computer thing. This article just confirms exactly why I am confused.

There are these things called analog computers. You set up a physical system that undergoes the same (mathematical) transformations as your target system, run it, and measure the result. There are many times in which doing so would be theoretically faster than a digital computer, because of the complexity of the math involved. But... the world is not full of analog computers, because the general purpose-ness of the general purpose computer wins out. As MobiusPizza points out, this is kind of a no duh.

So what we have in the referenced article is an analog computer. An analog computer that "calculates" the states of... the very same system as the analog computer. Who'd have thunk it?

So if quantum computers are just new analog computers, then maybe one day they will be great for... something. But if they are general purpose, algorithmic computers, where is the theory of them that says they will be great at anything new? Good old fashioned digital computers rely basically on (a) formal logic and (b) arithmetic. That's the whole Turing point. And now people are trying to create the same structures out of quantum stuff. Where is the new space of algorithms? Where is the new underlying math from which to build them? Why should I believe in them as anything but a shiny new form of analog computer, suited, as all analog computers are, to very narrow, mathematically defined problems?

It is believed that the class of problems that can be solved by a quantum computer in polynomial time (BQP) is a strict superset of the problems in P but still less than the entirety of NP.

Is it just me? I don't find this particularly impressive. An experiment is performed which shows it has potential to obtain results faster than calculation based on first principles. I can make a 'water computer' that performs fluid dynamic simulation much faster than a classical computer can. But there's not much merit in doing so, is there?

So can you make a water computer that performs fluid dynamic simulations of a B1 bomber faster than a classical computer can?

"We've covered a lot of the progress that has gone into creating effective quantum computers. "

If I had been alive in 1912, I surely would not have had a clue about the practical importance of the theory of relativity. It is always possible that some similar event will happen this century. But what, if anything, that event might be is highly uncertain. By contrast, the development of ordinary computing, biotechnology, and likely some kind of nanotechnology built on top of biotechnology principles are more or less certain to have more of an impact on human society this century than any previous developments in technology. Those who are more interested in the ideology of science as some kind of semi-religious worship may want to focus on subjects like quantum computing and space exploration. But those interested in the impact of technology on human beings might want to prioritize more prosaic subjects: the developments in technology that are already in the pipe today and just have to be scaled up to radically transform the experience of human life.

So if quantum computers are just new analog computers, then maybe one day they will be great for... something. But if they are general purpose, algorithmic computers, where is the theory of them that says they will be great at anything new? Good old fashioned digital computers rely basically on (a) formal logic and (b) arithmetic. That's the whole Turing point. And now people are trying to create the same structures out of quantum stuff. Where is the new space of algorithms? Where is the new underlying math from which to build them? Why should I believe in them as anything but a shiny new form of analog computer, suited, as all analog computers are, to very narrow, mathematically defined problems?

Quantum computers are not faster versions of our current generalized computers. In no way, shape, or form. They probably won't even be used for the same sets of problems.

A quantum computer is used to solve certain algorithms that can't be solved (in reasonable timeframes) using classical computers. It isn't like setting up a fluid dynamics simulation, because fluid dynamics can be performed by classical computers in real time (provided you have a computer big enough). A problem that quantum computers could solve, however, could not be. Some of the problems quantum computers are aimed at literally could not be solved by a classical computer the size of the entire Earth before the universe ended, no matter how fast it was.

AndrewMilne is right. A classical computer of significant size can not solve fluid dynamics problems as fast as an analog simulation assuming perfect measuring accuracy of the analog simulation and infinitesimal cell size used for the digital computation. That computation is unbounded. A quantum computer differs from an analog computer only in one way. The quantum states are completely discrete. So measuring state probabilities in this case requires measurement of as many quantum events as necessary to get the desired probability accuracy. You may say no analog computer is perfectly measurable whereas the quantum computer would converge on an answer with any arbitrary accuracy with relatively few measurements required meaning that such measurements could be practically made. In that case the quantum computer is really a variation of analog computer. So it would likely be very application limited like the analog computer but would provide a practical way to get an answer of arbitrary accuracy using relatively few measurements (because of convergence) which an analog computer can not practically do. Worth pursuing? Probably for some things but it's unlikely to ever replace a general purpose computer.

AndrewMilne is right. A classical computer of significant size can not solve fluid dynamics problems as fast as an analog simulation assuming perfect measuring accuracy of the analog simulation and infinitesimal cell size used for the digital computation. That computation is unbounded. A quantum computer differs from an analog computer only in one way. The quantum states are completely discrete. So measuring state probabilities in this case requires measurement of as many quantum events as necessary to get the desired probability accuracy. You may say no analog computer is perfectly measurable whereas the quantum computer would converge on an answer with any arbitrary accuracy with relatively few measurements required meaning that such measurements could be practically made. In that case the quantum computer is really a variation of analog computer. So it would likely be very application limited like the analog computer but would provide a practical way to get an answer of arbitrary accuracy using relatively few measurements (because of convergence) which an analog computer can not practically do. Worth pursuing? Probably for some things but it's unlikely to ever replace a general purpose computer.

Quantum computers are universal, so in principle can be used for any calculation you like. But, as baloroth points out, quantum computers will probably be used only for very specific problems (like the simulation of quantum system), at least at first. And don't knock quantum simulations: they will offer the possibility of actually calculating material properties, rather than the current trial-and-error method. This will be truly revolutionary, hence the excitement.

Second, quantum computers are not some version of analog computers. The basic mechanism is completely different (quantum vs. classical mechanics). The reason analog computers don't exist is that they cannot be made fault tolerant, and quantum computers can. In fact, there is an analog version of a quantum computer using so-called continuous variables, which also does not have a fault tolerance theorem. So in my opinion it is entirely unhelpful to bring in analog computing to the discussion, because it is and always has been a non-starter due to bad noise properties, both in the classical and the quantum world.

"We've covered a lot of the progress that has gone into creating effective quantum computers. "

If I had been alive in 1912, I surely would not have had a clue about the practical importance of the theory of relativity. It is always possible that some similar event will happen this century. But what, if anything, that event might be is highly uncertain. By contrast, the development of ordinary computing, biotechnology, and likely some kind of nanotechnology built on top of biotechnology principles are more or less certain to have more of an impact on human society this century than any previous developments in technology. Those who are more interested in the ideology of science as some kind of semi-religious worship may want to focus on subjects like quantum computing and space exploration. But those interested in the impact of technology on human beings might want to prioritize more prosaic subjects: the developments in technology that are already in the pipe today and just have to be scaled up to radically transform the experience of human life.

How short-sighted; you argue we shouldn't do something with long-term rewards because there are short-term problems.

Quantum computing is in fact technology already in the pipe, it just has a longer pipe than you are comfortable with. There is nothing 'religious' about space exploration or quantum computing, there are real hard problems that both can/will/should solve that cannot be handled any other way.

Space exploration, as an example, requires we solve DNA damage due to radiation in outer space.

Doing so is likely to solve the same problems that currently cause old age and cancer (both due to errors accumulating over time) and possibly also tissue regeneration. The only difference is time and intensity; travel to Mars exposes you to more, sooner, than traveling around the Sun on planet Earth for several years.

That's all it boils down to. On the planet Earth we travel through space at a slow and measured pace.

Quantum computers are universal, so in principle can be used for any calculation you like. But, as baloroth points out, quantum computers will probably be used only for very specific problems (like the simulation of quantum system), at least at first. And don't knock quantum simulations: they will offer the possibility of actually calculating material properties, rather than the current trial-and-error method. This will be truly revolutionary, hence the excitement.

So I'm confused -- is the quantum computer in the article a universal computer? To me it seems to be able to calculate a certain specific thing, rather than be programmable.

Quantum computers are universal, so in principle can be used for any calculation you like. But, as baloroth points out, quantum computers will probably be used only for very specific problems (like the simulation of quantum system), at least at first. And don't knock quantum simulations: they will offer the possibility of actually calculating material properties, rather than the current trial-and-error method. This will be truly revolutionary, hence the excitement.

So I'm confused -- is the quantum computer in the article a universal computer? To me it seems to be able to calculate a certain specific thing, rather than be programmable.

The device in the article isn't a quantum computer, merely a device that exhibits certain properties in common with a quantum computer.

Said properties would imply it is profitable to build a quantum computer because it would be more efficient than using a classical computer to solve the problems exposed by said device.

Put another way, a punchcard loom isn't a computer, but is in fact similar and has similar properties to a computer, and it can be used to model a computer if desired.

So if quantum computers are just new analog computers, then maybe one day they will be great for... something. But if they are general purpose, algorithmic computers, where is the theory of them that says they will be great at anything new? Good old fashioned digital computers rely basically on (a) formal logic and (b) arithmetic. That's the whole Turing point. And now people are trying to create the same structures out of quantum stuff. Where is the new space of algorithms? Where is the new underlying math from which to build them? Why should I believe in them as anything but a shiny new form of analog computer, suited, as all analog computers are, to very narrow, mathematically defined problems?

Since that page doesn't really explain why quantum algorithms can be faster, under the spoiler is a description of the key difference between quantum and classic algorithms:

Spoiler: show

To give you the broad stroke picture of a typical quantum algorithm and how it differs from classical computing:

-In classical computing, the computer carries out a sequence of operations in a more or less linear fashion, eventually coming to a stop and spitting out the desired result.

-In quantum computing, the computer would set up a suite of entangled particles, act on the suite using some linear sequence of operations, and then measure the result on each particle to obtain the data. Because of the probabilistic nature of these systems, the computation needs to be run a few times to make the confidence interval tight.

The moral of the story is that, if you think of classical computing as being a single logical flow, entanglement allows us to do each operation once and yet obtain the results of doing that operation n times if you have n entangled particles. For those keeping score at home, that means that if you have d operations in a quantum algorithm on n particles, it is equivalent to doing n^d operations in a classical algorithm, hence why exponential-time classical algorithms become polynomial-time quantum algorithms.
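The "run the computation a few times to tighten the confidence interval" step above can be illustrated classically. This toy sketch is not a quantum simulation; the outcome probability 0.7 is an assumption for the demo. It just shows that repeated measurement shots estimate an outcome probability with error shrinking roughly like 1/sqrt(shots), which is why a handful of repetitions suffices:

```python
import random

# Assumed for the demo: measuring the prepared state yields outcome 1
# with true probability 0.7.
P_ONE = 0.7

def estimate(shots, rng):
    """Fraction of simulated measurements that collapsed to outcome 1."""
    hits = sum(1 for _ in range(shots) if rng.random() < P_ONE)
    return hits / shots

rng = random.Random(0)
for shots in (100, 10_000, 1_000_000):
    # The estimate converges on 0.7 as the shot count grows.
    print(shots, estimate(shots, rng))
```

The same statistics apply to the boson-sampling devices in the article: each run produces one detection pattern, and repeating the run builds up the probability distribution that the classical calculation struggles to produce.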

Is it just me? I don't find this particularly impressive. An experiment is performed which shows it has potential to obtain results faster than calculation based on first principles. I can make a 'water computer' that performs fluid dynamic simulation much faster than a classical computer can. But there's not much merit in doing so, is there?

I think the point is that scaling up the fluid dynamics simulation requires polynomially more computing power, while the complexity of simulating boson sampling is exponential in the number of bosons.

Unless the system is linear like... a beam splitter. Then the problem of calculating the probabilities with one photon is the same as calculating it with any number of photons.

I don't think that there will ever be a universal quantum computer, at the most an analogue quantum computer.

Or perhaps the "creators" of whatever is produced that works sufficiently will just redefine the word "computer" to fit what they ended up with (like quantum teleportation, which really should be called "quantum faxing").

It is believed that the class of problems that can be solved by a quantum computer in polynomial time (BQP) is a strict superset of the problems in P but still less than the entirety of NP.

There are reasons for the skepticism you find in people like Andrew and me.

1. Despite many years of effort, scientists have been unable to build one that can solve any problem that isn't ridiculously trivial. Progress has been so slow that it requires great patience to see if there's any progress at all.

2. Meanwhile it seems like each barely noticeable advancement is touted in the media as if it were a breakthrough and puts us on the eve of a great revolution in computing. But when you read the details you find out that the scientists have only managed to do what any fourth grader can do, in about the amount of time it would take to breed and educate your own fourth grader. (The fourth grader is more versatile and shows a great deal more potential for future progress.)

3. Most of us haven't seen or don't understand the math that describes how a quantum computer is supposed to work.

4. Even if we did understand it, articles about practical attempts to build such systems never include enough information to evaluate whether the systems that were built and tested realize what the math says they would need to be to be considered proper quantum computers.

Ouch, if you don't understand the math describing quantum computers, how do you expect to understand quantum computers?

First off, do you understand probability and superposition?

The basic principle is a physical configuration of atoms that is a superposition of all possible solution states. Sampling the atoms repeatedly gives you greater confidence that your solution is in fact correct.

ETA: the fundamental universality claim was already covered by Deutsch's original 1985 article (by making a connection to Turing machines), but I was lazy and grabbed the first paper that is relevant to your question.

It is believed that the class of problems that can be solved by a quantum computer in polynomial time (BQP) is a strict superset of the problems in P but still less than the entirety of NP.

There are reasons for the skepticism you find in people like Andrew and me.

1. Despite many years of effort, scientists have been unable to build one that can solve any problem that isn't ridiculously trivial. Progress has been so slow that it requires great patience to see if there's any progress at all.

Building a quantum computer is incredibly hard, because the required quantum states are tremendously fragile. You don't build something like that overnight. In the past decade, experimentalists have demonstrated control over many quantum systems, and right now the game is to decrease the noise levels for even better control. All this stuff requires talent, hard work, and a healthy amount of funding. It is all there, and as a result I would say quantum information processing is one of the fastest moving fields in science.

Quote:

2. Meanwhile it seems like each barely noticeable advancement is touted in the media as if it were a breakthrough and puts us on the eve of a great revolution in computing. But when you read the details you find out that the scientists have only managed to do what any fourth grader can do, in about the amount of time it would take to breed and educate your own fourth grader. (The fourth grader is more versatile and shows a great deal more potential for future progress.)

There is certainly a lot of attention in the media for quantum computing, and there is some justification for the hype, since each breakthrough is often another achievement necessary to reach the ultimate goal of building a useful quantum computer.

I can assure you that no fourth-grader can come up with and execute these experiments. You're probably thinking of the factoring of the number 15. But that is beside the point (see 1.).

Quote:

3. Most of us haven't seen or don't understand the math that describes how a quantum computer is supposed to work.

It does indeed require a lot of study to get to grips with this, and few have the time or motivation to delve into this. The rational thing to do in such instances is to rely on trusted sources like Ars Technica, Scientific American, the scientific literature, etc. What you probably shouldn't do is assume that your objections as a lay-person are completely on the mark and that we experts have completely missed this for two decades.

Quote:

4. Even if we did understand it, articles about practical attempts to build such systems never include enough information to evaluate whether the systems that were built and tested realize what the math says they would need to be to be considered proper quantum computers.

What are you asking for here? That a popular science article conveys the intricate details to what extent an experiment conforms to the theory of quantum computation? This is not the role of popular science articles. You need a textbook for that.