Waveguides make quantum computers more reliable

Researchers show that a waveguide-based quantum logic gate works as close to perfectly as the quantum world will allow.

Quantum computing is one of the current big things in both physics and computer science circles. But there is a serious divide between what we think might be possible and what we can, in fact, do. There are theorists out there working themselves into a frenzy, trying to show that quantum computing will make a smoother latte. On the experimental side, many researchers are still in various stages of single gate operations. It is like the difference between trying to make a valve and knowing what you can do with lots of valves once you have them.

In a recent paper, published in Applied Physics Letters, researchers from the UK and Australia have demonstrated that quantum computing gates with very low error rates, based on integrated optical circuits, are now feasible. This might pave the way for multi-gate optical quantum computers.

Quantum computing is, as the name might suggest, a merger between classical digital computers and the quantum freakiness that permeates the world around us at the smallest scales. In a classical computer, a bit can have two values: logic one and logic zero. When we perform operations on a string of bits, we either leave them unchanged or flip them, depending on some control bits. It is important to realize that the value of a bit at any particular time does not depend on any of its partner bits.

If we add a dash of quantumness to the mix, we can do two things. First, logic elements, now qubits, are no longer logic one or logic zero; instead, they are both at the same time. When we read out the result from a program, we obtain a definite one or zero, but during the computation, the qubit really is in both states. Operations don't necessarily flip bits. Instead, they modify the probability of a measurement returning a one or a zero. The second element added to the mix is correlations between qubits. When we perform an operation on one qubit in a string of them, we are actually performing an operation on all the qubits.
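The superposition idea can be sketched numerically. This is a generic illustration of my own, not anything from the paper: a qubit as a two-component vector of amplitudes, with an operation (here the standard Hadamard gate) that shifts the measurement probabilities rather than simply flipping the bit.

```python
import numpy as np

# A qubit as a 2-component state vector: amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0])

# The Hadamard operation puts a definite |0> into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

qubit = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(qubit) ** 2
print(p0, p1)  # each outcome is equally likely: 0.5 and 0.5
```

Reading out this qubit returns a definite one or zero, each half the time, even though the state itself carried both amplitudes at once.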

There are good and bad aspects to this. A quantum computer doesn't always return the right answer, but some operations, like factoring or database searches, can be sped up. Not returning the right answer comes from two factors. There is an intrinsic uncertainty associated with measurement—it's the price we pay for being in a quantum universe. There are also instrumental imperfections, which, at the moment, play a major role in limiting quantum computing.

This is where Laing and colleagues come in. They focused on the construction of near-perfect circuitry. In the case of optical quantum computing logic, this corresponds to making perfect beam splitters and interferometers.

These aren't the normal optics you might find in a microscope, which makes things both easier and more difficult. For instance, in a waveguide, a beam splitter is replaced by a directional coupler, where two waveguides are brought into close proximity. Over a certain length, light from one waveguide will leak into the adjacent waveguide. The amount of light that transfers depends on how close the two waveguides are and the distance they remain close. So, in principle, it is very easy to design a perfect beam splitter. In practice, fabrication uncertainty makes this a bit of a lottery—the usual procedure is to make quite a few, test them all, and pick the good one to report on.
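The coupler behavior described above follows a simple coupled-mode relation: the fraction of light that crosses over goes as the square of a sine in the product of coupling strength and interaction length. A sketch, with made-up numbers rather than values from the paper:

```python
import numpy as np

# Coupled-mode sketch of a directional coupler: the fraction of light
# that crosses into the neighbouring waveguide is sin^2(kappa * L),
# where kappa (coupling strength) depends on the gap between the guides
# and L is the length over which they stay close.
def cross_coupling(kappa, length):
    return np.sin(kappa * length) ** 2

kappa = np.pi / 4   # hypothetical coupling strength per unit length
length = 1.0        # chosen so kappa * L = pi/4, giving a 50:50 splitter
print(cross_coupling(kappa, length))  # 0.5: half the light crosses over
```

The lottery aspect is visible here too: a small fabrication error in the gap shifts kappa, and the splitting ratio moves away from the design value.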

Interferometers are similar, in that they involve splitting and recombining light beams. However, in addition to requiring two perfect beam splitters for the interferometer, one also needs to carefully control how far the light must travel between the two. In other words, the fabrication tolerances on the two different light paths are quite tight.

However, once you have these two elements, you can make a controlled NOT gate, one that inverts the quantum state of one qubit depending on the state of the controlling qubit. Combined with single-qubit operations, this is a logic element from which all other logic elements can be constructed. That is exactly what this paper demonstrates. They show that they have very low-loss waveguides, and that they can make beam splitters with a splitting ratio within a couple of percent of their design ratio.
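On two-qubit states, the controlled NOT has a compact matrix form. This is the textbook gate, shown as a literal matrix multiply for illustration; the paper of course implements it optically:

```python
import numpy as np

# The controlled NOT on two-qubit states, in the basis |00>, |01>, |10>, |11>
# (control qubit written first). It flips the target only when the control is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket10 = np.array([0, 0, 1, 0])   # control = 1, target = 0
print(CNOT @ ket10)              # ends up in |11>: the target was flipped
```

Acting on a superposition of control states, the same matrix entangles the two qubits, which is why this one gate is so central.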

To illustrate this, they showed data obtained from the quantum interference between single photons passing through their beam splitter. The error bars on the data are tiny, so within the uncertainty of their measurements, they have a perfect instrument.
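That single-photon interference measurement is a Hong-Ou-Mandel-type experiment, and the ideal outcome is easy to sketch. The 50:50 amplitudes below are textbook values, not numbers from the paper:

```python
import numpy as np

# Two identical photons entering a perfect 50:50 splitter always exit
# together, so coincidences at the two outputs drop to zero. The amplitude
# for one photon at each output is the sum of the both-transmit and
# both-reflect paths, which cancel for t = 1/sqrt(2), r = i/sqrt(2).
t = 1 / np.sqrt(2)
r = 1j / np.sqrt(2)
coincidence_amplitude = t * t + r * r
print(abs(coincidence_amplitude) ** 2)  # 0.0 for a perfect splitter
```

Any imperfection in the splitting ratio leaves a residual coincidence rate, which is why tiny error bars on this measurement are such a strong statement about the instrument.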

Likewise, Laing and colleagues show a controlled NOT gate that gets it right 97 percent of the time. "Right" being a relative thing here—this is the fidelity, which means it takes into account the fact that quantum measurements have a finite chance of getting the wrong answer irrespective of the quality of the equipment. From this, they calculate that, at worst, they have an error rate between one part in 100 and one part in 1000. The latter figure is probably good enough to start thinking about multiple gate operations.
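To put those error-rate bounds in perspective, a quick back-of-envelope calculation. The per-gate error figures are the ones quoted in the article; the gate counts are arbitrary choices of mine:

```python
# If each gate independently fails with probability p, a circuit of n
# gates runs error-free with probability (1 - p)**n.
for p in (1e-2, 1e-3):
    for n in (10, 100, 1000):
        print(f"p={p}, n={n}: {(1 - p) ** n:.3f}")
```

At one part in 100 per gate, a thousand-gate circuit almost never succeeds; at one part in 1000, it still succeeds about a third of the time, which is why the latter figure makes multiple-gate operations worth thinking about.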

As you can see, I'm not reporting on anything startling here, just a good solid bit of technology that is necessary for optical quantum computers to do anything useful. I do wonder, however, how many of the circuit elements on the wafer were functional, because that is probably the limiting factor now. One thing missing in all optical implementations of quantum computers is programmability, because that involves switching light paths around. In integrated optic implementations, like this one, switches could be fast, and if the losses are low enough, programmability might well be on the horizon.

The bigger problem on the horizon is multi-qubit calculations. To perform a calculation represented by a register of eight qubits, every one of those qubits has to be entangled with every other qubit, and that ain't easy.

22 Reader Comments

Can I just say that the phrase I used to hate more than any other from an editor was "can you do an image for a story on quantum computing?". SURE GET RIGHT ON THAT. Why it didn't occur to me to just combine cats and boxes in the past escapes me. Meow!


You do realise that I, and many other geeks too, now feel the urge to get similar boxes?

I think that it's worth elaborating on the fact that the gate that is described here is based on conditional measurement, if only to emphasize just how far away proper photonic quantum gates are. Conditional gates rely on a fundamentally probabilistic gate operating in just the right way. In other words, they fail most of the time.

The 97% success rate for this gate comes from the fact that they don't count the cases where they can detect that it has failed. That's fair in a proof-of-principle gate, but it isn't scalable. If each gate only works half (or one-ninth) of the time, then a long chain of gates attempting to perform a useful computation will fail pretty much all of the time.
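That scalability point is easy to quantify. A sketch, using the per-attempt success probabilities the comment mentions (the exact figure depends on the scheme):

```python
# If a conditional gate succeeds with probability p per attempt, chaining
# n of them without heralded correction succeeds with probability p**n.
def chain_success(p, n):
    return p ** n

for p in (0.5, 1 / 9):          # the "half" and "one-ninth" cases
    for n in (2, 5, 10):
        print(f"p={p:.3f}, n={n}: {chain_success(p, n):.2e}")
```

Ten gates at one-ninth each already puts the success probability below one in a billion, which is the sense in which conditional gates don't scale.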

The work presented in the article is an impressive miniaturization of quantum gates, but of a kind that couldn't possibly scale up to build a quantum computer. That certainly doesn't mean the work is useless: simple optical devices will need to be scaled down and operated at the single-photon level for optical quantum computing and communication. But the result doesn't mean that photonic quantum gates are a solved problem.

A single, deterministic optical quantum gate, even if it took up an entire room, would still be very big news in the community. What's needed is to get two photons to interact with each other, something they don't tend to do.

Reminds me of Signal to Noise by Eric Nylund. They use waveguides to build quantum computers in the book, and there is a really cool scene where it describes the "waves" traveling through the "canyons".


Great article. But also depressing. It seems like the difficulties in creating a quantum computer are insurmountable. Perhaps it's nothing more than an entertaining thought experiment? Not that we should give up.

Speaking of thought experiments, I think it's ironic that Schrödinger's cat, a thought experiment meant as a critique of the prevailing interpretation of quantum mechanics, the Copenhagen interpretation, is now that interpretation's best-known symbol. I doubt anyone even thinks of the irony anymore, given how entrenched the Copenhagen interpretation is, but it is only one interpretation. I guess many-worlds gets some play these days, but both are ridiculous.

So is it possible for a quantum operation to provide a result that is both right and wrong at the same time?

Yes.

This is a situation where physicists are frantically working on a product that computer scientists WILL NOT USE. It's important to understand the occupational distinction between the two. The following is a true statement:

There is no problem solvable on a Turing machine implemented with qubit logic that cannot be solved on a Turing machine without qubit logic.

In other words: the computability of a problem is not dependent on the hardware. The time to compute it, however, is. Quantum computing can make the solution of a computable problem faster but it cannot make things computable that previously weren't. So the value of a quantum computer lies solely in how much faster it is relative to a traditional one; cost and FLOPs being easy to compare against each other.
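The point that the win is speed rather than computability is well illustrated by the classic example of unstructured search. The query counts below are order-of-magnitude only:

```python
import math

# Classical unstructured search over N items takes about N/2 queries on
# average; Grover's algorithm takes on the order of sqrt(N) quantum
# queries. Both solve exactly the same problem; the difference is time.
for N in (10**3, 10**6, 10**9):
    print(f"N={N}: classical ~{N // 2}, Grover ~{math.isqrt(N)}")
```

A billion-item search drops from roughly 500 million queries to roughly 30,000, but nothing uncomputable becomes computable.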

But here's where the distinction between physicists and programmers comes into play:

If a quantum computer cannot provide the same answer to the same question consistently... THEN IT IS NOT EVEN A TURING MACHINE. In other words, it's a broken computer, and thus useless to the computer scientist, who needs hardware that produces a consistent result, not an ambiguous probability. You can't compile and execute software MOSTLY RIGHT; it's either right or it's not, and if not, then it mostly won't work, and the times when it does, you won't want to trust the answer it gives you.

I'm confused about how any logic gate can be created from a NOT gate... A NOT gate has an input and an output... It can (and must) only invert the input. Are you sure you didn't mean NOR or NAND? Or am I misunderstanding something?


From what I understand, you can make a quantum computer that is accurate. You just have to introduce a lot of error checking, which would slow down its computational speed.


You'd be right if physicists were trying to replace classical computers with quantum computers, but they aren't. The general idea is to use a quantum computer as a co-processor for certain domains of problems where it is efficient. In some of those domains, the trade-off between probabilistic answers and efficiency is favorable, assuming sufficient error checking. But you are quite right to emphasize that quantum computers aren't ever going to wholly replace classical computers.

I'm intrigued about methods allowing for entanglement between qubits. Will simple paw sized holes work or is something more elaborate necessary? I'd guess the question can only begin to be studied with a sufficiently large and very curious LAN party.

More on topic: if we get quantum computing figured out by the end of this decade, what are we going to do for cryptography? Clearly governments will want to try to keep it under wraps for security reasons, but that would be just as dangerous for freedom as well.


Consider threading... when we approached the practical limit to how fast we could make a core run (and thus how fast we could make a single-threaded process execute), the solution from the hardware manufacturers was to put more cores on the chip and then tell programmers to thread everything.

Those who know much about threading know that this has not happened. Threads have overhead to manage, are extremely difficult to debug, and not every problem is easy to run in parallel; many of the common tasks that ARE easy to run in parallel take more time to manage in parallel than they do to run serially, due to optimization and caching. In other cases, developers have simply not bothered to invest the time to rework their products to take advantage of it (one notorious case being Final Cut Pro, Apple's dirty secret they try to gloss over while hawking their twelve-core towers).

I relate qubits to threads. They're not the same, of course, but I see them winding up on the same shelf of expensive gimmicks that ultimately do little to improve overall performance because they don't answer a question that needs answering.

The only way I see qubit computing having ANY significant impact is if it can be used to provide a new resurgence of CISC with no hit to cycle speed over current processors. That's the only use that will actually reduce line counts in a way that isn't gimmicky and rarely employed, like threading.


This confused me as well; I thought only NOR and NAND are functionally complete on their own. Is anyone able to clear things up?

Chris Lee / Chris writes for Ars Technica's science section. A physicist by day and science writer by night, he specializes in quantum physics and optics. He lives and works in Eindhoven, the Netherlands.