Posted by ScuttleMonkey on Monday June 29, 2009 @12:58PM
from the baby-steps dept.

ScienceDaily is reporting that the first rudimentary solid-state quantum processor has been created by a team led by Yale University researchers. "Working with a group of theoretical physicists led by Steven Girvin, the Eugene Higgins Professor of Physics & Applied Physics, the team manufactured two artificial atoms, or qubits ('quantum bits'). While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states. These states are akin to the '1' and '0' or 'on' and 'off' states of regular bits employed by conventional computers. Because of the counterintuitive laws of quantum mechanics, however, scientists can effectively place qubits in a 'superposition' of multiple states at the same time, allowing for greater information storage and processing power."

I'm a layperson on the subject but have read extensively, and my understanding is yes, there is a "quantum superposition" and you can prove it by running exactly the same experiment over and over again, and getting different results each time. You take the average of the results and that's the answer to your problem.

Hence, a quantum algorithm only has a probability of arriving at the correct answer. Executing a quantum algorithm several times gives you increasingly better odds, and because the failure probability shrinks with each repetition, you still get a reliable answer in polynomial time.

If it works, it works, whether you built it based on your interpretation or based directly on your observation. The proof is in the pudding, as they say, whether you baked the pudding by following a recipe or whether you just kinda threw some ingredients in a pot, stirred them a few times, and then said your special rhyme three times over them.

So the question becomes "does it work?" and the answer is 'yes'. It's a very small pudding but it still tastes like pudding, if you catch my meaning.

You can find the lab site here [yale.edu], with several of the researchers' papers freely available in pre-publication form on arxiv [yale.edu]. I'm trying to find the "basic algorithms" the article alludes to that these rudimentary processors can perform. I thought only a handful of algorithms (Shor's, for instance) were applicable to quantum computing. Anyone know?

While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states.

Does this sound like they're using real atoms to simulate qubits? Perhaps I'm misinterpreting, but it looks like it's still going to take an exponential amount of resources to "make" each additional qubit.

Why would it take an exponential amount of resources? One of these qubits only amounts to around 1.66 x 10^-13 percent of a mole of aluminum. For every mole of aluminum they can create roughly 600 trillion qubits. I'm not sure how many qubits would be needed for a quantum computer but I'm doubting it's much more than that.
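For the curious, here's a quick back-of-the-envelope check of those figures in Python (the only assumption being exactly 10^9 atoms per qubit, per TFA's "a billion aluminum atoms"):

    # Sanity check on the atoms-per-qubit arithmetic above.
    AVOGADRO = 6.022e23       # atoms per mole
    ATOMS_PER_QUBIT = 1e9     # "a billion aluminum atoms" per qubit

    qubits_per_mole = AVOGADRO / ATOMS_PER_QUBIT
    percent_of_mole = ATOMS_PER_QUBIT / AVOGADRO * 100

    print(f"{qubits_per_mole:.2e} qubits per mole")              # ~6.02e14, i.e. ~600 trillion
    print(f"{percent_of_mole:.2e} percent of a mole per qubit")  # ~1.66e-13 %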

Well that's my question. Does it scale linearly with the number of qubits? The article is not very clear about that.

I see no indication that the number of atoms per qubit will scale at all in relation to anything but time spent in a quantum state. It's purely speculation (given a single data point) to assume that this number will scale at all just because qubits are added. It's also speculation to assume they won't, but it seems the more logical guess. The obvious correlation would be between number of atoms and ease of reading them.

As for being only 2 qubits, that's just to make the prototype simpler to create.

Yeah... as I understand it, a qubit can represent 0 and 1 simultaneously. In a sense a single qubit represents 2 bits, one bit in a 0 state and one bit in a 1 state. Ten qubits can represent all 2^10 states simultaneously, so in that same sense 10 qubits can represent 1024 classical states. 640K qubits can represent a HUGE number of classical bit configurations. (That number has almost 200,000 digits, utterly dwarfing the roughly 10^80 atoms in the observable universe [wikipedia.org].)
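If you want to check those magnitudes yourself (taking "640K" to mean 640 x 1024 qubits, which is my assumption), a couple of lines of Python will do it:

    import math

    # n qubits span 2**n basis states.
    print(2 ** 10)                     # 1024 states for 10 qubits

    n = 640 * 1024                     # "640K" qubits
    digits = n * math.log10(2)
    print(f"2^{n} is about 10^{digits:.0f}")   # ~10^197283, vs ~10^80 atoms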

It's more complicated than you understand. A qubit can be |0> or |1>, or a superposition of |0> and |1>, OR a probability distribution over |0> and |1> (called a mixture). A mixture is directly akin to a classical bit having some probability of being in the 0 or 1 state, whereas superposition (which is independent of mixture) can appear similar in certain conditions, but in general is a very different thing.
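A minimal numpy sketch of that distinction, for anyone who wants to see it concretely (no quantum library assumed, just textbook density matrices):

    import numpy as np

    ket0 = np.array([1.0, 0.0])
    ket1 = np.array([0.0, 1.0])

    plus = (ket0 + ket1) / np.sqrt(2)             # superposition (|0>+|1>)/sqrt(2)
    rho_super = np.outer(plus, plus)              # pure-state density matrix
    rho_mixed = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

    print(rho_super)  # off-diagonal 0.5 entries: coherence that can interfere
    print(rho_mixed)  # diagonal only: a classical 50/50 coin flip

    # Measured in the 0/1 basis both give 50/50, but a Hadamard gate maps the
    # superposition back to a definite |0> while leaving the mixture at 50/50.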

There's no simulation -- the large group of atoms forms one qubit. That's why this is interesting.
Normally, only very small things (like one atom) exhibit quantum behavior. This system is large for something able to exhibit quantum behavior. All the parts effectively join together to act like one quantum system.

I am not trying to split hairs. This is actually a rather important point: they did not manufacture "two artificial atoms, or qubits". They manufactured two clusters of atoms that acted as qubits.

If the quality of journalism we see for politics or for useless celebrity trivia became just like the quality of journalism we see for technical matters, there would be significant backlashes against it. Joe Sixpack might not care about the distinction between abstract qubits and their physical implementation, but by God they better not misreport how many times $POP_SINGER has been divorced!

Though I'm not so sure that blatantly inaccurate (or misleading) statements are worse than the way more mainstream outlets simply gloss over the technical details altogether.

PS. And they did explain later in the text that "While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states." You are splitting hairs, my friend ;-)

Perhaps, but those weren't the hairs I was splitting. The article stated that the "artificial atoms" (incorrect enough to start with) were two qubits. And that is simply incorrect. They held two qubits of data... which is a different matter entirely.

Seriously, I wonder whether this will come to pass, or whether we'll continue with binary forever. (IIRC, some mainframes back in the '40s and '50s used decimal processing, which was too slow then, so everyone eventually switched to binary.)

Given that there is no real advantage to switching away from binary, why not? Decimal is far slower and less information-dense, from the computer's perspective. And since it only takes a cycle or so for the computer to translate for the humans, just let it.

The only really viable alternative is trinary computing, which is only slightly suboptimal. (The actual ideal would be base e, but it's really hard to build a system around an irrational base.)
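The usual way to make the "base e is ideal" claim precise is radix economy: cost ~ base x number of digits. A little Python sketch (the cost function here is the standard textbook one, not anything from TFA):

    # Radix economy: cost of representing N in base b ~ b * (digit count).
    def radix_economy(b, n):
        digits = 0
        while n:
            digits += 1
            n //= b
        return b * digits

    N = 10 ** 6
    for base in (2, 3, 4, 10):
        print(base, radix_economy(base, N))
    # Base 3 narrowly beats base 2 (39 vs 40 here); the continuous optimum
    # of b / ln(b) sits at b = e ~ 2.718, between the two.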

While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states.

This sounds like a Bose-Einstein condensate, where many atoms act as though they are all part of a larger, single atom. It also gains some pretty interesting properties, and can't be described exactly as solid, liquid, or gas.

The ScienceDaily article and the /. summary seem to be confused about the experimental setup. From the Nature article, "[e]ach qubit has a split Josephson junction...." The Josephson effect occurs when two superconductors are separated by a very thin insulating layer: a "supercurrent" composed of paired, correlated electrons (Cooper pairs) can tunnel across this barrier under certain circumstances. Cooper pairs act as bosons, just as the atoms in Bose-Einstein condensates do, so they have long been a focus of research for quantum computing. In this experiment, a "180nm Nb film was d.c.-magnetron sputtered on the epipolished surface of an R-plane corundum wafer," meaning the superconductor they used was niobium, deposited on a sapphire substrate (R-plane corundum, i.e. crystalline aluminum oxide); the insulating barrier in a Josephson junction itself is a far thinner aluminum-oxide layer. They built it out of these [wikipedia.org], in other words.

They go on to mention that the apparatus was cooled to 13 millikelvin using a helium dilution refrigerator. Now, niobium is superconductive to about 9 kelvin in the pure state (and about 23 kelvin in some alloys), so I would assume the extra effort to make it that cold has more to do with preserving the delicate electronic state of the qubits than with merely chilling the superconductors.

"First Electronic Quantum Processor Created".. Sorry to spoil the fun, but does anyone do facts checking with these articles before posting? Guess not, because these [dwavesys.com] guys presented a 28 qbit prototype and working quantum processor back in 07 [zdnet.com].

Yes, the first. The D-Wave guys aren't building quantum computers. Their system lacks entanglement between the qubits, which is essential to running quantum algorithms. They have also been less than forthcoming about the coherence in their system.
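For reference, here's a tiny numpy sketch of what "entanglement between the qubits" means: build a Bell state from the standard H and CNOT gates, then check that the amplitudes can't be split into two independent qubits. (This is generic textbook material, nothing specific to D-Wave's hardware.)

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    ket00 = np.array([1.0, 0.0, 0.0, 0.0])
    bell = CNOT @ np.kron(H, np.eye(2)) @ ket00   # (|00> + |11>) / sqrt(2)

    # Reshape the 4 amplitudes into a 2x2 matrix; rank 1 would mean the two
    # qubits are independent. Rank 2 means they are entangled.
    print(bell)                                       # [0.707, 0, 0, 0.707]
    print(np.linalg.matrix_rank(bell.reshape(2, 2)))  # 2 -> entangled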

I for one welcome our Linux-running qubit overlords, and in full disclosure IANAL, but ITFA they had me ROTFL'ing when I pondered Linux being greater than Micro$oft running in an N-dimensional space, until NYCL told me that my ImaginaryProperty was sold by kdawson to CmdrTaco because Truth != Facts != Love != Reality after SCO and the RIAA\MPAA sued Open Source and WON!

The fact that they managed to construct a quantum computing device using solid-state physics is a technological breakthrough. It may revive interest in the topic (which was fading due to lack of technological progress).

I took a class on quantum computing and studied many specific QC algorithms, so I know a little bit about them. There are a lot of misunderstandings about them, so let me summarize.

Quantum computers are not super-computers. On a bit-for-bit (or qubit-for-qubit) scale, they're not necessarily faster than regular computers; they just process information differently. Since information is stored in a quantum "superposition" of states, as opposed to a deterministic state like regular computers, the qubits exhibit quantum interference with other qubits. Typically, your qubit starts in 50% '0' and 50% '1', and thus when you measure it, you get a 50% chance of it being one or the other (and then it assumes that state). But if you don't measure, and instead push it through quantum circuits that let it interact with other qubits, you can get the quantum phases to interfere and cancel out. If you are damned smart (as I realized you have to be, to design QC algorithms), you can figure out creative ways to encode your problem into qubits, and use the interference to cancel out the information you don't want and leave the information you do want.
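The simplest concrete example of that cancellation I know of is two Hadamard gates in a row; a few lines of numpy are enough to show it (a generic textbook illustration, not the Yale team's setup):

    import numpy as np

    # One Hadamard puts |0> into an equal superposition; a second Hadamard
    # makes the two computational paths interfere, so the '1' amplitude
    # cancels out and the qubit returns to a definite |0>.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    ket0 = np.array([1.0, 0.0])

    once = H @ ket0
    twice = H @ once

    print(once ** 2)   # [0.5, 0.5] -> measuring here is a 50/50 coin flip
    print(twice ** 2)  # [1.0, 0.0] -> interference gives '0' with certainty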

For instance, some calculations will start with the 50/50 qubit above, and end with 99% '0' and 1% '1' at the end of the calculation, or vice versa, depending on the answer. Then you've got a 99% chance of getting the right answer. If you run the calculation twice, you have a 99.99% chance of measuring the correct answer.
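That 99% -> 99.99% step is just independent repetition, assuming a candidate answer can be checked cheaply once you have it (true for factoring, say):

    # Probability that at least one of k independent runs succeeds,
    # given a single-run success probability of 99%.
    p_single = 0.99
    for k in (1, 2, 3):
        print(k, 1 - (1 - p_single) ** k)   # 0.99, 0.9999, 0.999999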

However, the details of the circuits that perform quantum algorithms are extremely non-intuitive to most people, even those who study them. I found it requires an amazing degree of creativity to figure out how to combine qubits to take advantage of quantum interference constructively. But what does this get us?

Well, it turns out that quantum computers can run anything a classical computer can, and such algorithms can be written identically if you really want to, but doing so gets the same results as the classical computer (i.e., the same order of growth). But the smart people who have been publishing papers about this for the past 20 years have been finding new ways to combine qubits, exploiting the nature of certain problems (usually deep, pure-math concepts), to achieve better orders of growth than are possible on a classical computer. For instance, factoring large numbers is difficult on classical computers, which is why RSA/PGP/GPG/PKI/SSL is secure. Its order of growth is roughly e^(n^(1/3)). That's not quite exponential, but it's still prohibitive. It turns out that Shor figured out how to get it down to about n^2 on a quantum computer (which is the same order of growth as decrypting with the private key on a classical computer!). Strangely, brute-forcing someone's encryption key, normally O(n) on classical computers (where n is the number of possible keys), is only O(sqrt(n)) on QCs, thanks to Grover's search algorithm. Weird (but sqrt(n) is still usually too big).
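Grover's algorithm is small enough to simulate directly. Here's a numpy sketch on a toy 16-item search space (the marked index and qubit count are arbitrary choices of mine, just for illustration):

    import numpy as np

    n = 4                     # qubits
    N = 2 ** n                # search-space size
    marked = 11               # hypothetical index of the item we're looking for

    state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
    oracle = np.ones(N)
    oracle[marked] = -1                       # oracle phase-flips the marked item

    # Roughly (pi/4) * sqrt(N) iterations are optimal.
    for _ in range(int(np.pi / 4 * np.sqrt(N))):
        state *= oracle                       # oracle step
        state = 2 * state.mean() - state      # inversion about the mean

    print(state[marked] ** 2)   # ~0.96: found in ~sqrt(N) steps, not N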

There's a vast number of other problems for which efficient quantum algorithms have been found. Unfortunately, a lot of these problems aren't particularly useful in real life (except to the curious pure mathematician). A lot of the rest are better, but not phenomenally so. For example, verifying that two sparse matrices were multiplied correctly has order of growth n^(7/3) on a classical computer, but n^(5/3) on a quantum computer. You can find a pretty extensive list by googling "quantum algorithm zoo."

Unfortunately [for humanity], there is no evidence yet that quantum computers will solve NP-complete problems efficiently. Most likely, they won't. So don't get your hopes up about solving the traveling salesman problem any time soon. But there is still a lot of cool stuff we can do with them. In fact, the theory is so far ahead of the technology that we're anxiously waiting for breakthroughs like this, so we can start plugging problems through known algorithms.

Actually, it depends on how you define "chicken egg", unless there is a universally accepted definition that I'm not aware of. The genes of the organism in the egg were those of a chicken, but the egg itself was developed based on the genes of the mother, which you have stated was a non-chicken. Perhaps it was the father's genes which contained the crucial mutation that resulted in a chicken hatching from the non-chicken's egg. In that case, did it become a chicken egg only once it was fertilized? It doesn't seem like there's a clean answer.

What produced it just happened not to be a chicken. Something close, but not quite.

Except when posed in evolutionary terms, the whole question comes down to a problem of the human desire for classification versus nature's complete lack of giving a shit about that desire.

What precisely makes a chicken a chicken versus a chicken-minus-one-generation proto-chicken? Given that any population naturally has a degree of genetic variation, there's no "gold standard" for a chicken genome, and it is entirely possible to draw the chicken/proto-chicken line almost anywhere.

The question has always been inane. We choose to define what is a "dinosaur" or what is an "egg", but nature doesn't care about our definitions. There must have been creatures with very similar reproductive and incubation systems whose eggs were not quite "eggs" by our definition, laid by creatures that were also not quite "dinosaurs".

Next, the team will work to increase the amount of time the qubits maintain their quantum states so they can run more complex algorithms. They will also work to connect more qubits to the quantum bus. The processing power increases exponentially with each qubit added, Schoelkopf said, so the potential for more advanced quantum computing is enormous. But he cautions it will still be some time before quantum computers are being used to solve complex problems.
"We're still far away from bu