That would lead to a revolution in efficiency of everything from travel to mass transit to shipping.

Certainly a better example than a Sudoku solver... all a Sudoku puzzle is, at its core, is a depth-first search. You can speed up the algorithm with Dancing Links, but even on a moderately fast PC a DFS is fast enough to solve a Sudoku puzzle in the blink of an eye.

Actually you don't need the DFS. I've built a Sudoku program to help me reduce the boring parts (filling in the only possible option when it is known). I didn't expect that it would solve all boards except for the hardest ones.

Then I added another filter that still was not DFS, and to this day it has solved every board except two. One of those had two solutions, and the other could be solved with a DFS of depth 1.
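For the curious, a solver along those lines fits in a screenful of Python. This is a generic sketch, not the parent poster's actual program: it repeatedly fills cells that have exactly one legal digit (the "only possible option" filter), and falls back to DFS only when that propagation stalls.

```python
def candidates(grid, r, c):
    # Digits not already used in the row, column, or 3x3 box of cell (r, c).
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return [d for d in range(1, 10) if d not in used]

def propagate(grid):
    # Repeatedly fill cells with exactly one legal digit ("naked singles").
    changed = True
    while changed:
        changed = False
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    cands = candidates(grid, r, c)
                    if len(cands) == 1:
                        grid[r][c] = cands[0]
                        changed = True

def solve(grid):
    # Propagate first; only branch (DFS) when no forced moves remain.
    propagate(grid)
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for d in candidates(grid, r, c):
                    copy = [row[:] for row in grid]
                    copy[r][c] = d
                    result = solve(copy)
                    if result:
                        return result
                return None  # dead end: a cell has no legal digit
    return grid
```

On easy and medium boards the `propagate` loop alone finishes the grid, matching the poster's experience; the DFS branch only fires on harder puzzles.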

Took all the fun out of the game.

Some timing statistics: less than 1 second with JavaScript in Firefox.

Depending on how many numbers are given at the start of the puzzle, multiple solutions to a specific initial set could exist.
Those puzzles are generally considered "broken" - but you can still obtain a valid solution from them.

As an example of what I mean, consider the following puzzle grid: in the upper left corner the digit is 1, and in the far lower right corner the number is 9. Multiple solutions exist that have a 1 in the (1,1) position and a 9 in the (9,9) position.

But of course they have solutions. Otherwise, you couldn't start with an empty 9x9 grid and create a Sudoku in it in the first place.

In any case, solving the things is a much more trivial problem than creating valid ones of arbitrary difficulty. But the fact that seemingly hundreds of different publishers are out there creating books without any quantum computers at all is probably a good hint that nothing to do with Sudoku is a particularly impressive computational task to show off.

My wife thought I was insane; I learned a lot more about Excel, though.

I got it to handle two-tier deductions: if this can't be a 5, that must be a 7.

But never a third. My head exploded just following the examples I saw online of three dependencies determining one number... and trying to work out how to account for that, other than brute-force checking.

Likewise. It's just a purely mechanical process - grind it out till it's all filled in. I don't find it satisfying in the least. Give me a good cryptic crossword any day - I put the popularity of sudoku down to the fact that, unlike crosswords, you don't actually need to know anything to be able to do one.

Which is not an algorithm that runs in polynomial time. It is a DFS over all (legal, remaining) permutations, of which there are exponentially many.

even on a moderately fast PC a DFS is fast enough to get a solution to a Sudoku puzzle in the blink of an eye.

Regardless of how easy a 9x9 grid seems, Sudoku is still at its core an NP Complete [u-tokyo.ac.jp] (PDF warning) problem. Why is it therefore any less valid than any other NP complete problem? Travelling Salesman is also pretty easy with less than 10 nodes... likewise you can feasibly crack any encryption scheme by brute-force if you constrain it to have a tiny key size. It's all about scale.

The beauty is, if you solve any NPC problem you solve them all, by definition. So, Mr. Smarty Pants, if your Sudoku solver is good enough to solve any grid in polynomial time, please show the rest of us, as you've just cracked every encryption scheme invented to date.

Sudoku is less valid than another NP problem precisely because it is quickly solvable with a conventional computer, and therefore subject to fraud. I happen to have built a quantum computer in my garage. I'll be happy to prove that to you by solving any sudoku you send me in less than an hour.

The first digital computer systems did not solve anything "amazing" but the fact that they solved anything at all was the amazing bit.

Quantum computing is very new (in the physically-exists sense), and the fact that they figured out how to build, program, and extract solutions for some, albeit relatively simple, problems is a major step forward.

Once the understanding is complete enough and reliable enough then the really tough problems will be sure to follow.

I think you've missed the point of the Traveling Salesman problem. It's a theoretical mathematical exercise, not a practical issue in mass transit or shipping. The important bit is that we could solve all sorts of other, more interesting problems if we could solve that one.

Nobody is going to use Travelling Salesman in the real world to plan journeys. You can already quickly run an algorithm which will get you a journey plan that's maybe 99% as good as the optimum. Besides, real things (even salesmen) don't just have to travel in straight lines between points in space. There are other factors like lunch breaks and the location of good restaurants, which the problem doesn't account for.

Travelling Salesman is NP-hard, which means (I think) that if you find a polynomial time algorithm for it, any problem in NP can be done in polynomial time (hence P=NP). But I believe that even this quantum computer can't calculate TSP in polynomial time, except for instances small enough to fit inside the 16-qubit memory. Can anyone comment on this? By contrast, a conventional computer can always calculate TSP in the same time complexity (exponential time) no matter how large the input - you just have to hook up enough memory.
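On the "exponential time no matter how large the input" point: the standard exact method that beats trying all (n-1)! tours is the Held-Karp dynamic program, which runs in O(n^2 * 2^n) time. A small Python sketch (my own illustration, assuming a symmetric distance matrix; not from the article):

```python
from itertools import combinations

def held_karp(dist):
    # dist: n x n distance matrix. Returns the length of the shortest tour
    # that starts and ends at city 0, visiting every other city once.
    n = len(dist)
    # best[(S, j)]: shortest path from 0 visiting exactly the set S
    # (cities other than 0), ending at city j in S.
    best = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in subset:
                best[(S, j)] = min(best[(S - {j}, k)] + dist[k][j]
                                   for k in subset if k != j)
    full = frozenset(range(1, n))
    return min(best[(full, j)] + dist[j][0] for j in range(1, n))
```

The table has O(n * 2^n) entries and each takes O(n) to fill, so it is still exponential overall - exactly the situation the parent describes, where more memory lets you push to larger inputs but the growth rate never improves.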

Even if you could solve TSP in polynomial time on a quantum computer, it still wouldn't show that P=NP as the machine would be nondeterministic. The goal of quantum computers is to sidestep the NP issue, not reduce it to P.

You are pretty much correct--traveling salesman is NP-hard, and most theorists believe that BQP, the class of problems efficiently solvable by a quantum computer in polynomial time, is strictly smaller than NP (and therefore contains no NP-complete problems). This is, of course, not proven--for all we know, it could even be strictly bigger than NP. The evidence is that the only two problems quantum computers can currently solve in polytime that a Turing machine can't are integer factorization and discrete log, neither of which is believed to be NP-hard.

BTW, of course a problem instance must fit in memory--but the idea is that we should be able to build a 1000 qubit computer, while going through 2^1000 possibilities to brute-force a problem of size 1000 is intractable. But with our knowledge now, even if you fit a traveling salesman instance, the quantum computer wouldn't know what to do with it.

As GP pointed out, we do have excellent approximations for TSP and most practical problems--the issue is mathematical--that we don't have algorithms that can guarantee the absolute best solution. The one place quantum computers are potentially useful is crypto--nearly all of which revolves around the hardness of factoring and discrete logs (which I think is a very strange coincidence, if it is one).

Actually, to be quite clear, the "Travelling Salesman" problem is one of traversing an arbitrary graph, which can indeed include information representing non-straight distances and preferred edges where the "good lunch places" can be found. It's a very general optimization problem, and has been effectively solved for practical use for a number of years, using the "simulated annealing" method.
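A bare-bones version of that simulated-annealing approach looks like the sketch below (generic 2-opt moves and made-up parameter values, not any particular production implementation):

```python
import math
import random

def tour_length(tour, dist):
    # Total length of the closed tour, including the edge back to the start.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def anneal(dist, steps=20000, t0=1.0, cooling=0.9995, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    cur = best = tour_length(tour, dist)
    best_tour = tour[:]
    t = t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(n), 2))
        # 2-opt move: reverse the segment between positions i and j.
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        delta = tour_length(cand, dist) - cur
        # Always accept improvements; accept worsenings with a probability
        # that shrinks as the "temperature" t cools.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            tour, cur = cand, cur + delta
            if cur < best:
                best, best_tour = cur, tour[:]
        t *= cooling
    return best, best_tour
```

It gives no optimality guarantee, but in practice it lands very close to the best tour, which is the sense in which TSP is "effectively solved" for everyday use.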

Actually there are many variations on the Traveling Salesman for which an optimal solution is required. In a previous life I worked on some software for laying out circuit boards, massive circuit boards. In fact, the speed of the production line was limited by how long it took the robots to place everything on the board. We managed a fairly optimal solution by dividing the card into smaller regions (bands) and solving each one individually. The less distance the robot had to traverse, the better.

But an optimal solution is not required in the routing problem you cite. After all, you went for "fairly optimal" rather than optimal because at some point the cost of the computing time required (assuming, generously, that the problem is solvable in our universe; it doesn't take much for a routing problem to pass that threshold) exceeds the benefit to be gained.

Currently the best solutions for TSP are based on parallel solutions, for which there is no shortage of hardware. Granted large parallel systems are not cheap (I'm not referring to little 4- and 8-way systems), but they do exist and are heavily used by researchers that require such technology.

Distributed parallel clusters such as the approach used by the SETI "screen saver" also work for such classes of problems. Each node computes just one possible solution, but tens of thousands of nodes chew at solutions in parallel.

I have no doubt that UPS, the company that trains their drivers to turn the ignition key with one hand and buckle up with the other simultaneously, TO SAVE TIME, is interested in improving solutions to the traveling salesman problem. They're probably pretty close, but I bet they'd love another 1 percent improvement in performance. I don't know whether UPS's problem can be stated well enough for quantum computers to speed it up any, but the TSP doesn't require straight lines in space, just estimated times between stops.

Besides, real things (even salesmen) don't just have to travel in straight lines between points in space. There are other factors like lunch breaks and the location of good restaurants, which the problem doesn't account for.

Which, of course, clearly demonstrates the utility of the Bistro Drive for all Traveling Salesmen.

Disclaimer: I am a physicist who works on quantum computing and also has a computer science background

Nobody is going to use Travelling Salesman in the real world to plan journeys. You can already quickly run an algorithm which will get you a journey plan that's maybe 99% as good as the optimum.

Actually, I think there is a theorem that finding an algorithm which efficiently produces highly accurate approximate solutions to arbitrary NP-hard problems is about as hard as solving NP-complete problems exactly.

All this aside, it's worth noting that D-Wave is only claiming to provide a square-root speed-up for NP-complete problems, and there is some doubt as to whether they can even deliver that as they scale up to larger numbers of qubits. While it's technically possible that P=NP, most people believe P!=NP, and it seems almost as likely that BQP [caltech.edu] != NP (BQP is the class of problems efficiently solvable with a quantum computer).

For an excellent discussion of what D-Wave has done and just how skeptical you should be, visit Scott Aaronson's blog [scottaaronson.com]. (No, I am not Scott Aaronson, but I do know him and can vouch for him being an extremely smart guy. I am also not the Quantum Pontiff, aka Dave Bacon.)

There is already a polynomial-time partial solution to the travelling salesman problem. It does not find the best path, but it is proven to find, at worst, a path that's twice as long. With a few heuristics the worst case becomes very rare, and it usually finds something much closer to the best solution.

So even if we solved the NP travelling salesman perfectly, at most we'd get paths half the length. That could be an improvement, but not a revolution.
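For reference, that factor-of-two guarantee comes from the textbook MST heuristic for metric TSP: build a minimum spanning tree, then visit its vertices in preorder, shortcutting repeats. By the triangle inequality the resulting tour is at most twice the MST weight, which is in turn at most the optimal tour length. A rough Python sketch (my own illustration, assuming a metric distance matrix):

```python
def mst_tour(dist):
    # 2-approximation for metric TSP: Prim's MST, then a preorder walk.
    n = len(dist)
    parent = [0] * n
    cost = [float('inf')] * n
    cost[0] = 0.0
    in_tree = [False] * n
    for _ in range(n):
        # Pick the cheapest vertex not yet in the tree (Prim's algorithm).
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: cost[v])
        in_tree[u] = True
        for v in range(n):
            if not in_tree[v] and dist[u][v] < cost[v]:
                cost[v], parent[v] = dist[u][v], u
    children = [[] for _ in range(n)]
    for v in range(1, n):
        children[parent[v]].append(v)
    # Iterative preorder walk of the MST; visiting order is the tour.
    order, stack = [], [0]
    while stack:
        u = stack.pop()
        order.append(u)
        stack.extend(reversed(children[u]))
    length = sum(dist[order[i]][order[(i + 1) % n]] for i in range(n))
    return length, order
```

The guarantee only holds when distances obey the triangle inequality, which real road networks (measured in travel time) approximately do.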

"I was taking Stephen Cook's Complexity Theory class at UofT, and when we were given the TSP I immediately saw a solution reduced into Dijkstra's algorithm. I don't think anyone in the class actually followed; the prof said that it could probably work. I never dealt with it again, but that's the reason I remember that class."

Never heard of one (bigit). I have, however, heard of the "binary digit" that was shortened to "bit". Given that history, "qubit" is short for "quantum binary digit" - which is an oxymoron since quantum digits can be any (or all) of several states, not just on or off (binary). A more accurate acronymish shortening would be "quigit" - which sounds awkward enough to be shortened to "qit", (pronounced KIT rather than QUIT to avoid confusion).

But qubits are a superposition of the two binary states – they're a|0> + b|1>, where |0> and |1> are linearly independent vectors in the state space, and a and b are complex numbers. I don't see why you couldn't have a third independent vector, giving qutrits of the form a|0> + b|1> + c|2>, and it probably doesn't even make the maths much harder; but binary qubits are easier to find in the physical world, such as the polarisation of photons.
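To make the amplitudes concrete: the only constraint is normalization (|a|^2 + |b|^2 = 1 for a qubit, |a|^2 + |b|^2 + |c|^2 = 1 for a qutrit), and measurement probabilities are the squared magnitudes of the complex amplitudes. A toy Python illustration (nothing quantum here, just the bookkeeping):

```python
def probabilities(amplitudes):
    # Born rule: probability of each outcome is the squared magnitude.
    return [abs(a) ** 2 for a in amplitudes]

# A qubit a|0> + b|1> in an equal superposition with a relative phase.
qubit = [1 / 2 ** 0.5, 1j / 2 ** 0.5]

# A qutrit a|0> + b|1> + c|2> works the same way, just with three amplitudes.
qutrit = [0.5, 0.5, 1 / 2 ** 0.5]
```

The phases (the complex parts) don't show up in a single measurement's probabilities, but they are what interference between computation paths acts on.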

I'm going to argue that "qubit" will most likely never be shortened, and certainly not to "quit." One often-overlooked factor in the shortening of words (because it's completely subjective and unquantifiable) is whether or not the original word sounded cooler and rolled off the tongue. "Bigit" just doesn't. "Qubit" is better. Both are two-syllable words, so brevity isn't as much a factor, for English speakers at least, as it is for other shortened words like "auto." Ultimately, the relative awkwardness of "quit" should keep "qubit" around.

Claude E. Shannon first used the word bit in a 1948 paper. He attributed its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary digit" to simply "bit".

Who's going to be the first to add [wikipedia.org] "bigit" to Wikipedia? It should say something like: "Term coined in order to be re-abbreviated as 'bit'. Used as an example of how 'qubit' might one day be called 'quit'. Bigit should rhyme with digit so as not to be confused with bigot."

Bit is short for 'Binary Digit'. A few decades ago there was some academic playing around with base-3 computers, which used 'Trinary Digits' or Trits.

That led to the old joke - "When they build base-4 computers, will they call it quits?"

Anyway, qubit IS the right term, for Quantum Bit, or Quantum Binary Digit. It is perfectly binary: there are two eigenstates, |0> and |1>, but the qubit can be in a linear combination of them. In fact, it can be in a complex linear combination of them.

I'm actually curious - for how long do the 16 qubits stay coherent? You can only do quantum computations while the qubits remain coherent. Furthermore, IIRC coherence times were (at best) in the range of a few microseconds.

I'm actually curious - for how long do the 16 qubits stay coherent? You can only do quantum computations while the qubits remain coherent. Furthermore, IIRC coherence times were (at best) in the range of a few microseconds.

That's why quantum computers will be so fast. If not, they will constantly forget... Damn. Where was I going with this?

Is this a true quantum computer, or one that simply uses certain quantum properties? Scientists weren't predicting this for another 20-30 years. Wouldn't a 1024 qubit computer be far faster than any cluster on earth? And if I'm not mistaken, a 16 qubit computer would be faster than any single computer. I'd like to see some speed comparisons for parallel tasks.

D-Wave held its first public demonstration Tuesday of a machine it claims uses quantum mechanics to solve a certain type of problem, such as searching a database for matching molecular structures.

But the company did not make the machine available for inspection and instead showed video from a remote location, saying it was too sensitive to be easily transported.

And notwithstanding lofty claims in the company's press release about creating the world's first commercial quantum computer, D-Wave Chief Executive Herb Martin emphasized that the machine is not a true quantum computer and is instead a kind of special-purpose machine that uses some quantum mechanics to solve problems.

Anyone want to guess how long before "qubit" gets compressed to "quit" (as "bigit" became "bit" in the last century)? Anyone want to guess how long before hillbillies start asking "How many quberts you got in that there system?"

I think qubert rolls off the tongue easier than qubit. Let's push qubert as the new way of saying qubit!

I come to warn you that there shall be a great outage... go forth and build an array to save my creations. Make it 100 qubits long, 30 qubits wide, and 10 qubits deep. Into this hash all data in /usr/god/dataM/ and /usr/god/dataF/

A 16 qubit machine will calculate the results for all 65536 states that the qubits can be in simultaneously. Make a 1024 qubit machine, and you can factorise a 1024 bit number in constant time. Essentially it will mean that all NP-complete problems of a small enough size will be solvable in (low-order) polynomial time.

As long as the number of qubits required is proportional to N and not to something bigger, it still counts as polynomial time, no matter the size. It just adds an extra factor of N to the polynomial. There was some talk a few years ago about DNA computing, which solved problems by generating all of the possible solutions in DNA and then doing an O(1) step to figure out which strand had the right properties for your solution. But since a TSP of 100 nodes required DNA weighing more than the planet Earth, that went nowhere.

Ternary logic represents 3 values, in a balanced ternary system -1, 0, +1. It is similar in that 0 represents both + and -, like a superpositioned qubit. The difference is the entanglement of the qubits. A qubit that is entangled with another, even if separated, must measure the same as the one it is entangled with, allowing multiple calculations to occur at once; something that ternary logic doesn't do, though someone correct me if I'm wrong. Interesting to note there's a ternary analogue of the qubit as well, the qutrit.

Pretty slim; the NSA has loads of mathematics PhDs but no physicists.
Also, if it were that easy to do, the NSA would be concerned about American info being snooped on. Surely they would rather that nothing can be snooped on than a free-for-all where everything can be snooped on.

To save some work to people replying to misconceptions, here's a list of the common misconceptions about quantum computing that I've seen lately:

Quantum computers can solve NP-complete problems quickly (in polynomial time) -> not true, the speedup given by Grover's algorithm [wikipedia.org] is quadratic, not exponential. This means that an NP-complete problem which would take O(2^n) in a classical computer would take O(2^(n/2)) in a quantum computer, which is clearly not polynomial time in n

Quantum computers with N qubits are as efficient as 2^N classical computers -> not true, not all algorithms can get an exponential speedup with quantum computing

Quantum computers will render cryptography useless -> not true. Breaking symmetric ciphers with QCs is subject to the quadratic speedup I explained above, which means it will be enough to double the size of the key in order to account for QCs. Some asymmetric ciphers (i.e. public key systems) are broken by quantum computing (for example RSA and discrete logarithm), but it is thought that there are some asymmetric ciphers which are not vulnerable to attacks by QCs (excluding the Grover search algorithm, which as I explained above is not very hard to account for). Quite ironically, I remember reading that the same attack which renders discrete-logarithm public key cryptography useless also allows for the existence of a public key encryption scheme which requires fast calculation of discrete logarithms.
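The key-doubling arithmetic above is easy to sanity-check with back-of-the-envelope numbers (illustrative only; `classical_ops` and `grover_ops` are made-up helper names, not real library calls):

```python
def classical_ops(bits):
    # Classical brute-force search of a k-bit keyspace: ~2^k trials.
    return 2 ** bits

def grover_ops(bits):
    # Grover's algorithm: ~2^(k/2) oracle evaluations (quadratic speedup).
    return 2 ** (bits // 2)
```

So a 128-bit key falls to Grover in roughly 2^64 steps, and doubling it to 256 bits restores roughly 2^128 steps - the same work factor the 128-bit key originally offered against classical attackers.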

This would all be true for a quantum computer based on binary, but this is an "analog" computer. Back in the 40's and 50's analog computers could run rings around digital computers in their domains, but they were only approximations; for plotting flight plans and trajectories, a few digits of precision were often enough.

I'm guessing similarly this machine might quickly calculate solutions to things like the Traveling Salesman problem and other NP-complete problems, but not be guaranteed to have found the optimal one.

Since I'm not very familiar with the things you mentioned there, I'll ask: how much better would the analog QC solutions be, compared to the current approximation algorithms for NP-complete problems? References/links welcome :)

Let's take prime factorisation (two prime numbers multiplied together to get a product); it's hard for classical computers to do. The simplest algorithm is to divide by every prime up to the root of the product until you get another prime out the other end. The simplest way to speed that up is to use more/faster processors. Oh, and you also probably have to work out as you go along which numbers are prime and which are not. That's a lot of number crunching. Using a "true" quantum computer you could set up a superposition over all the candidates at once.
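The trial-division procedure described there is only a few lines of Python (a generic sketch; real factoring efforts use far more sophisticated sieves). Note it divides by every integer rather than just primes, which the poster's "work out which numbers are prime" step would avoid, but trying a composite divisor is automatically fruitless because its smaller prime factors were already divided out:

```python
def trial_division(n):
    # Factor n by dividing out every candidate divisor up to sqrt(n).
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        # Whatever remains above sqrt(original n) must itself be prime.
        factors.append(n)
    return factors
```

The running time grows with sqrt(n), i.e. exponentially in the number of digits - which is exactly why this approach is hopeless for cryptographic key sizes, and why Shor's algorithm is interesting.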

Also, I don't think qubit is the right word any more, because it sounds like each of the 16 qubits in this machine is really a whole value, like a byte or a word; otherwise it wouldn't be "analog".

Perhaps this thing will rock at some application that needs thousands or millions of NP-complete approximations per second to be practical. Maybe it can efficiently simulate neural networks. Maybe it can speed up voice or vision recognition.

You ignore what is probably the source of the misconceptions, namely Shor's algorithm [wikipedia.org], which solves integer factorization in O(N^3) time and O(N) space (where N is the log of the number to be factored). However, integer factorization probably is not NP-complete, and no one has figured out how to recast an NP-complete problem, in polynomial time, as a factoring problem that Shor's algorithm can handle.

Immediately after booting, the Quantum Computer disappeared in a flash of light and noise. It resurfaced in 1985, where it briefly took over a Commodore 64 and corrected some mistakes it made the first time around, before moving onto a UNIVAC in 1955...

One problem with a demo of this sort is that it isn't possible to know how much was actually done by the quantum computer and how much by the classical front-end computer. Typical Sudoku puzzles can be solved essentially instantaneously by classical computers. 16 qubits isn't a heck of a lot of quantum parallelism, so I do wonder how much of the real work they were doing.