IBM Just Simulated the Biggest Quantum Computer to Date—What That Means for the Field

Google was in pole position to win the race for quantum supremacy, the point at which a quantum computer can do things a conventional one can’t. But IBM seems to have pulled the rug out from under its rival by carrying out the largest simulation of a quantum computer to date.

It had long been assumed that simulating more than 49 qubits—the quantum computing equivalent of the digital bits used in standard computers—was near enough impossible due to the colossal amount of memory it would require. But by using some smart mathematical shortcuts the group was able to simulate a 56-qubit machine using just 4.5 terabytes of memory rather than the exabyte (one million terabytes) previous approaches would have required.
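The memory figures above follow from a simple observation: a brute-force simulation stores one complex amplitude for every possible state of the qubits, so memory grows as 2^n. A back-of-the-envelope sketch (assuming each amplitude is stored as a 16-byte complex double, a common convention) shows why 49 qubits was considered the practical ceiling:

```python
# Memory cost of brute-force quantum simulation: an n-qubit state
# vector has 2**n complex amplitudes. Assuming 16 bytes per amplitude
# (a complex double), total memory is 16 * 2**n bytes.

def state_vector_bytes(n_qubits: int) -> int:
    """Memory in bytes to hold a full n-qubit state vector."""
    return 16 * 2 ** n_qubits

for n in (49, 56):
    terabytes = state_vector_bytes(n) / 1e12
    print(f"{n} qubits: {terabytes:,.0f} TB")
```

For 49 qubits this comes to roughly 9,000 terabytes, and for 56 qubits about 1.15 million terabytes, i.e. the exabyte scale the article mentions. IBM's shortcuts sidestep storing the full state vector, which is how the team fit a 56-qubit simulation into 4.5 terabytes.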

That means the 49-qubit processor Google plans to unveil before the end of this year will not take the quantum supremacy crown, and it might take longer than some have predicted for quantum computers to surpass their conventional cousins. IBM researchers say they still don’t know the limits of how many qubits their approach could simulate.

They are at pains to say their findings in no way undercut the quest for practical quantum computers, which is unsurprising seeing as IBM is also one of the leading players in the race. The company has already built a 17-qubit processor and says it plans to achieve 50 “in the next few years.”

Considering that the team behind the advance was only established this year, after Google had announced its quantum supremacy goal, it’s hard to believe the research wasn’t at least partly motivated by a desire to take some of the steam out of their rival’s PR engine. Nonetheless, the news has gone some way to bursting a bubble of hype around quantum computing that has been building to a fever pitch this year.

To be clear, even if it no longer represents the achievement of quantum supremacy, building a 49-qubit processor remains an incredible feat of engineering. “Any system with lots of qubits is worthwhile, because to get to 1,000 or 1,000,000 qubits we need to deal with 100 first,” Christopher Monroe, a professor at the University of Maryland who studies quantum information theory, told MIT Tech Review.

In fact, it’s been pointed out that simulating larger quantum computers could actually be invaluable for the technology’s continued development. “It being possible to simulate 49-qubit circuits using a classical computer is a precondition for Google’s planned quantum supremacy experiment, because it’s the only way we know to check such an experiment’s results,” the head of the University of Texas at Austin’s Quantum Information Center Scott Aaronson recently pointed out in a blog post.

Robert Wisnieff, one of the IBM study’s senior authors, explained to IEEE Spectrum that these simulations could help researchers get a head start on working out which applications the devices are best suited for, and help investigate how and why errors creep into quantum computers—an outstanding problem for the technology. The simulations are also a billion times slower than a real quantum computer, so the real thing is likely to have major speed advantages.

But what the announcement may do is inject a dose of realism back into the conversation around quantum computers. Quantum supremacy has been touted (primarily by Google) as a major milestone for the technology, but in reality it is a fairly technical one. Google’s plan to achieve supremacy involves getting their processor to produce a random output that a supercomputer could not predict—a task with few obvious practical applications. Getting quantum computers to solve real-world problems better than conventional ones will be a much greater challenge.

Writing in The Conversation, Michael Biercuk, a professor of quantum technology at the University of Sydney, says current machines represent massive progress for those researchers working on the fundamentals of the technology, but are “little more than toys from a practical perspective.”

He points out that it’s not even clear yet what kind of hardware is likely to prove the most viable. The superconducting qubits preferred by Google and IBM are fast but error-prone, while trapped ion qubits—the main competitor—are relatively resistant to error but comparatively slow.

Quantum computers’ propensity for errors could even prove a fundamental roadblock to progress. Most schemes for correcting these errors require the computer to have extra qubits dedicated to just this task, but certain predictions suggest the effort required could scale exponentially with the number of qubits, effectively making practical error-corrected quantum computers impossible. “We still face many fundamental questions about how to build, operate, or even validate the performance of the large-scale systems we sometimes hear are just around the corner,” writes Biercuk.

Ultimately, IBM’s shifting of the goalposts may do the field more good than harm. Achieving quantum supremacy now might simply have added to already overinflated expectations, given that even the technology’s most ardent advocates, Google among them, think it will be at least a decade before there are practical applications.

Simon Benjamin, a professor of quantum technologies at the University of Oxford, told Bloomberg that such a long delay before the technology provides a useful return could see investors sucked in by the current hype get burned, which could create a backlash that ultimately hinders the technology’s development.

Biercuk warns against seeing the recent entry of big tech players and venture capitalists into the field as a sign of accelerating improvements in the technology. Instead, he says, it is in fact a response to recent advances in a field that is making steady but modest progress.

I am a freelance science and technology writer based in Bangalore, India. My main areas of interest are engineering, computing and biology, with a particular focus on the intersections between the three.