Microsoft makes play for next wave of computing with quantum computing toolkit

Microsoft wants to be ready for a quantum computing world.

At its Ignite conference today, Microsoft announced its moves to embrace the next big thing in computing: quantum computing. Later this year, Microsoft will release a new quantum computing programming language, with full Visual Studio integration, along with a quantum computing simulator. With these, developers will be able to both develop and debug quantum programs implementing quantum algorithms.

Quantum computing uses quantum features such as superposition and entanglement to perform calculations. Where traditional digital computers are made from bits, each bit representing either a one or a zero, quantum computers are made from some number of qubits (quantum bits). A qubit represents, in some sense, both one and zero simultaneously (a quantum superposition of 1 and 0). This ability to occupy multiple states at once is what gives quantum computers, for certain classes of problems, exponentially more computing power than traditional computers.
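As a rough illustration (a classical sketch in Python/NumPy, not Microsoft's tooling), a qubit can be modeled as a pair of complex amplitudes, and an n-qubit register needs 2^n of them:

```python
import numpy as np

# A qubit is a unit vector of two complex amplitudes; |alpha|^2 and
# |beta|^2 are the probabilities of reading out 0 and 1 respectively.
zero = np.array([1.0, 0.0])

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
plus = H @ zero
print(np.abs(plus) ** 2)  # [0.5 0.5]: equal chance of reading 0 or 1

# An n-qubit register needs 2**n amplitudes, which is why classically
# simulating even a few dozen qubits gets expensive so quickly.
n = 10
register = np.zeros(2 ** n)
register[0] = 1.0  # every qubit in the 0 state
print(register.size)  # 1024
```

The exponential growth of that amplitude vector is also what makes the simulator memory figures discussed later unavoidable.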

Traditional computers are built up of logic gates—groups of transistors that combine bits in various ways to perform operations on them—but this construction is largely invisible to people writing programs for them. Programs and algorithms aren't written in terms of logic gates; they use higher-level constructs, from arithmetic to functions to objects and more. The same is not really true of quantum algorithms; those developed so far are in some ways more familiar to an electronic engineer than to a software developer, with algorithms often represented as quantum circuits—arrangements of quantum logic gates through which qubits flow—rather than in more typical programming language terms.

Microsoft's quantum programming language—as yet unnamed—offers a more familiar approach to programming quantum computers, borrowing elements from C#, Python, and F#. Developers will still need to use and understand quantum logic gates and their operations, but they'll be able to combine them into functions, with variables, branches, and other typical constructs. As an example, a program to perform quantum teleportation is offered as a kind of "Hello, World!" for quantum computing.

The teleportation program has a couple of functions, EPR and Teleport, along with a third, TeleportTest, to check that Teleport works. EPR creates an EPR pair of entangled qubits, using a Hadamard gate (H), which puts a qubit into an equal superposition of 1 and 0, and a controlled-NOT gate, which entangles the two qubits to make the EPR pair. The Teleport function entangles two qubits and then measures (with M) the value of one of them. Measurement forces a qubit to take a specific value instead of remaining in a superposition of both.
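Microsoft's actual snippet isn't reproduced here, but the same protocol can be sketched as a classical state-vector simulation in Python. The helper names (`op`, `cu`) and the deferred-measurement trick (applying the X/Z corrections as controlled gates instead of measuring first) are illustrative choices, not part of Microsoft's language:

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard
X = np.array([[0, 1], [1, 0]])                # NOT
Z = np.array([[1, 0], [0, -1]])               # phase flip

def kron_all(mats):
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def op(gate, target, n=3):
    """Lift a single-qubit gate onto qubit `target` of an n-qubit register."""
    return kron_all([gate if i == target else I for i in range(n)])

def cu(control, target, gate, n=3):
    """Controlled version of `gate` (a controlled-NOT when gate=X)."""
    P0 = np.array([[1, 0], [0, 0]])  # |0><0| projector
    P1 = np.array([[0, 0], [0, 1]])  # |1><1| projector
    return (kron_all([P0 if i == control else I for i in range(n)]) +
            kron_all([P1 if i == control else (gate if i == target else I)
                      for i in range(n)]))

# Message qubit (qubit 0) in an arbitrary state a|0> + b|1>
a, b = 0.6, 0.8
state = np.kron(np.array([a, b]), np.array([1.0, 0, 0, 0]))  # |psi>|00>

# EPR: entangle qubits 1 and 2 (Hadamard, then controlled-NOT)
state = cu(1, 2, X) @ op(H, 1) @ state
# Teleport: Bell-basis rotation on qubits 0 and 1...
state = op(H, 0) @ cu(0, 1, X) @ state
# ...then the X/Z corrections, applied here as controlled gates rather
# than after a measurement (the "deferred measurement" trick)
state = cu(0, 2, Z) @ cu(1, 2, X) @ state

# Qubit 2 now carries a|0> + b|1>; P(measuring 1) should be b^2 = 0.64
p_one = sum(abs(state[i]) ** 2 for i in range(8) if i % 2 == 1)
print(round(p_one, 6))  # 0.64
```

After the final step, the message state has been transferred to qubit 2, which is the essence of what the Teleport function demonstrates.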

The language is integrated into Visual Studio. This means not only color coding but also other Visual Studio features, such as debugging. Debugging programs means, of course, that you have to be able to run them. With quantum computers something of a rarity these days, Microsoft is also going to release two versions of a quantum simulator: one will run locally, the other on Azure. The simulator will be able to run quantum programs and offer something comparable to the traditional Visual Studio debugging experience so that algorithms can be inspected as they run.

The simulator will have quite significant memory requirements, because each additional qubit doubles the amount of memory needed. The local version will offer up to 32 qubits, which will require 32GB of RAM; the Azure version will scale up to 40 qubits.
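That scaling follows directly from storing one amplitude per basis state. A back-of-the-envelope check in Python, assuming 8 bytes per amplitude (an assumption that happens to match the 32 qubits / 32GB figure):

```python
def simulator_memory_bytes(qubits, bytes_per_amplitude=8):
    """A full state-vector simulator stores one amplitude per basis state."""
    return (2 ** qubits) * bytes_per_amplitude

GIB = 2 ** 30
print(simulator_memory_bytes(32) // GIB)  # 32 (GB), matching the local simulator
print(simulator_memory_bytes(33) // GIB)  # 64: one more qubit doubles it
print(simulator_memory_bytes(40) // GIB)  # 8192, i.e. roughly 8TB for 40 qubits
```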

Longer term, of course, the ambition is to run on a real quantum computer. Microsoft doesn't have one yet, but it's working on one: the company is researching the development of topological qubits. These are appealing because they help address a consistent difficulty faced by quantum computers: quantum states are delicate, and quantum noise introduces errors into calculations. Additional qubits can be added to algorithms to correct for this noise. The number of extra qubits varies according both to the algorithm and to the kind of quantum machine being used; in principle, topological qubits will need far fewer error-correcting qubits than other quantum computer designs.

This reduction in the number of qubits required is valuable because current prototype quantum computers are far smaller than the size required to do useful work. The number of qubits necessary depends on the application and the field. Microsoft is envisaging a wide range of applications, from quantum chemistry to materials science to drug development and climate change research. In chemistry, a few hundred qubits could be used for, say, helping develop catalysts for nitrogen fixation; materials science applications might require a few thousand.

But even these numbers are huge compared to the quantum computers that have actually been built. Last year, for example, IBM offered access to a 5-qubit computer. Microsoft's hope is that the greater error protection offered by topological qubits will—eventually—make it easier to build quantum computers that are big enough for real work.

One awkward spectre is what happens if someone does manage to build a large quantum computer. Certain kinds of encryption gain their security from the fact that integer factorization—breaking a number down into smaller numbers that, when multiplied together, produce the original number—is believed to be hard for traditional computers. RSA encryption uses large integers—2,048 or 4,096 bits, typically—that are produced by multiplying together two large prime numbers, and RSA gains its strength from the fact that this multiplication is in a sense "irreversible"; given the product of the two primes, it's really hard to figure out what the primes were in the first place. Another kind of encryption, Diffie-Hellman, has a similar property; the mathematical problem is slightly different (it's the discrete logarithm), but the same concept applies: we don't have a good algorithm for solving it on traditional computers.

We do, however, have a good algorithm for quantum computers. For it to be useful, we'd need much larger quantum computers than we have today—factorizing a number with n bits requires about 2n qubits, so we're talking something like 4,000 to 8,000 qubits to break common encryption today—but if the technology were developed to build quantum computers with a few thousand qubits, these encryption algorithms would become extremely vulnerable.
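The arithmetic behind those figures is just the article's ~2n rule of thumb, which counts logical qubits only and ignores error-correction overhead:

```python
def shor_logical_qubits(key_bits):
    # Rule of thumb from the article: factorizing an n-bit number
    # requires about 2n qubits (before any error-correction overhead).
    return 2 * key_bits

for bits in (2048, 4096):
    print(bits, "->", shor_logical_qubits(bits))  # 4096 and 8192 qubits
```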

In tandem with the work to develop quantum computers and quantum programs, Microsoft is also researching so-called quantum-resistant encryption algorithms. These are algorithms designed to run on traditional computers that remain secure even in a world with large quantum computers. Getting these algorithms developed, proven, and widely deployed will take many years.

That quantum computing future is, fortunately, still likely to be many years off. Microsoft is taking sign-ups for its quantum preview today. The preview, with the simulator and the programming language, will be released by the end of the year. It'll also include tutorials and libraries to help developers become familiar with quantum computing.

However many years from now when quantum computers become more practically available, this move will be seen as either astoundingly visionary, or utterly silly.

I'm going to predict 'visionary' - by getting a head start on exposure to the programs people write to implement quantum algorithms, Microsoft can start studying quantum program compilers. They can identify and implement optimizations and transformations for things like operation ordering, 'register' (qubit) assignment, etc. If they can generate programs that require shorter coherence times or fewer qubits than other implementers, they have a huge competitive advantage for as long as machine scalability is limited. Once that limitation relaxes, they'll have a solidly established place in the market, and customers may still prefer MS 'Q#' over alternatives if it persists in having lower resource requirements.

On a more serious note, did they say anything about how well they expect the simulators to model the performance profile of actual quantum computers? If you try out two different approaches to a solution, will the faster one in the simulator necessarily be the faster one on actual hardware when that becomes available?

A big issue Microsoft seems to have is that when they have these visionary moments and get a huge head start on something over the rest of the industry, they then usually sit on it or drag their feet without ever releasing a product, or they release the finished product long after competitors have taken over the space.

They gravitate towards words that are common in language but uncommon in application. So something like Hyperspace, Logic, or Quasar; of course I doubt any of these would be the winner, but you get the idea.

Two more classic examples are Word and Excel. Or, going back even further, Basic. C#, though, is more directly in this area, and it's not clear it's so much a "product" per se.

Basic was created in 1964 by John Kemeny and Thomas Kurtz; MS had nothing to do with its name.

The snippet provided looks incredibly low-level, comparable to a rudimentary assembly language. This is clearly just the first step of many; it's going to take decades more research to get to something like the high-level languages used on classical computers.

At the same time, the abstraction gap between raw machine code and ASM is similar in size to the one between ASM and highly abstracted languages like SQL, so it's still a huge step forward.

... and as someone who's worked in both raw machine code and ASM, the latter saves a lot of hassle and time. It's not even the initial writing; with enough practice in raw MC you can write at something close to ASM coding speed. But writing code ends up being a relatively small part of programming: you read code a lot more than you write it. Once you get to "useful" scale, the difference becomes a lot more apparent, especially reading code someone else wrote, or that you wrote some months or years back.

It seems to me that going from a bit that can hold 0 or 1 to a bit that can hold 0 and 1 at best doubles the computing power.

What it has is a magnitude for how probable it is that its value is a 1, which implies the complementary probability of it being a 0. So it's a lot more than double the information richness. More importantly, the speed at which the overall device can resolve the outcome of the interaction of these magnitudes means a lot of computation is done in a VERY short time.

I got an Azure subscription last year for my own learning purposes. It was a bit torturous... namely, expensive and complicated.

I do hope developers will get access to quantum computers without something like Azure. All the big players are pursuing quantum computing, so surely one of them will offer an affordable price and easy API (too much to hope for?).

In the meantime, I will definitely be looking forward to Q# in Visual Studio. Wrapping your head around QM programming is a general skill... so whether I eventually use Azure or another IDE/service, it's good to get started with it now in VS.

... I guess another way QM computing could go for small-time developers is QM chips. For example, today there are USB machine learning / neural network sticks. Since the engineering is still in its early(ish) phases, I could see this taking another 10 years at least.

No, for the right type of problem, one that fits into the N qubits it has, it's 2^N times faster.

Where quantum computers make a killing is on a class of problems called NP-complete. In layman's terms, these are problems that have exponentially many possible solutions, but where each possible solution can be quickly verified.

These are impossible to solve precisely on a normal computer because the total number of possibilities to check is impossibly high. A quantum computer can make use of each qubit being both 0 and 1 to try all of them at the same time.

I'm not sure why this would be different than the history of binary digital computers, although maybe at somewhat different timescales. Remember that it took three decades to go from the first "useful" computer to something "useful that I can personally own and park on my desk".

EDIT: Even if progress ends up being faster than that, it seems very likely you'll go through a similar progression, where you've got a dedicated site for a machine that needs a lot of support resources, and the economics are such that lots of different parties need to timeshare it because it's so expensive.

My concern is that it'll play out differently because circumstances are certainly different. Market pressures are different.

These days big corporations have no need to share their technology or even give you something physical. They can rent out their tech and hardware as a service. There may be little to no need for R&D on small affordable quantum computers.

Very, very few people back in the '50s envisioned personal computers, or any purpose for them. Hell, even in the '70s there weren't many people who thought it'd be a thing, and the visionaries who did had a pretty fuzzy picture themselves of exactly what the use would be.

But yeah, I expect for some time if you want to run a real program doing real things you'll need access to real expensive hardware.

The Azure-like services are actually the solution for democratizing access until such time as more economical, compact, consumer-friendly products can roll out.

So I have a basic grasp of computational complexity theory, and I understand that Shor's algorithm is asymptotically faster than anything we can do on a normal computer; for huge numbers, that's almost always more important than any constant factor. However, doesn't the ability to break encryption also assume that quantum computers will be fast? If they have a low "clock speed", it might still be impractical to break RSA and DH.

I guess what I'm asking is: what is the equivalent of "clock speed" for a quantum computer, and how fast are existing quantum computers by that metric?

Because of the 2^N speedup, we will see quantum computers with "strangely" many qubits running in parallel. That is in contrast to modern computers, which typically run with 64 bits at clock frequencies around 1e9 Hz (GHz level).

The actual clock speed doesn't matter as much when each added qubit doubles your processing power. In fact, you can be really slow, but if you have, for example, 128 qubits, that ~3e38 factor is going to blow any classical computer out of the water even if it runs at a very slow speed.
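For what it's worth, the 3e38 figure above is just 2^128:

```python
n = 2 ** 128
print(n)           # 340282366920938463463374607431768211456
print(f"{n:.2e}")  # 3.40e+38, the figure quoted above
```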

"Small" problems will still be the domain of classical computers, probably forever. But once the problems are big enough, slow quantum computers will be far superior to GHz or THz classical machines (and no, a GPU running with hundreds, thousands, or even millions of cores still won't beat a quantum computer with 64 qubits).

More on "how fast are they"... quantum computing can be implemented in many different ways. Those solutions all have their own pros and cons which result in different speeds and engineering problems. See here for example.

AFAIK the whole paradigm of quantum machines is different. You have a set of initialized qubits and a set of quantum logic gates, and you get a probabilistic result at the end when you read the qubits out into classical bits. You run this multiple times to become confident in the answer. Obviously this lends itself to certain algorithms, and is hardly comparable to classical computing.
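The run-it-many-times workflow described above can be mimicked classically (purely illustrative; `measure` is a made-up helper, not any real quantum API):

```python
import random

def measure(p_one):
    """One probabilistic readout of a qubit with P(1) = p_one."""
    return 1 if random.random() < p_one else 0

random.seed(0)  # deterministic for the example
shots = [measure(0.8) for _ in range(1000)]
estimate = sum(shots) / len(shots)
print(estimate)  # close to 0.8; confidence grows with more shots
```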

Back when Intel first realized they could connect the backplanes of a bunch of 286 boards and create an actual parallel platform, the problem was that, well, nobody actually knew what to do with one. Before that, people had invented a number of parallel processing languages that didn't really pan out once faced with actual hardware implementations. Neural networks less so, but still.

I get that MS desperately wants to reacquire market leadership in something (anything), and gosh, they get to use 'qubit' in a product even, and I'm sure they have the spare dollars to blow on developer time, but really?

If quantum computing is marketing B.S., then I guess IBM and MIT are also in on it.

This is exactly their problem. They have such great ideas, but they're always half in, half out. They don't give their best until they see a competitor pass them by.