Just as Wall Street was facing financial meltdown, Microsoft turned up at the 2008 High Performance on Wall Street conference with the idea of using Windows for what we used to call supercomputing. The event marked the RTM (release to manufacturing) of the latest HPC (high performance computing) version of Windows Server 2008, and followed the launch of a cheap Cray CX1 supercomputer designed to run it. In a business where machines can cost $100m, "cheap" starts at around $25,000. Other machines will follow from companies such as HP, IBM and Dell (bit.ly/mshpc).

This doesn't mean Microsoft expects to take over the supercomputing business, where Linux has something like a 95% market share. Very few real supercomputers run Windows, though one recently posted a top-25 benchmark result of 68.5 teraflops. (Strictly speaking, it was a Linux supercomputer at America's National Center for Supercomputing Applications; the NCSA simply reran the benchmark under Windows.)

According to Kyril Faenov, general manager of Microsoft's HPC team, the idea is to deliver an integrated system that works at a departmental level, running existing Windows software and utilising existing Windows skills. It's a sort of deskside PC vision of a supercomputer, rather than an enterprise-wide or national resource.

Says Faenov: "Even small companies can take advantage of high-performance computing. We're taking it to the mainstream." For example, Faenov says Excel 2007 supports "transparent parallelism", so an HPC machine can speed up everyday workflows at financial institutions, where some spreadsheets take hours or even days to run. (Crazy, I know.)
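The speed-up Faenov is describing rests on a simple observation: most long-running spreadsheets are rows of independent formulas, so the rows can be recalculated at the same time. A minimal Python sketch of that idea (hypothetical formula and names; Excel's actual HPC integration distributes work across cluster nodes rather than local workers):

```python
from concurrent.futures import ThreadPoolExecutor

def revalue_position(principal):
    # Stand-in for one slow spreadsheet row, e.g. compounding a
    # position at 5% over ten years (hypothetical formula).
    return sum(principal * (1.05 ** year) for year in range(1, 11))

def recalculate(principals):
    # Serial recalculation: one row after another.
    return [revalue_position(p) for p in principals]

def recalculate_parallel(principals, workers=4):
    # The rows are independent, so they can be farmed out to workers.
    # Threads keep the sketch short; real speed-ups need processes
    # or, in Excel's HPC case, separate cluster nodes.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(revalue_position, principals))
```

The parallel version returns exactly what the serial one does; the point is that nothing in one row depends on another, which is what lets a cluster help.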

Windows HPC is based on the same Windows core as Vista SP1 and Server 2008, so programs can be developed in Microsoft Visual Studio. And Faenov says a lot of supercomputing programs "move very easily to Windows: usually the code has been abstracted to use only a few operating system specific calls".
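The "few operating system specific calls" Faenov mentions are typically isolated behind a thin portability layer, so porting means rewriting only that layer while the numerical code stays untouched. A hypothetical sketch of the pattern (function names are illustrative, not from any real HPC codebase):

```python
import os

# Hypothetical portability layer: every OS-specific decision lives
# here, so the numerical code below never touches the platform.

def scratch_dir():
    # Windows and Unix expose scratch space differently; the rest
    # of the program just calls scratch_dir() and never asks which.
    if os.name == "nt":
        return os.environ.get("TEMP", r"C:\Temp")
    return os.environ.get("TMPDIR", "/tmp")

def core_count():
    # Used to decide how many workers to launch on either platform.
    return os.cpu_count() or 1

def simulate(steps):
    # Portable numerical kernel: nothing in here is OS-specific,
    # so it compiles and runs unchanged after a port.
    value = 0.0
    for i in range(steps):
        value += 1.0 / (i + 1)
    return value
```

Code structured this way "moves very easily" in exactly the sense Faenov describes: the kernel is untouched and only the small layer at the top changes.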

If your financial world is collapsing, I can understand the appeal of being able to run financial analyses in a hurry. But people may soon get their own supercomputing facilities by exploiting their PC's graphics processors.

Nvidia has been talking up GPUs for a couple of years, and Nvidia's chief scientist David Kirk told a conference: "If you think about it, this is a massively parallel supercomputer on your desktop. It is truly the democratisation of supercomputing."

The potential is already visible in the results from cooperative projects such as Folding@Home (bit.ly/fathome). The problem is that programs written for CPUs don't usually have any way of accessing GPU power, except for mundane, repetitive tasks such as pixel shading and playing DVDs.
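Pixel shading is the textbook case of data parallelism: the same tiny function applied independently to millions of pixels, which is exactly the shape of work a GPU's hundreds of cores are built for. A CPU-side Python sketch of the pattern (a hypothetical brightness shader; on a real GPU this would be written in a shader language or CUDA):

```python
def shade(pixel):
    # The same small function runs on every pixel independently --
    # here, a hypothetical brightness boost clamped to 8 bits.
    r, g, b = pixel
    return (min(r + 40, 255), min(g + 40, 255), min(b + 40, 255))

def render(frame):
    # A CPU loops over the pixels one by one; a GPU would shade
    # thousands of them simultaneously. The work items are identical
    # and independent, which is what makes the job GPU-friendly.
    return [shade(p) for p in frame]

frame = [(10, 20, 30), (250, 0, 128)]
print(render(frame))  # → [(50, 60, 70), (255, 40, 168)]
```

The same structure fits any "mundane, repetitive" workload, which is why the idea of pointing it at spreadsheet recalculation is not as odd as it sounds.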

But that could change now that AMD (Intel's main rival) owns ATI (Nvidia's main rival). According to Ian McNaughton, a senior manager with AMD in the UK, AMD is planning multi-core processors with both CPU and GPU cores on the same chip. For consumers who need fast graphics for video and games, he says, a quad-core chip with two CPU cores and two GPU cores would make more sense than one with four CPU cores. No doubt Intel and Via are thinking along the same lines.

It should certainly be possible to enable an on-chip GPU to handle the sort of repetitive parallel processing that would race through an Excel spreadsheet. In which case, the market for deskside supercomputers may not be as big or as profitable as Microsoft and its PC partners hope.