Breaking the noise barrier: Enter the phonon computer

By Justin Mullins

Noise is a chip designer’s worst enemy. But handled properly it could become a powerful ally – and usher in the age of phonon computing

In 2001, Pat Gelsinger, then the chief technology officer of Intel, made a striking prediction about the future of microchips. If current design trends continue, he said, microchips will be running at 30 gigahertz by the end of the decade. However, he added, at this speed they will be generating more heat per cubic centimetre than a nuclear reactor.

Sure enough, by 2003, Intel and other chip-makers had found that their plans for faster processors were running into trouble. For a chip to speed up, its transistors need to be shrunk, but smaller transistors must consume less power or they overheat. With chip-makers unable to keep to the reduced heat budget, the race for faster chips hit a wall (see diagram).

At best, today’s microprocessors can operate at just 3 GHz or so. To deliver a major performance boost, chip-makers have resorted to putting several processors, or cores, on the same chip. This keeps heat at manageable levels. Just.

Designing transistors that need far less power is, it turns out, no easy task. One of the main reasons is that microchips still require plenty of power to overcome electrical noise, which tends to flip the 1s and 0s in digital data, destroying information. The codes that computers rely on to transmit information have built-in checks to combat this.
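The simplest such check is a parity bit, which catches any single flipped bit in a word. This sketch is purely illustrative (real chips and links use far more elaborate error-correcting codes, such as Hamming or Reed-Solomon codes), but it shows the principle of spending extra bits to detect noise-induced flips:

```python
def add_parity(bits):
    """Append an even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """Return True if the word still has an even count of 1s.

    Any single bit flip makes the count odd, so it is detected;
    two simultaneous flips would slip through a lone parity bit.
    """
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
assert check_parity(word)         # transmitted cleanly

word[2] ^= 1                      # noise flips one bit
assert not check_parity(word)     # the flip is detected
```

More capable codes add enough redundancy not just to detect a flip but to locate and correct it, which is part of why low-power operation is hard: the redundancy itself costs energy.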

The effects of noise become more serious in chips that run at low power since the actual signals being handled by a chip become …