Why A.I. and Cryptocurrency Are Making One Type of Computer Chip Scarce

SAN FRANCISCO — Two technology booms — some people might call them frenzies — are combining to turn a once-obscure type of microprocessor into a must-have but scarce commodity.

Artificial intelligence systems, made by companies ranging in size from Google to the Chinese start-up Malong Technologies, rely heavily on a computer chip called a graphics processing unit, or G.P.U. The chips are also very useful in mining digital currencies like Ethereum, a Bitcoin alternative riding the same wave of hype as its more famous cousin.

With people and companies involved in the two surging tech niches buying up the same chips, G.P.U.s have been in short supply over the past several months. Prices have increased by as much as 50 percent, according to some resellers and customers.

“The chips are simply going out of stock,” said Matt Scott, a technologist from the United States who founded Malong after leaving Microsoft’s research lab in Beijing in 2014. “And the problem is getting worse.”

Malong, which is based in Shenzhen, China, is building a system that can analyze digital photos and learn to recognize objects. Doing so requires an enormous number of photos, and analyzing all these photos depends on the G.P.U. chip.

When the company recently ordered new hardware from a supplier in China, the shipment was delayed by four weeks. And the price of the chips was about 15 percent higher than it had been six months earlier.

“We need the latest G.P.U.s to stay competitive,” Mr. Scott said. “There is a tangible impact to our research work.”

But he did not blame the shortage on other A.I. specialists. He blamed it on cryptocurrency miners. “We have never had this problem before,” he said. “It was only when crypto got hot that we saw a significant slowdown in our ability to get G.P.U.s.”

G.P.U.s were originally designed to render graphics for computer games and other software. In recent years, they have become an essential tool in the creation of artificial intelligence. Almost every A.I. company relies on the chips.

Like Malong, those companies build what are called neural networks, complex algorithms that learn tasks by analyzing vast amounts of data. Large numbers of G.P.U.s, which consume relatively little electrical power and can be packed into a small space, can process the huge amounts of math required by neural networks more efficiently than standard chips.

Speculators in digital currency are snapping up G.P.U.s for a very different purpose. After setting up machines that help run the large computer networks that manage Ethereum and other Bitcoin alternatives, people and businesses can receive payment in the form of newly created digital coins. G.P.U.s are also efficient for processing the math required for this digital mining.

Crypto miners bought three million G.P.U. boards — flat panels that can be added to personal and other computers — worth $776 million last year, said Jon Peddie, a researcher who has tracked sales of the chips for decades.

That may not sound like a lot in an overall market worth more than $15 billion, but the combination of A.I. builders and crypto miners — not to mention gamers — has squeezed the G.P.U. supply. Things have gotten so tight that resellers for Nvidia, the Silicon Valley chip maker that produces 70 percent of the G.P.U. boards, often restrict how many a company can buy each day.

“It is a tough moment. We could do more if we had more of these” chips in our data centers, said Kevin Scott, Microsoft’s chief technology officer. “There are real products that could be getting better right now for real users. This is not a theoretical exercise.”

AMD, another G.P.U. supplier, and other companies say some of the current shortage is a result of a limited worldwide supply of other components on G.P.U. boards, and they note that retail prices have begun to stabilize. But in March, at his company’s annual chip conference in Silicon Valley, Nvidia’s chief executive, Jen-Hsun Huang, indicated that the company still could not produce the chips fast enough.

This has created an opportunity for numerous other chip makers. A company called Bitmain, for instance, has released a new chip specifically for mining Ethereum coins. Google has built its own chip for work on A.I. and is giving other companies access to it through a cloud computing service. Last month, Facebook indicated in a series of online job postings that it, too, was working to build a chip just for A.I.

Dozens of other companies are designing similar chips that take the already specialized G.P.U. into smaller niches. More companies producing chips means a greater supply and, presumably, lower prices.

“You want this not just for economic reasons, but for supply chain stability,” said Mr. Scott of Microsoft.

The market will not diversify overnight. Matthew Zeiler, the chief executive and founder of a computer-vision start-up in New York, said the prices of some of the G.P.U. boards that the company uses have risen more than 40 percent since last year.

Mr. Zeiler believes that Nvidia will be very hard to unseat. Many companies will stick with Nvidia’s technology because that is what they are familiar with, and because the G.P.U. boards it provides can do more than one thing.

Kevin Zhang, the founder of ABC Consulting, has bought thousands of G.P.U.s for mining various digital currencies. He said that a chip just for, say, mining Ethereum was not necessarily an attractive option for miners. It cannot be used to mine other currencies, and the groups that run systems like Ethereum often change the underlying technology, which can make dedicated chips useless.

Interest in digital currency mining could cool, of course. But the A.I. and gaming markets will continue to grow.

Mr. Zeiler said that his company had recently bought new G.P.U.s for its data center in New Jersey, but could not install them for more than a month because the computer racks needed to house the chips were in short supply as a result of the same market pressures.

“The demand,” he said, “is definitely crazy.”

Correction:

An earlier version of this article misstated the given name of a researcher who tracks graphics processing units. He is Jon Peddie, not Joe.