The Race to Power AI’s Silicon Brains

Chip startups see AI as a once-in-a-lifetime chance to build big businesses. Many won’t make it.

Nigel Toon, the cofounder and CEO of Graphcore, a semiconductor startup based in the U.K., recalls that only a couple of years ago many venture capitalists viewed the idea of investing in semiconductor chips as something of a joke. “You’d take an idea to a meeting,” he says, “and many of the partners would roll about on the floor laughing.” Now some chip entrepreneurs are getting a very different reception. Instead of rolling on the floor, investors are rolling out their checkbooks.

Venture capitalists have good reason to be wary of silicon, even though it gave Silicon Valley its name. Semiconductor chips cost far more to develop than software, and until recently there has been little room for radical innovations to distinguish new versions. Even if they survive, young companies often end up with profit margins thinner than the silicon wafers their chips are made from. Giant incumbents such as Intel and Nvidia are formidable competitors with deep industry knowledge and even deeper pockets.

What’s changed is a growing belief among some investors that AI could be a unique opportunity to create significant new semiconductor companies. Venture capitalists have invested $113 million in AI-focused chip startups this year—almost three times as much as in all of 2015, according to data from PitchBook, a service that tracks private company transactions.

Graphcore has been one of the beneficiaries of this shift, recently adding $50 million in funding from Sequoia Capital, a leading Silicon Valley venture firm. A number of other chip startups, including Mythic, Wave Computing, and Cerebras in the United States and DeePhi Tech and Cambricon in China, are also developing new chips tailored for AI applications. Cambricon, one of the most prominent Chinese startups in the field, has raised $100 million in an initial financing led by a Chinese government fund.

Ever since the advent of the mainframe, advances in computing hardware have triggered innovations in software. These, in turn, have inspired subsequent improvements in hardware. AI is the latest twist in this digital cycle. Companies in many industries have been investing heavily in hardware to run deep-learning systems (see “10 Breakthrough Technologies 2013: Deep Learning”). But as these become more sophisticated, they are exposing the limitations of existing chips used for AI work.

Many of those processors come from Nvidia, whose graphics chips are widely used to power games and graphics production. The processors contain thousands of tiny computing cores operating in parallel to render pixels. With some tweaks, they’ve been adapted to run deep-learning algorithms, which also involve very large numbers of parallel computations (see “Nvidia CEO: Software Is Eating the World, but AI Is Going to Eat Software”).
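To see why deep learning maps so naturally onto graphics hardware, consider that a fully connected neural-network layer is essentially one large matrix multiplication: every output value is an independent dot product, the same “many small computations at once” pattern as rendering pixels. The sketch below illustrates the idea in plain NumPy; the array sizes are arbitrary and not tied to any particular chip.

```python
import numpy as np

# A batch of 64 inputs, each with 1,024 features, passing through a
# layer with 512 output neurons.
batch = np.random.rand(64, 1024)
weights = np.random.rand(1024, 512)

# The matrix product below is 64 * 512 = 32,768 independent dot
# products -- each one could run on its own GPU core in parallel,
# just as each pixel of a game frame is shaded independently.
activations = np.maximum(batch @ weights, 0.0)  # ReLU nonlinearity
print(activations.shape)  # (64, 512)
```

Because none of those dot products depends on any other, a chip with thousands of simple cores can compute them simultaneously rather than one after another.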

Although they have been widely adopted, graphics chips have some drawbacks. One of the biggest is that when large numbers of them work in parallel they soak up a lot of energy. Carnegie Mellon University, a leading AI research center, has even had to ask researchers there to throttle back their use of the chips temporarily because they were putting a strain on the university’s power system. Franz Franchetti, a professor at CMU, says the university is looking at alternative power sources to alleviate the issue.

The AI chip startups are planning to produce more power-efficient processors. But what’s really energizing them is their belief that tailor-made processors for AI applications can beat less specialized chips at a wide range of machine-learning tasks. The new generation of chips combines multiple processing functions into a single step, whereas graphics processors take multiple steps to achieve the same result. The functions are typically bundled to optimize specific use cases, such as training algorithms to help an autonomous car spot potential obstacles ahead.
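The benefit of combining steps can be sketched in software. Below, the “unfused” version makes three separate passes over the data, writing an intermediate result each time, roughly how a general-purpose processor executes a layer, while the “fused” expression represents the single combined operation a special-purpose chip can bake into hardware. This is an illustrative analogy, not a description of any particular startup’s design.

```python
import numpy as np

x = np.random.rand(256, 256)  # input activations
w = np.random.rand(256, 256)  # layer weights
b = np.random.rand(256)       # per-neuron bias

# Unfused: three distinct steps, each producing an intermediate
# array that must be stored and re-read -- extra memory traffic
# and extra energy per result.
t1 = x @ w                 # step 1: matrix multiply
t2 = t1 + b                # step 2: add bias
unfused = np.maximum(t2, 0.0)  # step 3: apply ReLU

# Fused: the same multiply-add-activate expressed as one combined
# operation, the pattern an AI-specific chip performs in a single step.
fused = np.maximum(x @ w + b, 0.0)

print(np.allclose(unfused, fused))  # True
```

The two versions produce identical numbers; the difference is how many trips through memory it takes to get them, which is where the power savings come from.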

Graphcore claims that in preliminary tests its new “intelligence processing unit,” which will ship to early customers in the first quarter of next year, is between 10 and 100 times faster than current hardware at such tasks. China’s Cambricon is already winning plaudits for its processors. Huawei, a Cambricon customer, reckons that for deep-learning applications like training algorithms to identify images, the startup’s chips are six times faster than running the same function on a graphics processor.

Researchers are excited by the prospect of a significant leap forward in compute power for AI. “There’s still a big gap between where we are and what we’d like to do,” says Andrew Davison, a professor at Imperial College in the U.K. who focuses on robotics and computer vision. Davison thinks the innovations brought to market by the chip startups will accelerate progress in fields like his own.

Such reactions are encouraging, but they won’t guarantee victory. Big chip companies are already unveiling their own made-for-AI chips to compete with the startups’ offerings. Intel, for instance, recently announced plans to release a new family of processors designed with Nervana Systems, a startup it acquired last year. Nvidia is also moving quickly to upgrade the capabilities of its own chips.

The startups face another challenge. Many of them are designing hardware to support highly specialized AI applications. But it can take years to get a chip to market. Given the speed at which AI is evolving, there’s a real risk that by the time their products are widely available the uses for which they were designed will no longer be top of mind.

Shahin Farshchi of Lux Capital, which invested in Nervana and has a stake in Mythic, draws a parallel with startups building processors for 4G wireless applications in the mid-2000s. Many of these ended up failing because they optimized for applications that didn’t become mainstream. “There’s going to be a shakeout again for chip companies that are very narrowly focused,” he says.

But if young firms build chips that span too wide a set of application areas, they’ll likely sacrifice performance levels. And that could leave them vulnerable to competition from Nvidia, Intel, and others. Some may get bought by the chip giants. But if many end up failing, venture capitalists will start rolling up their checkbooks again.

I am the San Francisco bureau chief of MIT Technology Review, where I cover the future of computing and the companies in Silicon Valley that are shaping it. Before joining the publication, I led research and publishing at a venture capital firm focused on business technology. Prior to that, I worked for The Economist for many years as a reporter and editor, most recently as the paper’s West Coast-based tech writer.
