
How Machine Learning Fueled Nvidia Stock's 15% Jump

Nvidia CEO Jen-Hsun Huang announced a partnership with Baidu to develop an end-to-end computing platform for self-driving cars.

Aaron Pressman

November 11th, 2016

Nvidia used to be a little company making graphics chips for PCs, but it’s well on the way to transforming into one of the leading computing platforms for cloud servers, machine learning, and artificial intelligence.

Fortunately for Nvidia, it turns out that the kinds of tasks graphics chips are good at—like processing many, many simple calculations at the same time—are just what’s needed to run analysis programs in a cloud data center, steer a self-driving car, or pilot an automated drone.
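The workload pattern described above — the same simple arithmetic applied independently to millions of values — is what makes graphics chips a natural fit for machine learning. A minimal sketch in Python with NumPy (an illustration of the data-parallel style, not Nvidia's actual software stack) shows the idea:

```python
import numpy as np

# The kind of workload GPUs excel at: one simple operation
# (here, a multiply-add, y = a*x + b) repeated independently
# over a large array of values.
rng = np.random.default_rng(0)
x = rng.random(1_000_000)
a, b = 2.0, 1.0

# Serial version: one element at a time, the way a single
# processor core would step through the work.
y_serial = np.empty_like(x)
for i in range(x.size):
    y_serial[i] = a * x[i] + b

# Data-parallel version: the whole array in one expression.
# NumPy dispatches this to vectorized native code; a GPU would
# spread the same independent operations across thousands of cores.
y_parallel = a * x + b

# Both routes compute identical results.
assert np.allclose(y_serial, y_parallel)
```

Because each element's result depends on no other element's, the work can be split across as many processing units as are available — which is why chips built to shade millions of pixels at once also speed up neural network training.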

Thursday brought more evidence that the company's successful transition is in full swing. Nvidia (NVDA) reported third quarter results that blew through Wall Street expectations, and its stock price, which had already doubled this year, rose another 15% in after-hours trading.

Revenue from the data center business nearly tripled to $240 million, fueled by sales of a new chip in its Tesla line that dramatically speeds up machine learning tasks. Sales of gaming graphics chips, boosted by demand from a new Nintendo (NTDOY) video game system, rose 63% to $1.2 billion.

And while those well-established units are doing quite nicely, the company’s effort to make chips for self-driving cars, virtual reality gear, and other smart devices is still at a nascent stage, with lots of potential ahead.

Speaking on a call with analysts, CEO Jen-Hsun Huang said the graphics processing chip business has reached a "tipping point" where gaming is no longer the only market. With demand from cloud and corporate data centers, PCs, gaming consoles, and smart devices, "it's no longer a niche component," he said.

The company's growing data center chip sales still pale in comparison to those of Intel, the market leader. Intel reported data center chip sales of $4.5 billion for the third quarter, nearly 20 times Nvidia's sales in the quarter. But Intel's sales grew just 10%, and its lead is shrinking: a year ago, Intel's data center business was more than 50 times bigger than Nvidia's.

Intel (INTC) has tried to short-circuit Nvidia's data center push by offering a very different kind of chip. Thanks to its purchase of Altera last year, Intel is pitching a type of chip known as a field-programmable gate array, or FPGA, that can be reprogrammed for different specialized tasks on the fly.


Traditionally, for specialized computing tasks like sorting big data or doing facial recognition on photos, companies have used application-specific integrated circuit, or ASIC, chips, whose circuit paths are laid down permanently and can't be changed. FPGAs are more flexible and could gain an advantage if a data center operator changes the algorithms it uses to perform those critical tasks.

Huang was almost completely dismissive of the effort, given the success of Nvidia’s new Tesla chip, an ASIC design for speeding up the performance of deep learning tasks. At one point, Huang jokingly said the new chip was so much faster, it was like giving users a time machine.

“FPGA is what you use when the volume is not large, FPGA is what you use when you’re not certain about the functionality you want to put into something,” he said on the call with analysts. “You can build an ASIC, a custom chip, that obviously can deliver more performance—not 20% more performance, but 10 times better performance and better energy efficiency.”

Intel has also been trying to catch up by developing better artificial intelligence functions in its chips. In August, Intel bought Nervana Systems, a 2-year-old startup considered among the leaders in developing machine learning technology.

But Huang touted a growing ecosystem of developers and startups in the field choosing to rely on Nvidia's chip platform. Worldwide, some 1,500 companies in AI are using Nvidia chips, he said.

Still, Nvidia has a long way to go to catch Intel. It’s trailing in the current market, but it may have the more desirable products for where the market is going.