Slow Uptake for Apple’s Deep Learning Tools

When Apple unveiled iOS 10 this past summer, the company made a big deal of how it was baking artificial intelligence into the operating system. Among the changes were features that make it easier for developers to create apps that can recognize the content of images and understand natural language, powered by deep learning software. It represented an effort by Apple to keep pace with Google, whose TensorFlow technology is getting more popular with developers.

But so far Apple’s changes, delivered through two deep learning application programming interfaces, aren’t drawing much interest from developers, owing to several shortcomings that limit their appeal.

Apple’s senior vice president of software engineering, Craig Federighi, at WWDC in June. Photo by Bloomberg.


Apple’s deep learning APIs, despite their limitations, show that the company is taking deep learning seriously, said Mr. Mierswa. “By putting deep learning on devices, Apple is putting a stake in the ground and saying we support this, and we think it’s important,” he said.