When Apple CEO Tim Cook introduced the iPhone X on Tuesday, he claimed it would “set the path for technology for the next decade.” Some new features are superficial: a near-borderless OLED screen and the elimination of the traditional home button. Deep inside the phone, however, is an innovation likely to become standard in future smartphones, and crucial to the long-term dreams of Apple and its competitors.

That feature is the “neural engine,” part of the new A11 processor that Apple developed to power the iPhone X. The engine has circuits tuned to accelerate certain kinds of artificial-intelligence software, called artificial neural networks, that are good at processing images and speech.

Apple said the neural engine would power the algorithms that recognize your face to unlock the phone and transfer your facial expressions onto animated emoji. It also said the new silicon could enable unspecified “other features.”

Chip experts say the neural engine could become central to the future of the iPhone as Apple moves more deeply into areas such as augmented reality and image recognition, which rely on machine-learning algorithms. They predict that Google, Samsung, and other leading mobile-tech companies will soon create neural engines of their own. Earlier this month, China’s Huawei announced a new mobile chip with a dedicated “neural processing unit” to accelerate machine learning.

“I think you’re going to see them everywhere for sure,” says Eugenio Culurciello, a professor at Purdue who works on chips for machine learning. Patrick Moorhead, an analyst at Moor Insights & Strategy, agrees. He expects Samsung and leading mobile chipmaker Qualcomm to offer the most serious competition to Apple’s neural engine, and to also see a mobile AI chip design from Google. “There’s a myriad of things silicon like this could do,” Moorhead says. In particular, he says the new hardware could advance Apple’s ambitions in healthcare, by helping an iPhone analyze data from a user’s Apple Watch. Apple said Tuesday that it is working with Stanford researchers to test an app that detects abnormal heart rhythms.

Apple has released little detail on its neural engine and did not respond to a request for more information. Culurciello says Apple’s new silicon could improve the iPhone’s ability to understand your voice and the world around you.

Applications such as Siri have gotten much better at recognizing speech in the past few years as Apple, Google, and other tech companies have rebuilt their speech-recognition systems around artificial neural networks. Neural networks also power the feature that allows you to search your images in Apple Photos using terms such as “dog.”

Custom circuits like those of Apple’s neural engine allow machine-learning algorithms on a phone to analyze data more quickly, and reduce how much they sap a device’s battery. Culurciello says that could open new uses of machine learning and image recognition on the iPhone, because more powerful algorithms can be deployed right in a user’s hand.

An augmented-reality app, such as the game Apple demonstrated Tuesday, needs to recognize and respond to objects in the physical world as quickly as possible, for example. Data can be analyzed more intensively in the cloud, but it takes time for the data to travel to the cloud and back. Plus, Apple prefers to process user data on the phone itself for privacy reasons. The neural engine in the iPhone X reduces the downside of that strategy, by bringing the phone slightly closer to the power of cloud hardware.

Leading tech companies already are battling to develop more powerful hardware for machine-learning algorithms running in the cloud. Google has developed custom chips called TPUs to boost the power and efficiency of algorithms used to recognize speech or images, for example. Microsoft, Intel, graphics chip giant NVIDIA, and many startups are all working on new ideas of their own.

Apple could nourish a powerful engine of the iPhone’s success by allowing third-party developers to tap into the neural engine inside its new phone. Convincing programmers and companies to spend time and money bringing new features and functions to the iPhone has been an efficient way for Apple to drive sales of its most important product.

In June, Apple announced new tools to help developers run machine-learning algorithms inside apps, including a new standard for neural networks called CoreML. Moorhead says it would be logical to connect that with the new AI hardware in the iPhone X. “I see a direct link between CoreML and the neural engine,” he says.
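For developers, that connection is largely invisible: an app describes what it wants a model to do, and the system decides what hardware runs it. A minimal sketch of the pattern, using Apple’s Vision framework on top of Core ML, might look like the following — here `FlowerClassifier` is an illustrative model name and `someImage` a placeholder, not real Apple assets:

```swift
import CoreML
import Vision

// Hypothetical sketch: classify an image with a Core ML model.
// "FlowerClassifier" stands in for any .mlmodel compiled into the app.
let model = try VNCoreMLModel(for: FlowerClassifier().model)

let request = VNCoreMLRequest(model: model) { request, _ in
    // Vision returns ranked classification observations.
    guard let results = request.results as? [VNClassificationObservation],
          let top = results.first else { return }
    print("\(top.identifier): \(top.confidence)")
}

// "someImage" is a placeholder CGImage supplied by the app.
let handler = VNImageRequestHandler(cgImage: someImage, options: [:])
try handler.perform([request])
```

The key point is that nothing in this code names a chip: the same call could run on the CPU, the GPU, or — if Moorhead’s guess is right — dedicated AI silicon, which is what would let Apple light up the neural engine for existing apps.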

Longer term, mobile hardware that can run machine learning software efficiently will be important to the future of autonomous vehicles and wearable augmented-reality glasses—ideas Apple has recently signaled interest in.

Tom Simonite is a senior writer for WIRED in San Francisco covering artificial intelligence and its effects on the world.