eBay open-sources HeadGaze so you can control an iPhone X screen with head movements

eBay has lifted the lid on new technology that helps physically impaired users interact with their iPhone X screen through head movements.

With HeadGaze, the ecommerce giant leverages Apple’s ARKit platform — which was designed to help developers build augmented reality apps — and the iPhone X’s TrueDepth front-facing camera (which enables Face ID) to allow applications to track a user’s head motions so they can guide an on-screen cursor. HeadGaze was developed by eBay’s computer vision team and guided by Muratcan Cicek, an eBay intern and PhD candidate who describes himself as having “extensive motor impairments.”
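The core idea is mapping head rotation, as reported by the face-tracking camera, to a point on screen. A minimal sketch of that mapping, in Python for illustration (the real app is Swift/ARKit): it assumes head pose arrives as yaw/pitch angles, roughly what ARKit's face anchor exposes, and the linear mapping and 15-degree range are illustrative choices, not HeadGaze's actual calibration.

```python
import math

def head_pose_to_cursor(yaw, pitch, screen_w, screen_h,
                        max_angle=math.radians(15)):
    """Map head rotation (radians) to a screen point.

    Assumes yaw/pitch head angles like those ARKit's face anchor
    provides; the linear mapping and 15-degree range are hypothetical.
    """
    def clamp01(v):
        return max(0.0, min(1.0, v))

    # Yaw (turning left/right) drives x; pitch (nodding up/down) drives y.
    x = clamp01(0.5 + yaw / (2 * max_angle)) * screen_w
    y = clamp01(0.5 - pitch / (2 * max_angle)) * screen_h
    return x, y
```

A neutral head pose lands the cursor at the center of the screen, and turning the head all the way to one side pins it to the corresponding edge.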

The technology isn’t actually available in eBay’s app yet, though the company has showcased it working there — and it is making the underlying technology available on GitHub for anyone to use.

How it works

In a nutshell, a virtual stylus follows the user’s head motions so they can move the cursor toward scroll bars, buttons, and other interactive elements. To activate a click, the technology detects how long the cursor has been in one spot and then triggers the desired action. So if someone wanted to click a “Buy it Now” button on eBay, for example, they would move the cursor onto the button with their head, and the “click” would happen automatically after a couple of seconds.
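The dwell-to-click mechanism described above can be sketched as a small state machine: a “click” fires once the cursor has rested on the same element long enough. This is a Python illustration under stated assumptions — the class name, two-second threshold, and per-frame update API are hypothetical, not taken from eBay’s HeadGaze source.

```python
class DwellClickDetector:
    """Sketch of dwell-based clicking: hover on one element long
    enough and a click fires. Names and threshold are illustrative."""

    def __init__(self, dwell_threshold=2.0):
        self.dwell_threshold = dwell_threshold  # seconds of hover required
        self.current_target = None              # element under the cursor
        self.dwell_start = None                 # when the hover began

    def update(self, target, now):
        """Feed the element under the cursor (or None) and a timestamp
        each frame. Returns the element when a click should fire."""
        if target is None:
            self.current_target = None
            self.dwell_start = None
            return None
        if target != self.current_target:
            # Cursor moved onto a new element: restart the dwell timer.
            self.current_target = target
            self.dwell_start = now
            return None
        if (self.dwell_start is not None
                and now - self.dwell_start >= self.dwell_threshold):
            self.dwell_start = None  # fire only once per dwell
            return target
        return None
```

In use, each camera frame would feed the detector the element under the head-driven cursor; when it returns, say, the “Buy it Now” button, the app triggers that button’s action.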

Above: eBay’s HeadGaze

Though a major driving force behind HeadGaze is its potential to help people who have restricted use of their hands, Cicek is quick to note that it could have other use cases further down the line.

“HeadGaze enables you to scroll and interact on your phone with only subtle head movements — think of all the ways that this could be brought to life,” he said. “Tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Too messy to follow the how-to manual on your cell phone while you’re tinkering with the car engine under the hood? Too cold to remove your gloves to use your phone?”

Cicek added that the company is also looking at ways to track eye movements, which could open more possibilities for hands-free mobile phone controls. “The fusion of these gazing experiences opens up a broader possibility on defining various hands-free gestures, enabling much more interesting applications,” he said.

Target market

Around 15 percent of people “experience some form of disability” globally, according to World Bank data. This translates to roughly 1 billion people — and 20 percent of those “experience significant disabilities.” And it represents a substantial market: “If people with disabilities were a formally recognized minority group, at 19 percent of the population, they would be the largest minority group in the United States,” according to a report the Institute on Disability published in 2011.

Against that backdrop, technology companies are increasingly investing in improved accessibility for their various products. Airbnb recently introduced 21 new accessibility filters to make it easier for those with disabilities to find suitable accommodation, while Google added “wheelchair accessible” routes to transit navigation in Google Maps. A few months back, Microsoft announced its Xbox Adaptive Controller, which helps people with limited mobility play games, and it finally went on sale last week.

Elsewhere, Microsoft recently committed $25 million to its AI for Accessibility program, Facebook said it’s collecting data from its disabled users to inform design decisions, and Google announced that it’s bringing native hearing aid support to Android.

eBay’s latest initiative fits neatly into that broader push to open up access to products, and making the source code available on GitHub could lead to some interesting use cases in the future.

“It is because of HeadGaze’s potential to make a tremendous impact on the lives of many people that we are open-sourcing this tool,” Cicek said. “We want to encourage developers to build more apps that don’t require screen touch.”