This Guy Controls an iPhone With His Eyes

Matt Moss had a chance to play around with the iOS 12 developer beta. In the process, he came to a neat realization: ARKit 2.0 opened up some unexpected possibilities.

“I saw that ARKit 2 introduced eye tracking and quickly wondered if it’s precise enough to determine where on the screen a user is looking,” he explained over on Twitter direct message. “Initially, I started to build the demo to see if this level of eye tracking was even possible.”
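Moss hasn’t shared his source code, but the ARKit 2 API he’s describing is straightforward to sketch. The face-tracking configuration exposes an `ARFaceAnchor` with a `lookAtPoint` property estimating where the eyes converge; something like the following (a rough illustration, not Moss’s actual implementation — mapping gaze to on-screen coordinates requires additional projection math that’s omitted here) is the starting point:

```swift
import ARKit
import UIKit

// Minimal sketch of reading ARKit 2's eye-tracking data.
// Face tracking requires a device with a TrueDepth camera.
class GazeViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // `lookAtPoint` estimates the point the user's eyes are
            // focused on, in the face anchor's coordinate space (meters).
            let gaze = faceAnchor.lookAtPoint
            // Converting this to screen coordinates needs the camera
            // transform and the device's physical screen dimensions.
            print("gaze (face space):", gaze)
        }
    }
}
```

Whether that raw estimate is stable enough to drive a cursor was exactly the open question Moss set out to answer.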

It turns out that it most definitely is, as the video demo he tweeted shows.

As Moss was quick to point out, the potential use for such tech isn’t just for the extremely lazy.

“Once the demo started to work, I began to think of all the possible use cases, the most important of which being accessibility,” he wrote. “I think this kind of technology could really improve the lives of people with disabilities.”

And while concerns over advertisers abusing extremely precise eye tracking are legitimate, Moss didn’t create the underlying technology — Apple built it into the new ARKit. He did, however, find a way to put it to a potentially positive use.