Eonite's inside-out tracking tech could be VR's next big breakthrough

I am standing in someone else's living room, doing squats. None of this is real. How do I know? Because around me, small drones hover like flies, and each time I look at one, I shoot it down with an uncontrollable optic blast.

By all accounts, this is probably one of the least surreal virtual reality experiences out there, but technologically it's remarkable. I am wearing an HTC Vive, but there are no trackers around the room, just one taped to the front of the headset. Were it not for the (soon-to-be-detachable) wire, I could roam wherever I pleased without being told off. Eonite believes the sensor it has built, the one on my face, is a breakthrough, and it might just be right.

Inside-out positional tracking, where movement is traced by the headset without any external sensors, is rare in VR right now. That's because it's more challenging to get right, but inside-out is much easier to set up - just put on the headset and you're away - and a hell of a lot more intuitive. "We as humans are designed as inside-out trackers," jokes Eonite co-founder Youssri Helmy.

We've tried inside-out VR at Wareable before, most recently the Pico Neo CV. But most of the systems on the market, including the "big three", use outside-in, meaning that even without wires your play space is defined by the limits of the sensors' vision.

"We're about making machines see the world," says Helmy. Eonite's other founder is mathematician and roboticist Anna Petrovskaya, who was part of the core team that built the Stanford autonomous car Junior - which became the Google car. Her algorithm was built to extract more signal from noisy sensors, and it's this that laid the foundation for Eonite, which she started with Helmy.

What makes Eonite's technology such a breakthrough, according to Helmy, is that it balances power consumption, cost and accuracy in a way that hasn't been done before. It also functions well in low-light conditions, as I discover when using it in a poorly lit hotel room.

As I lean down to peek at the underside of the coffee table in this virtual living room, I appreciate that this is the most precise inside-out tracking I've tried. Eonite's sensor provides sub-millimetre accuracy and six degrees of freedom (6DOF), and latency is very good compared to other inside-out systems. Aside from the odd judder, which I'm told is down to it dropping from 6DOF to 3DOF (something the team says won't be a problem in the finished product), the experience is mighty impressive.

There's potential for augmented reality here too, although there's an inherent latency in many of the camera passthroughs that would need to be solved first.

Eonite is now ready to show the world what it's been building, and Helmy tells me that in the first quarter of the year it will announce it's working with a partner to bring a headset to market. However, Eonite won't be making any systems of its own.

It's targeting tethered PC headsets first, but plans to move to mobile in the near future - and is even looking at the potential of this vision tech for drones and robots. Screw the robots - they can wait - VR needs this right now.