A developer's perspective on immersive 3D computer graphics

Oculus Rift DK2’s tracking update rate

I’ve been involved in some arguments about the inner workings of the Oculus Rift’s and HTC/Valve Vive’s tracking systems recently, and while I don’t want to get into any of that right now, I just did a little experiment.

The tracking update rate of the Oculus Rift DK2, meaning the rate at which Oculus' tracking driver sends new position/orientation estimates to VR applications, is 1000 Hz. More precisely, updates arrive in pairs: the driver updates the position/orientation and then immediately updates it again, and each such pair arrives 2 ms after the previous one, i.e., 500 times per second.

This is not surprising at all, given my earlier observation that the DK2 samples its internal IMU at a rate of 1000 Hz, and sends data packets containing 2 IMU samples each to the host at a rate of 500 Hz. The tracking driver is then kind enough to process these samples individually, and pass updated tracking data to applications after it’s done processing each one. That second part is maybe a bit superfluous, but I’ll take it.
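To make that cadence concrete, here is a small simulation of it (my own sketch, not Oculus code): IMU samples are taken at 1000 Hz, grouped into 2-sample packets sent at 500 Hz, and the driver emits one tracking update per sample as each packet arrives.

```python
# Sketch: model the DK2's sample/packet cadence described above.
# Assumption (mine): a packet arrives the instant its second sample is
# taken; real USB transfer adds a small, roughly constant delay on top.

IMU_PERIOD = 0.001     # 1000 Hz sample clock
SAMPLES_PER_PACKET = 2

def update_times(num_packets):
    """Times at which the driver emits tracking updates.

    Each packet carries two samples; the driver processes both
    back-to-back when the packet arrives, so updates come in pairs.
    """
    times = []
    for p in range(num_packets):
        # The packet can't be sent before its second sample exists.
        arrival = (p * SAMPLES_PER_PACKET + 1) * IMU_PERIOD
        times.append(arrival)  # update from the packet's first sample
        times.append(arrival)  # update from its second, immediately after
    return times

ts = update_times(500)                        # one second's worth of packets
intervals = [b - a for a, b in zip(ts, ts[1:])]
print(len(ts))                                # 1000 updates per simulated second
print(round(max(intervals), 6))               # 2 ms gap between update pairs
```

The pattern of inter-update intervals alternates between (nearly) zero within a pair and 2 ms between pairs, which averages out to 1000 updates per second.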

Here is a (very short excerpt of a) dump from the test application I wrote:

The first column is the time interval between each row and the previous row, in seconds. The second through fourth columns are the reported (x, y, z) position of the headset.
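For anyone who wants to reproduce the measurement, a dump in that format can be summarized with a few lines of Python. This is my own helper, and the sample rows below are made up for illustration; the whitespace-separated column layout is an assumption based on the description above.

```python
def summarize_dump(lines):
    """Parse rows of 'interval x y z' and return basic interval statistics.

    Assumed format: column 1 is the time since the previous row in
    seconds, columns 2-4 are the reported headset position.
    """
    intervals = []
    for line in lines:
        fields = line.split()
        if len(fields) < 4:
            continue  # skip blank or malformed rows
        intervals.append(float(fields[0]))
    mean = sum(intervals) / len(intervals)
    return {"rows": len(intervals), "mean": mean, "max": max(intervals)}

# Invented example rows, shaped like the dump described in the post.
sample = [
    "0.001012 -0.0012 0.0431 -0.6120",
    "0.000998 -0.0011 0.0432 -0.6121",
    "0.001003 -0.0011 0.0432 -0.6121",
]
print(summarize_dump(sample))
```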

I hope this puts to rest the myth that the DK2 only updates its tracking data when it receives a new frame from the tracking camera, i.e., only 60 times per second, and confirms that the DK2's tracking is based on dead reckoning with drift correction. Now, while it is possible that the commercial version of the Rift does things differently, I don't see a reason why it would.
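The dead-reckoning-plus-drift-correction scheme can be sketched in one dimension. This is my own simplified illustration, not Oculus' filter: the real driver fuses full 6-DOF state and uses far more sophisticated blending than the proportional correction below.

```python
# 1-D sketch of dead reckoning with drift correction.
# Assumptions (mine, for illustration): a constant accelerometer bias is
# the only error source, and each camera frame removes a fixed fraction
# of the accumulated error; a real tracker would use a Kalman-style filter.

IMU_DT = 0.001        # 1000 Hz IMU integration step
CAMERA_DT = 1 / 60    # 60 Hz optical correction
GAIN = 0.5            # fraction of the error removed per camera frame

def track(seconds, accel_bias=0.02, correct=True):
    pos, vel = 0.0, 0.0          # true motion: headset sits still at x = 0
    next_frame = CAMERA_DT
    steps = int(seconds / IMU_DT)
    for i in range(1, steps + 1):
        t = i * IMU_DT
        vel += accel_bias * IMU_DT       # biased accelerometer drifts the state
        pos += vel * IMU_DT              # dead reckoning: integrate twice
        if correct and t >= next_frame:  # camera frame: pull the state back
            pos -= GAIN * pos            # toward the optical fix (x = 0)
            vel -= GAIN * vel
            next_frame += CAMERA_DT
    return pos

print(abs(track(2.0, correct=False)))  # uncorrected: drift grows quadratically
print(abs(track(2.0, correct=True)))   # corrected: error stays bounded and small
```

The point of the sketch is the division of labor: the IMU alone drives every 1 ms pose update, and the camera only reins in the slowly accumulating error, so pose updates never have to wait for a camera frame.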

PS: If you look closely, you'll notice an outlier in rows 15 and 17: the first interval is 3 ms, and the second is only 1 ms. One sample missed the 1000 Hz sample clock and was delivered on the next update.
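Spotting such hiccups in a longer trace is easy to automate. Here is a sketch of a detector (the nominal period, tolerance, and example trace are my own choices): a sample that misses its clock edge shows up as one overlong interval followed by a short one that together span two nominal periods.

```python
def find_late_samples(intervals, nominal, tol=0.25):
    """Flag intervals that overshoot the nominal period.

    Returns (index, interval) pairs for long intervals whose following
    short interval compensates, i.e., the pair spans two nominal
    periods. tol is the allowed relative deviation (my choice).
    """
    late = []
    for i, dt in enumerate(intervals):
        if dt > nominal * (1 + tol):
            nxt = intervals[i + 1] if i + 1 < len(intervals) else 0.0
            if abs((dt + nxt) - 2 * nominal) <= tol * nominal:
                late.append((i, dt))
    return late

# Intervals mimicking the outlier described above: one 3 ms gap followed
# by a compensating 1 ms gap in an otherwise steady 2 ms cadence.
trace = [0.002, 0.002, 0.003, 0.001, 0.002, 0.002]
print(find_late_samples(trace, nominal=0.002))
```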

7 thoughts on “Oculus Rift DK2’s tracking update rate”

That’s fascinating info. Given that the camera rate is 60 Hz, does that mean the minimum latency is still 16 ms? Not entirely sure what happens when a 2 ms signal is periodically corrected with a 16 ms one. I.e., when the frame correction comes in, it’s potentially 16 ms behind, no?

I talk about this in detail in the reddit thread. The effective latency after sensor fusion is 2ms worst-case, 1ms average case. In short, tracking is driven by the IMUs, and the camera is used to correct accumulated drift after the fact.
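The 2 ms worst-case / 1 ms average figure follows directly from the delivery cadence: if a fresh update pair lands every 2 ms, an application reading the pose at an arbitrary moment sees data at most 2 ms old, and 1 ms old on average. A quick deterministic check of that arithmetic (a toy model of my own that ignores USB transfer and filter processing time):

```python
# Toy model: update pairs land every 2 ms; an app reads the pose at many
# evenly spaced times and we measure the age of the newest update.
# Assumption (mine): USB and processing delays are negligible.

PAIR_PERIOD = 0.002   # one update pair every 2 ms

def pose_age(read_time):
    """Age of the most recent update at read_time (updates at 0, 2 ms, ...)."""
    return read_time % PAIR_PERIOD

# Sample read times densely and deterministically over one second.
ages = [pose_age(i * 0.00001) for i in range(100000)]
print(max(ages))              # approaches the 2 ms worst case
print(sum(ages) / len(ages))  # close to the 1 ms average
```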

That’s what I would love to know. Given that so much is publicly known about these tracking systems (a regular 60 fps webcam + IMU + extended Kalman filter), why are there no cheap, open-source third-party solutions for delivering 6-DOF data to mobile platforms?

I understand that part of the ultimate goal of mobile is being untethered, but I for one would jump at a solution with a single beacon (IR camera or LEDs) broadcasting low-latency position data over BLE to a Cardboard or Gear VR headset.

@okreylos Why isn’t that a thing? And if not, why aren’t you launching a kickstarter for it? 😉