NVIDIA Demonstrates Experimental 16,000Hz AR Display

Virtual reality surely has high performance requirements to maintain a comfortable experience, but immersive AR may ultimately demand even greater performance if we’re to achieve the theoretical ideal.

Virtual reality can get away with comfortable, immersive performance at latencies under 20ms. Augmented reality, however, sets its digital information against the backdrop of the real-world view, which the eye perceives with effectively zero latency; keeping the real and digital worlds in very close sync therefore demands much higher-frequency output.

To that end, NVIDIA has demonstrated a prototype AR display running at a whopping 16,000Hz compared to a more conventional AR display running at 60Hz (for reference, high end VR headsets today run at 90Hz/120Hz). The video headlining this article details the rig used to achieve these results.
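For a sense of scale, the refresh rates above imply the following per-frame periods (a quick back-of-the-envelope sketch; actual motion-to-photon latency also depends on tracking and rendering time, not just refresh rate):

```python
# Frame period implied by each refresh rate mentioned above.
# Note: the refresh period is only a lower bound on motion-to-photon latency.
for hz in (60, 90, 120, 16_000):
    print(f"{hz:>6} Hz -> {1000.0 / hz:.4f} ms per frame")
```

At 60Hz a frame persists for roughly 16.7ms, while at 16,000Hz each frame lasts only 62.5 microseconds.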

The demo overlays a grid of digital white boxes onto a real checkerboard pattern. With the contemporary display running at 60Hz, you can see that as the camera moves (representing the movement of a user’s head), the digital information struggles to stay ‘locked’ to the real-world object: the latest image rendered to the screen drifts significantly off of the real-world object before the next image is rendered.

Meanwhile, the 16,000Hz display updates so quickly that, despite the movement of the camera, the image appears to stay almost perfectly locked onto the real-world grid.

Presenting the work—a collaboration between UNC-Chapel Hill, NVIDIA Research, and InnerOptic Technology—at GTC 2017 this week, NVIDIA’s Morgan McGuire says that the motion-to-photon latency in the 16,000Hz system is 0.08ms, or 80 microseconds, which is more than 100 times faster than today’s high-end VR headsets.

One very interesting piece of this work is that for both the 60Hz display and the 16,000Hz display, the source input remains just 60Hz. That means that the researchers have been able to achieve this ultra-low latency interpolation without requiring that the source input achieve 16,000FPS to match the 16,000Hz display, which would be computationally impractical for real world scenarios.

We describe an augmented reality, optical see-through display based on a DMD chip with an extremely fast (16 kHz) binary update rate. We combine the techniques of post-rendering 2-D offsets and just-in-time tracking updates with a novel modulation technique for turning binary pixels into perceived gray scale. These processing elements, implemented in an FPGA, are physically mounted along with the optical display elements in a head tracked rig through which users view synthetic imagery superimposed on their real environment. The combination of mechanical tracking at near-zero latency with reconfigurable display processing has given us a measured average of 80 μs of end-to-end latency (from head motion to change in photons from the display) and also a versatile test platform for extremely-low-latency display systems. We have used it to examine the trade-offs between image quality and cost (i.e. power and logical complexity) and have found that quality can be maintained with a fairly simple display modulation scheme.
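The “novel modulation technique for turning binary pixels into perceived gray scale” mentioned in the abstract can be illustrated with a toy model. The sketch below is my own assumption of a first-order sigma-delta (error-feedback) modulator, not the paper’s actual scheme: each DMD mirror is only ever on or off, so a gray level is produced by emitting a fast stream of binary frames whose time-average (which the eye integrates) approaches the target value.

```python
def binary_modulate(gray, n_subframes):
    """Emit n_subframes on/off bits whose running average tracks `gray` (0..1).

    First-order sigma-delta: accumulate the target each step and emit a 1
    whenever the accumulated error crosses the threshold.
    """
    bits = []
    err = 0.0  # accumulated quantization error
    for _ in range(n_subframes):
        err += gray
        if err >= 0.5:
            bits.append(1)
            err -= 1.0
        else:
            bits.append(0)
    return bits

# At 16 kHz there are ~266 binary subframes per 60 Hz source frame.
bits = binary_modulate(0.25, 266)
perceived = sum(bits) / len(bits)  # close to the 0.25 target
```

Because the error term is bounded, the average of the emitted bits stays within a fraction of one bit of the target over any window, which is why a simple modulation scheme can maintain image quality.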

Now, 16,000Hz may represent close to the theoretical ideal for AR, but we can certainly still have a comfortable AR experience with less. HoloLens, widely considered the best AR headset in production today, runs at just 60Hz. And while its visual performance isn’t perfect, it’s certainly passable. As the form-factor of AR headsets improves, users will move their heads faster and expect their AR glasses to keep up, so we hope to see display performance continue to increase over time.

Road to VR is a proud media sponsor of GTC 2017


Still kinda confused that the refresh rate is so high and yet they’re only outputting 60fps…

psuedonymous

It’s effectively the same idea as timewarp, or forward frame extrapolation in general: you take the same source image and shift & warp it based on the (known) change in viewpoint in order to get a ‘new’ view-correct frame without rendering a new unique frame.
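The shift-and-warp idea described above can be sketched in a few lines. Everything here (function names, the pixels-per-radian figure, an integer 2-D shift instead of a full homography warp) is a simplified assumption for illustration, not the actual implementation:

```python
def warp_offset(yaw_delta_rad, pitch_delta_rad, px_per_rad=1000.0):
    """Turn head rotation since render time into a 2-D pixel offset."""
    dx = -yaw_delta_rad * px_per_rad   # yaw right -> image shifts left
    dy = pitch_delta_rad * px_per_rad
    return dx, dy

def shift_frame(frame, dx, dy):
    """Shift a frame (list of rows) by integer (dx, dy), filling exposed edges with 0."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx, sy = x - int(round(dx)), y - int(round(dy))
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = frame[sy][sx]
    return out

# Re-present the last rendered frame, corrected for the newest tracking
# sample, instead of waiting for a freshly rendered one.
last_frame = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
dx, dy = warp_offset(-0.001, 0.0)  # small yaw to the left since render time
corrected = shift_frame(last_frame, dx, dy)
```

The key point is that the correction uses only the latest tracking sample plus the already-rendered image, so it can run far faster than the renderer itself.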

Firestorm185

Oh, ok. So it’s just like, 15,940 synthetic frames a second?

Tadd Seiff

Is this analogous to what is called “reprojection” in VR?

Notably, reprojection is typically used to make up for a poor frame rate or dropped frames, whereas this is used to enhance an existing frame rate. But in both cases, frames are dynamically reprojected to increase the rate.

Just to clear things up for those who don’t understand it. This display is not running at 16,000Hz! It is running at 60Hz (aka 60 FPS), and the system they use to align the projected image is calculated at 16kHz, or 16,000 times per second, for greater accuracy. This by no means makes the display output faster, but the content it’s rendering is more accurate to the motion. This is, as mentioned before, very much like Oculus’ Time Warp feature, which ‘fakes’ frames to smooth the motion out.

Mermado 1936

We only need 1440p or 4k at 90 Hz… we don’t ask too much…

Nairobi

Cool Ill go grab my 16,000 Titans.

Jack H

The display system described in the linked paper is a true 16kHz display. It uses a Digital Micromirror Device (DMD) from Texas Instruments, which is essentially a lot of tiny swivelling mirrors that have only on and off positions. They are normally driven by on/off signals with binary-weighted durations (1/256, 1/128, 1/64, etc.), but here the chip is driven in pure binary mode. The DMD variants used for things like SLA 3D printers use this mode.
The graphics system in this device has a routine which constantly tries to catch up to the latest position by averaging out the previous binary frame brightnesses. It’s different from stereo warp or re-projection.
There has yet to be a demonstration of colour video, but I don’t expect the rainbow effect to arise.