The design, the researchers claim in the paper, uses a white OLED with a color-filter structure for high-density pixelization, and an n‑type LTPS backplane for faster response times than mobile phone displays.

The researchers also developed a foveated pixel pipeline for the display which they say is “appropriate for virtual reality and augmented reality applications, especially mobile systems.” The researchers attached to the project are Google’s Carlin Vieri, Grace Lee, and Nikhil Balram, and LG’s Sang Hoon Jung, Joon Young Yang, Soo Young Yoon, and In Byeong Kang.

The paper maintains the human visual system (HVS) has a field of view (FOV) of approximately 160 degrees horizontal by 150 degrees vertical and an acuity of 60 pixels per degree (ppd), which translates to a resolution requirement of 9,600 × 9,000 pixels per eye. At least in the context of VR, a display with those exact specs would match a human’s natural ability to see.
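That 9,600 × 9,000 figure follows directly from multiplying acuity by field of view; a minimal sanity check:

```python
# Required per-eye pixel count to match the HVS figures quoted in the paper:
# acuity (pixels per degree) times field of view (degrees) on each axis.
ACUITY_PPD = 60          # HVS acuity from the paper
FOV_H, FOV_V = 160, 150  # degrees, horizontal x vertical

pixels_h = ACUITY_PPD * FOV_H   # 9,600
pixels_v = ACUITY_PPD * FOV_V   # 9,000
print(f"{pixels_h:,} x {pixels_v:,} pixels per eye")
```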

Take a look at the specs below for a comparison of the panel as built vs. the human visual system.

Human Visual System vs. Google/LG’s Display

| Specification | Upper bound | As built |
| --- | --- | --- |
| Pixel count (h × v) | 9,600 × 9,000 | 4,800 × 3,840 |
| Acuity (ppd) | 60 | 40 |
| Pixels per inch (ppi) | 2,183 | 1,443 |
| Pixel pitch (µm) | 11.6 | 17.6 |
| FoV (°, h × v) | 160 × 150 | 120 × 96 |
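The as-built figures are mutually consistent: acuity is the pixel count divided by the FoV, and ppi follows from the pixel pitch. A quick check using the values above:

```python
# As-built values from the comparison table.
pixels_h, fov_h = 4800, 120  # horizontal pixels and horizontal FoV (degrees)
pitch_um = 17.6              # pixel pitch in micrometers
UM_PER_INCH = 25400          # micrometers per inch

ppd = pixels_h / fov_h       # 40 ppd, as listed
ppi = UM_PER_INCH / pitch_um # ~1,443 ppi, as listed
print(round(ppd), round(ppi))
```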

To reduce motion blur, the 120 Hz OLED is said to support short-persistence illumination of up to 1.65 ms.
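That 1.65 ms persistence lines up with the 20% illumination duty cycle quoted in the full spec table further down; at 120 Hz, each frame lasts about 8.33 ms:

```python
refresh_hz = 120
frame_ms = 1000 / refresh_hz        # ~8.33 ms per frame at 120 Hz
persistence_ms = 1.65               # illumination time per frame

duty = persistence_ms / frame_ms    # 0.198, i.e. ~20% duty cycle
print(f"duty cycle: {duty:.1%}")
```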

To drive the display, the paper specifically mentions the need for foveated rendering, a technique that uses eye-tracking to position the highest-resolution image at the fovea, the most photoreceptor-dense region of the retina. “Foveated rendering and transport are critical elements for implementation of standalone VR HMDs using this 4.3′′ OLED display,” the researchers say.

While it’s difficult to convey a display’s acuity through a webpage viewed on a traditional monitor, which necessarily involves some image compression, the paper also includes a few pictures of the panel in action.

Without VR optic (bare panel) – Image courtesy Google & LG

The researchers also photographed the panel through a VR optic to approximate what you might see if it were implemented in a headset. No distortion correction was applied to the displayed image, although the quality is “very good with no visible screen door effect even when viewed through high quality optics in a wide FoV HMD,” the paper maintains.

With VR optic – Image courtesy Google & LG

Considering Google and LG say this specific panel configuration, which requires foveated rendering, is “especially ideal for mobile systems,” the future for mobile VR/AR may be very bright (and clear) indeed.

We expect to hear more about Google and LG’s display at today’s SID Display Week talk, which takes place on May 22nd from 11:10 AM to 12:30 PM PT. We’ll update this article accordingly. In the meantime, check out the specs below:

Google & LG New VR Display Specs

| Attribute | Value |
| --- | --- |
| Size (diagonal) | 4.3″ |
| Subpixel count | 3840 × 2 (either RG or BG) × 4800 |
| Pixel pitch | 17.6 µm (1443 ppi) |
| Brightness | 150 cd/m² @ 20% duty |
| Contrast | >15,000:1 |
| Color depth | 10 bits |
| Viewing angle | 30° (H), 15° (V) |
| Refresh rate | 120 Hz |


Wow. Now that’s a display! No tiny 110 FOV. This will be good for my gen 2 VR. I guess it will appear in the first VR HMD within 10 years?

Rogue Transfer

The actual built HMD has only a 120° × 96° per eye (see second column in table above).

Raphael

That’s crappy. I retract my enthusiastic statement.

Adrian Meredith

I hate to say I told you so….

Raphael

I need to read more words on these articles before replying. Currently reading only two lines of text. Will increase to three or four or maybe even six.

Raphael

You were right all along flappy. I shoulda listened.

Baldrickk

120 is still a little better than 110. At least it isn’t a step backwards on that front.

impurekind

Good stuff. Keep it coming. . . .

Lucidfeuer

“9,600 × 9,000 pixels per eye” is it per focus point or ambiguated per-FOV resolution (the total resolution of all added focus points in a user FOV?). This seems really low…

However the specs of their built, if implemented soon enough is finally a worthy upgrade for VR, especially if 120×96° is per-eye.

Johan Pruijs

jep.. I asked myself the same question. Is that FOV per eye??

cham

an eye need 60ppd, with 160° H per eye, you get 9600px H per eye.

Lucidfeuer

I remember hearing a lead engineer from Oculus saying you’d need “64K” for the total 2*180° H FOV

cham

i suppose because vision is more than this snellen chart. In the sky, you can see betelgeuse, angular diameter : 0.05″.
While the snellen chart give us 20/20 with 1 minute of arc(=60ppd) . Only 1200x larger than 0.05″ :)

Karol Gasiński

This paragraph is mixing different terms:
“To reduce motion blur, the 120Hz OLED is said to support short persistence illumination of up to 1.65 ms, which leaves plenty of room for addressing data considering comfortable VR viewing typically has an upper limit of 20ms, a.k.a. the max motion-to-photon latency commonly considered a requirement in modern VR headsets.”

Panel persistence is the time the panel stays lit during the frame. If it’s lit all the time, then a single frame is displayed for 1000/120 = 8.33 ms. If this one is lit for 1.65 ms, it means it’s lit for 1.65/8.33 ≈ 20% of the frame time (so it flashes and goes black, for global illumination panels). This is called low persistence, and those numbers need to stay below 4 ms on average. That’s the time after the panel starts to glow, but first the data needs to get to it, which will probably match its refresh frequency, meaning that data transfer will take 8.33 ms (the whole previous frame). As the application needs to render at the same frequency as the display (ideal case), the app rendering time will on average also take 8.33 ms. And before rendering, you need some time to encode rendering operations with an already-predicted pose. Let’s say you do it 2 ms earlier. This means that application-to-photon latency is 2 + 8.33 + 8.33 ≈ 19 ms.
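Karol’s latency arithmetic can be reproduced directly (the 2 ms encode lead is his stated assumption, not a figure from the paper):

```python
refresh_hz = 120
frame_ms = 1000 / refresh_hz  # ~8.33 ms: both data transfer and render budget
encode_lead_ms = 2.0          # assumed time to encode with a predicted pose

# application-to-photon latency: encode lead + data transfer + render time
latency_ms = encode_lead_ms + frame_ms + frame_ms
print(f"~{latency_ms:.1f} ms")  # ~18.7 ms, i.e. roughly 19 ms
```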

Hi Karol, thank you for the correction and explanation. I’ve removed that bit from the article until I can better summarize what this means for the display in regards to VR. Again, thanks for keeping an eye out, and reaching out with your keen input!

Paul Sutton

So how high end of a VR/Gaming machine will be needed to run these properly?

Karol Gasiński

From the paper it looks like this panel will in fact be used in a mobile VR headset (some DayDream equivalent of Oculus Go?). You can find hints in the linked paper:

“The MIPI DSI interface between the mobile SoC and FPGA was limited to 6 Gb/s (uncompressed), which implies a 250 MHz pixel clock at 24 bits/pixel. We settled on foveated transport pixel counts near 1280 × 1920/75 Hz to fit within this bandwidth limitation. ”

So the image will be rendered at ~1280×1920 (that’s probably the summed pixel count of the high- and low-resolution areas) per eye at 75 Hz and driven by a mobile SoC GPU. There will be a higher-resolution area in the center and a lower-resolution area outside. This lower-resolution area will be upscaled in the panel to its native resolution.
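The bandwidth arithmetic in the quoted passage checks out; the comparison against the panel’s native pixel rate is my own addition for scale:

```python
link_bps = 6e9                 # MIPI DSI limit quoted in the paper, bits/s
bits_per_pixel = 24
pixel_clock = link_bps / bits_per_pixel   # 250 MHz, as the paper states

foveated_rate = 1280 * 1920 * 75   # ~184 Mpixel/s: fits under the limit
native_rate = 3840 * 4800 * 120    # ~2.2 Gpixel/s: ~9x over the limit
print(f"{pixel_clock/1e6:.0f} MHz budget, "
      f"foveated {foveated_rate/1e6:.0f} Mpx/s, "
      f"native {native_rate/1e6:.0f} Mpx/s")
```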

Paul Sutton

BTW , I’ve owned a Vive for some time, so the next gen VR devices greatly interest me. Besides deciding when it’s time to unload the Vive for an upgrade, going wireless but with room scale is something I’m looking for in the next gen VR headset.

You’re saying this will be a standalone with no PC to run the headset? That would be great with those specs. But either they’ve figured out a way to reasonably obtain the power needed, or this is going to be very expensive.

Any idea of price point? Will this, and VR in general, be at E3 next month?

Baldrickk

Personally, I’m hoping that the LG Ultragear (when it finally arrives) will be boasting these displays. It would seem to be a good fit (LG screen in LG HMD) and a step ahead of the Vive Pro (would we be able to call it the first proper 2nd Gen HMD?) and driving a display like that (even with foveated rendering) would seem to be more in the PC space rather than the mobile space at this time.

Bruce Banner

From earlier articles, they mentioned fabricating their own custom high bandwidth IC driver, and that foveated driving logic for both VR and AR was implemented. LG also patented eye-tracking technology for their UltraGear. That could reduce GPU load by as much as 50%.

Rosie Haft

Are the photos taken from the same distance away from the eye? I like OLED displays but the optics don’t seem quite right in order to be used as a near eye display. This would help to know!

Jonathan Pratte

We are getting there!

Albert Hartman

Is it good enough to read text? How small?

Paul-Simon

It should be good enough to read text at moderate distances.
Our actual resolution is more similar to 16K^2, which is twice the angular resolution of this – but this is pretty close, so clarity will be pretty incredible.