HDMI/DisplayPort - display specs confusion for HDR gaming

Regular

Until HDMI 2.1 is out, we have to live with the confusion of whether our HDTVs or PC monitors fully support baseline HDR gaming.

HDR 10bit/60hz/4:4:4/60fps/4K - this being the base target to hit.

There is a new Asus ProArt FALD monitor which I plan to buy for PC gaming, but it is only DisplayPort 1.2a... but why, Asus? You want $2K but don't use DisplayPort 1.4? Am I better off sticking with my OLED C7? (I am looking for desk-bound HDR PC gaming along with good ppi for desktop use!) I love my OLED, but not for long-term PC use!

I hear conflicting posts saying that some HDTVs can support 4:4:4/60 Hz/4K HDR through HDMI 2.0, which is only 18 Gbps...? How? Native downsampling behind our backs? DisplayPort 1.2a has 21.6 Gbps, so the Asus will work...?
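For what it's worth, the link budget can be sketched in a few lines. This is my own back-of-envelope math, not from any spec table, and the timing numbers (CTA 4K60 timing for HDMI, CVT reduced blanking for DP) are assumptions:

```python
# Back-of-envelope link budget (my own sketch; timings are assumptions).
# Both HDMI 2.0 TMDS and DP 1.2a HBR2 use 8b/10b coding, so the usable
# payload is roughly 80% of the raw link rate.
HDMI20_PAYLOAD = 18.0 * 0.8    # 14.4 Gbps
DP12A_PAYLOAD = 21.6 * 0.8     # 17.28 Gbps

def stream_gbps(pixel_clock_mhz, bits_per_component, chroma="4:4:4"):
    """Video data rate in Gbps; the pixel clock already includes blanking."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return pixel_clock_mhz * 1e6 * bits_per_component * samples_per_pixel / 1e9

# HDMI uses the CTA 4K60 timing (594 MHz pixel clock); a DP monitor can
# run a tighter CVT reduced-blanking timing (~533.25 MHz).
hdmi_need = stream_gbps(594.0, 10)    # 10-bit 4:4:4 over HDMI
dp_need = stream_gbps(533.25, 10)     # 10-bit 4:4:4 over DP

print(f"HDMI 2.0: need {hdmi_need:.2f} Gbps, have {HDMI20_PAYLOAD:.2f} -> "
      f"{'ok' if hdmi_need <= HDMI20_PAYLOAD else 'must subsample'}")
print(f"DP 1.2a : need {dp_need:.2f} Gbps, have {DP12A_PAYLOAD:.2f} -> "
      f"{'ok' if dp_need <= DP12A_PAYLOAD else 'must subsample'}")
```

With these assumptions, 4K60 10-bit 4:4:4 needs about 17.8 Gbps on HDMI 2.0 while only ~14.4 Gbps is usable, which would explain TVs quietly falling back to 4:2:2 or 4:2:0; the same stream at ~16.0 Gbps just squeezes into DP 1.2a's ~17.3 Gbps payload.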

Then there are now the even more expensive 27" Asus/Acer 144 Hz HDR FALD monitors, with DisplayPort 1.4, but using 8-bit panels?

Regular

Why exactly do you think DisplayPort 1.2a is a limit, and what does HDMI 2.1 have to do with HDR computer monitors at all?

The PA32UC, as well as the PA329Q and PA34V, use 60 Hz panels, so there is no need for the additional bandwidth provided by the HBR3 mode of DisplayPort 1.3+ or by HDMI 2.1. Their HDR capabilities (both DisplayHDR and FreeSync 2 HDR) are exposed through industry-standard EDID/DisplayID structures, not through a protocol-specific embedded message like an HDMI InfoFrame or DisplayPort 1.4 metadata.

A gaming monitor with a 100 Hz refresh rate does require at least DisplayPort 1.3 (HBR3 mode); further, 120 Hz would have to use 4:2:2 chroma subsampling, and 144 Hz would have to use 4:2:0.
That said, the 10-bit 4:2:0 format is what UHD Blu-ray releases use, so I'd think the practical impact of this subsampling would be minimal.

4:2:0 chroma subsampling is mostly fine for motion video, but it shows artifacts in general PC use. This is most easily and noticeably seen with text on a flat color, which is why it is undesirable in a PC monitor.
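A toy illustration of the effect (my own sketch, using a simplified luma/chroma split rather than the real scaled BT.601 transform): take one scanline with a 1-pixel-wide red stroke on white, share one chroma sample per pixel pair the way horizontal subsampling does, and reconstruct.

```python
# Toy demo of chroma subsampling smearing a 1-px red stroke on white.
# Simplified Y/Cb/Cr split (not the real scaled BT.601 transform).

def to_ycc(rgb):
    r, g, b = rgb
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma
    return y, b - y, r - y                  # (Y, Cb-like, Cr-like)

def to_rgb(ycc):
    y, cb, cr = ycc
    r, b = cr + y, cb + y
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return tuple(max(0.0, min(255.0, c)) for c in (r, g, b))

WHITE, RED = (255, 255, 255), (255, 0, 0)
scanline = [WHITE, RED, WHITE, WHITE]       # one thin red glyph stroke

ycc = [list(to_ycc(p)) for p in scanline]
# Keep luma per pixel, but average the chroma over each pair of pixels,
# as horizontal chroma subsampling does.
for i in range(0, len(ycc), 2):
    cb = (ycc[i][1] + ycc[i + 1][1]) / 2
    cr = (ycc[i][2] + ycc[i + 1][2]) / 2
    ycc[i][1:] = [cb, cr]
    ycc[i + 1][1:] = [cb, cr]

out = [tuple(round(c) for c in to_rgb(p)) for p in ycc]
print(out)  # [(255, 217, 217), (166, 38, 38), (255, 255, 255), (255, 255, 255)]
```

The lone red pixel washes out to (166, 38, 38) and its white neighbour picks up a pink fringe; on pixel-sharp text that fringing is exactly what people complain about, while in motion video it is largely masked.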

Games that feature a lot of on-screen text can also suffer from this to a greater or lesser degree, depending on how much text is displayed and how often the user needs to read it. That said, most AAA games are ported over from consoles, where 4:2:0 displays need to be accommodated, so they take that into consideration at the game level and design around it.

I'm leaning towards getting two of the Asus monitors personally (waiting for those consumer Volta cards, though...).

That Asus monitor does indeed look tasty, but $2000 is more than I can justify spending for such a small improvement over my current 1440p, 144Hz G-Sync display. Here's hoping prices for these kinds of displays get a lot more reasonable within a year or two.

RegularNewcomer

Your problem here is, unfortunately, Nvidia. There's a similar FreeSync 2 monitor out now for half the price, but Nvidia refuses to support FreeSync (it gets a cut of every G-Sync monitor sold).

Unfortunately, it doesn't look like this will change anytime soon, nor is any other HDR monitor with adaptive sync support likely on the horizon this year. Maybe you'll have better luck next year. :/

ModeratorVeteranAlphaSubscriber

The original poster's performance target was "HDR 10bit/60hz/4:4:4/60fps/4K"

I was just answering the "Then there are now the even more expensive 27" Asus/Acer 144Hz HDR FALD monitors, displayport 1.4, but using 8 bit panels?" part of his post. That was (slightly) incorrect: the panel only drops to 8-bit when the user (stupidly?) sets the refresh rate above 98 Hz. (I thought I did the math long ago and came to 96 Hz for DP 1.3/1.4, but reviews say 98 Hz, so I guess I can't do math.)
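For the record, here is that arithmetic redone. These are my assumptions (4 lanes of HBR3, 8b/10b coding, CVT-R2 reduced blanking); real monitors carry a bit more overhead, which is presumably why reviews land at 98 Hz rather than the ~99 Hz this gives:

```python
# Max 4K refresh at 10-bit 4:4:4 over DP 1.3/1.4 (assumed numbers).
payload = 4 * 8.1e9 * 0.8          # 4 lanes x HBR3, 8b/10b -> 25.92 Gbps
htotal, vtotal = 3920, 2222        # 3840x2160 active + CVT-R2 reduced blanking
bits_per_pixel = 3 * 10            # RGB / 4:4:4 at 10 bits per component

max_hz = payload / (htotal * vtotal * bits_per_pixel)
print(f"max refresh: {max_hz:.1f} Hz")  # just above 98 Hz with these numbers
```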

Which one is that? Genuinely interested on behalf of my Vega 56!

There isn't one. I'd still take those two monitors (sans FreeSync/G-Sync) over any other monitor currently on the market. My guess is that the reason it's $2K is not G-Sync but rather the panel (4K, DisplayHDR 1000, high refresh rate, etc.). But yeah, these monitors aren't cheap...

Veteran

You are partially right: HDR1000 is a rarity among PC monitors, and 4K at 144 Hz/120 Hz is also a rarity. The Asus is also a quantum-dot IPS with a full-array backlight (384 zones), so it's a high-quality panel already. However, the G-Sync module alone costs a significant $500! The module is built around an Altera FPGA and is equipped with 3 GB of DDR4 RAM, which increases the cost even further. https://www.techpowerup.com/245463/nvidia-g-sync-hdr-module-adds-usd-500-to-monitor-pricing

ModeratorVeteranAlphaSubscriber

Ah, interesting! Thank you for the information. That's actually borderline insane! Honestly, not even remotely close to being worth it. I'm really only in it for the 4K/DisplayHDR 1000/high refresh rate.

Legend

A bit bigger, and not as fast a refresh rate. But then, who can play modern games above 4K 60 Hz anyway, let alone 144 Hz?

That's not really comparable. I'm sure it's an amazing display, but yes, 144 Hz matters. Even if the most graphically intensive games have a hard time going above 60 fps at 4K, many older games have no problem whatsoever.

VeteranSubscriber

Thank you, but that is not a 120 Hz+ display, which is what I guess makes the super-expensive ones so special. I am looking for that combination (4K + 120 Hz), but apparently I'll have to wait just a little longer yet. HDR? Meh, I really don't care much about it.

Legend

HDR really is amazing as well (I have an HDR TV). In terms of quality, I'd definitely choose HDR over higher resolution (though not over a high refresh rate).

The only problem is that very few PC games take advantage of it these days. Eventually that will change, but for now it's one of those features that is rarely beneficial.

Hopefully by the time HDR support is common, monitors that let you have it all will have become much cheaper.

RegularNewcomer

Yup, me too! It's a shame that getting almost every high-end monitor feature at once currently means a G-Sync display commanding such a high price, but it is what it is.

It's VA and it's edge-lit, not really comparable to an IPS FALD, especially when it comes to HDR. Worse yet, its FreeSync doesn't support LFC. Lacking this feature makes it really overpriced at $1000.

Right now FALD is pure marketing BS for HDR. Human vision is roughly logarithmic in nature, on a log2 basis: 2 nits looks noticeably brighter than 1 nit, but 510 nits is barely distinguishable from 500 nits. When you combine this with screen reflectance, which is above 5% even on an excellent screen, the difference between a 3,000:1 contrast ratio and a 20,000:1 contrast ratio is practically nonexistent. Your minimum black level is going to be at least 5 nits even in a relatively dark room; work that out and you get a little over 7 levels, or "stops", of brightness from a 1000-nit display, less than what a 3,000:1 ratio gives you ideally, and obviously far less than 20,000:1. But you literally cannot see enough of the darks to tell the difference between the two displays unless you're in a totally blacked-out room.
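Here is the stops arithmetic spelled out (my own sketch; `reflected_nits` stands in for room light bouncing off the panel):

```python
# Usable dynamic range once reflected room light lifts the black floor.
from math import log2

def visible_stops(peak_nits, panel_contrast, reflected_nits):
    """Stops of visible dynamic range: log2(peak / effective black)."""
    native_black = peak_nits / panel_contrast
    return log2(peak_nits / (native_black + reflected_nits))

# ~5 nits of reflected light (5% reflectance in a moderately lit room),
# 1000-nit peak: the two contrast ratios land within a tenth of a stop.
for contrast in (3000, 20000):
    print(f"{contrast:5d}:1 -> {visible_stops(1000, contrast, 5.0):.2f} stops")
```

Both panels come out at roughly 7.5 to 7.6 visible stops under those room conditions, which is the "you can't tell them apart outside a blacked-out room" point in numbers.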

About Us

Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!