VESA establishes world's first open standard for HDR displays

HDR is a term increasingly causing confusion amongst both photographers and the masses. 'Isn't it that thing that makes my images look flat and less contrasty by including all the shadows and highlights in my final image?' many of our friends and forum members ask.

Well, yes, if it's not done right. But when it comes to displays, the ironic thing is that 'HDR' is meant to make imagery look less flat, by taking the wide dynamic range encompassed in HDR images and stretching it back out on the display to no longer look flat but, instead, encompass nearly as much punch as the scene had in the real world.

Whenever a new display technology comes along, and particularly when it falls into that gap before it's well defined or understood, monitor manufacturers LOVE to throw the spec all over their products. That, in a nutshell, is what has happened with the 'HDR' moniker and computer displays, making it very difficult for someone to know what is and isn't a "real" HDR monitor.

What kind of brightness and contrast ratio should you be looking for? What's the actual static contrast ratio, not the stupidly high (and irrelevant) dynamic contrast ratio often quoted? What kind of color output should you expect out of an HDR monitor? And what the heck is local dimming?

These are the questions that manufacturers tend to not answer, at least for now, and it's why VESA has created the world's first open standard for HDR displays: DisplayHDR.

Targeted largely at LCD-based computer monitors (not OLED), the purpose of DisplayHDR is to establish an open standard with fully transparent testing methodology, so you can "rate" your display and see where it falls on the HDR scales. Is it really just an SDR monitor, or does it rank as DisplayHDR-400 (low-tier), DisplayHDR-600 (mid-tier), or DisplayHDR-1000 (top-tier)?

Here's how those tiers break down, and the key performance metrics they have to hit:

Tier            | Peak brightness | Max black level | Encoding | Color gamut
DisplayHDR-400  | 400 nits        | 0.4 nits        | 10-bit   | 95% sRGB
DisplayHDR-600  | 600 nits        | 0.1 nits        | 10-bit   | 99% sRGB, 90% DCI-P3
DisplayHDR-1000 | 1000 nits       | 0.05 nits       | 10-bit   | 99% sRGB, 90% DCI-P3

'Corner Maximum Limit' aims to ensure that local dimming implementations can effectively keep black levels low even when small, non-central portions of the screen are illuminated brightly. 'Tunnel Maximum Limit' ensures good overall contrast with varied content all over the screen but with nothing hitting pure white. Many of these targets cannot be met without some sort of local dimming capability, which most computer displays don't have. Consider these targets a 'push' to get manufacturers to embrace the future of HDR displays.

Up until now, there was no open standard for HDR displays. The closest thing we had was the UHD Alliance Premium Standard, which is essentially just a stamp that you'll see on TVs, Blu-ray players, discs, and the like, ensuring your device hits 4K resolution, BT.2020 color space, 10-bit encoding, and a few key contrast and brightness specs. But unlike the VESA standard, there's no gradation: you either have the UHD Alliance Premium Standard badge or you don't.

VESA's standard, on the other hand, aims to grade LCD-based computer monitors and grading displays. It establishes tiers that manufacturers can shoot for when designing computer monitors. And since most if not all of these manufacturers are members of VESA, they have access to the documentation outlining the specifications and testing methodologies.

The hope is that the standard becomes widely accepted. That way, you can look for the VESA badge on your next monitor purchase to make sure the manufacturer isn't just throwing the term "HDR" onto an IPS monitor that can only hit 350 nits brightness and a 1000:1 static contrast ratio (specs typical of many otherwise highly-rated IPS monitors aimed at photographers from manufacturers like Dell, BenQ and Eizo).

A DisplayHDR-400 rated display would be guaranteed to hit a peak brightness of 400 nits, a black level of no more than 0.4 nits for a largely black scene (or 0.1 nits for a more varied scene only hitting 50% white at any point), 10-bit encoding, and 95% sRGB coverage. VESA considers this the "first genuine entry point for HDR." Funnily enough, the otherwise excellent IPS displays many photographers choose might hit this standard, but we'd argue you shouldn't consider such a display 'HDR'. In other words, we here at DPReview don't really consider monitors with the 'DisplayHDR 400' badge truly 'HDR'. Grading or processing your images on these displays isn't going to guarantee your images will look right on future, truly 'HDR' displays.

A DisplayHDR-600 rated display would be guaranteed to hit a peak brightness of 600 nits, a black level of no more than 0.1 nits, 10-bit encoding, 99% sRGB, and at least 90% DCI-P3 coverage. These specs, according to VESA, describe "professional/enthusiast-level laptops and high-performance monitors." This rating, in our opinion, is far more stringent and a better indicator of a truly 'HDR' display. If you want your images and video to be future-proof, pick a display rated no lower than this.

Finally, a DisplayHDR-1000 rated display would guarantee a peak brightness of 1000 nits, a black level of no more than 0.05 nits, 10-bit encoding, 99% sRGB, and at least 90% DCI-P3 coverage. This final tier describes "professional/enthusiast/content-creator PC monitors." Monitors with the DisplayHDR-1000 badge will be far more representative of the displays of the future, so this is the stamp of approval we'd look for were we grading video or photos meant to look good on those displays - and the badge you'll want to look for when shopping for monitors.
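The three tiers above boil down to a small set of thresholds. As a rough sketch, using only the headline figures quoted in this article (the real Compliance Test Specification involves many more tests, patterns and durations), rating a monitor could look like this:

```python
# Illustrative sketch only: tier numbers are taken from this article's
# summary, not a full implementation of VESA's DisplayHDR CTS.

TIERS = [
    # (name, min peak nits, max black nits, min sRGB %, min DCI-P3 %)
    ("DisplayHDR-1000", 1000, 0.05, 99, 90),
    ("DisplayHDR-600",   600, 0.10, 99, 90),
    ("DisplayHDR-400",   400, 0.40, 95,  0),   # no P3 requirement at this tier
]

def rate(peak, black, srgb, p3):
    """Return the highest tier whose headline targets the monitor meets."""
    for name, min_peak, max_black, min_srgb, min_p3 in TIERS:
        if peak >= min_peak and black <= max_black \
                and srgb >= min_srgb and p3 >= min_p3:
            return name
    return "SDR"

# A typical photo-editing IPS panel from the article: 350 nits, 1000:1 contrast
print(rate(peak=350, black=0.35, srgb=99, p3=75))   # -> SDR
print(rate(peak=650, black=0.08, srgb=99, p3=92))   # -> DisplayHDR-600
```

Note how the 350-nit IPS monitor many photographers own doesn't even clear the lowest tier, which is exactly the point of the badge.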

These new standards are also more stringent about color gamut coverage: the 600 and 1000 tiers require what we'd call 'wide gamut' color coverage, capable of displaying colors well outside the old (can we say 'boring') sRGB standard of yesteryear. That means they can display colors well outside the gamut of old photochemical printing devices, so you can edit far more saturated and interesting colors into your image that will be displayed by monitors and printers of the future (and, increasingly, of today).

Furthermore, these new standards set stringent requirements on bit depth: every tier requires 10-bit color reproduction, though 8-bit panels with 2-bit temporal dithering are allowed. (Many monitors of the past would only reach 8-bit via 6-bit panels with 2-bit dithering: a big no-no for HDR content, whose wider range of luminances and colors would band or posterize on a 6-bit panel.)
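The banding argument is simple arithmetic: the number of distinct tonal levels doubles with every extra bit, so spreading a wide luminance range over a 6-bit panel leaves very coarse steps. A quick sketch (the linear spread below is an illustration; real HDR encodings use a perceptual curve, but the proportions are telling):

```python
# How many tonal levels a panel can show, and how coarse the steps get
# if a 0-1000 nit range were spread linearly across them.
for bits in (6, 8, 10):
    levels = 2 ** bits
    step = 1000 / (levels - 1)
    print(f"{bits}-bit: {levels:4d} levels, ~{step:.1f} nits per linear step")
```

A 6-bit panel gets only 64 levels (steps of roughly 16 nits over that range), which is why its banding becomes obvious on smooth HDR gradients, while 10-bit's 1024 levels keep steps near 1 nit.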

To learn more about the new VESA standards, head over to the DisplayHDR website. There, you'll find a simple breakdown of what constitutes an HDR display, why the standard was set up, and a link to download the DisplayHDR CTS (Compliance Test Specification) for free.

Comments

Specifying response time in frames is pretty much meaningless unless the frame rate is mentioned, which I haven't seen anywhere. 30? 48? 60? 96? 120? Why not use milliseconds, since hopefully the screen response time is not tied to the media being played back?

"displaying colors well outside of the old (can we say 'boring') sRGB standard of yesteryear."

sRGB is still the current standard. What is boring is your silly, geeky excitement with each "flavor of the month" tech - 3D, VR, 8K UHD, HDR-TV etc.

"That means they can display colors well outside of old photochemical printing devices, so you can edit far more saturated and interesting colors"

You mean even more lurid colors than those in the landscapes displayed on the monitors above? Yuck!

I just bought a 65" LG C7 OLED... and I can tell you the image quality with good HDR material is much more than a 'flavour of the month'. Bar none, it is the best image quality I have ever seen on a monitor.

These "ratings" are misleading and don't target professional photographers.

There are 2 huge problems with this "standard":
- Variable luminosity where it should be stable
- Dynamic lighting (per zone) with false contrast that is not identical across the whole surface of the screen

I wonder if we will need to deactivate dynamic lighting to be able to calibrate these "HDR" displays, and keep it that way to have an honest image (one you could trust).

And OLED has problems of its own: even if we don't consider marking (ouch!), there's a significant loss in the number of colours/greys when you dim it, meaning in many cases you may be working with just 6 bits or 5 bits per channel instead of 8 bits or 10 bits (holy grail!).

The black level should be the foundation of a solid future HDR, not tear-jerking levels of brightness (initially designed for 4" sunlit phone LCDs).

HiFi audio builds on true silence, as a low noise floor practically matters more than ear-splitting loudness (who listens at home to a 120dB IX-th finale...?). Ditto HDR: wide color and HDR contrast should build upon a true black floor, as achievable for 20+ yrs by OLED and partly by xVA panels.

By obsessing over nut (ahem, nit) brightness as our main/single control knob, OLED and even xVA are excluded - albeit both are vastly superior in what matters practically.

It's much easier/cheaper to pump the (Q)LED brightness to insane, read-on-the-beach nit levels.

Net: How practical and future-proof is an IPS/LCD-based DisplayHDR in AD2018?

Most pro LCD screens (10-bit, 100+% AdobeRGB) remain low-contrast, high-black-level IPS; they are calibrated around 100-150 nits (or lower), out of a max of ca. 300 nits. True black matters more.

Black levels matter only in a perfectly dark environment. In a lit room the deepest black is always determined by the reflection of ambient light (it doesn't matter if the monitor is glossy or matte). So unless we have a major breakthrough in anti-reflective coatings for monitors, the only way to increase the contrast is by increasing the brightness. Besides that, a major jump in brightness is also desirable. If you want to keep the average brightness of 150 nits for a subject, monitors should be able to display the sun or something else in the frame at 1000 nits brightness. Otherwise you would have to lower the subject brightness in order to achieve the high contrast against the sun in the frame, and the image would just look underexposed.

"Black levels matter only in a perfectly dark environment" : Music is listened to in malls, offices, living rooms and streets - not only in studios and recital halls. And very rarely near jet engines (requiring a 140dB dynamic range).

Black levels matter everywhere, particularly in average (i.e., real-life) scenarios. How many of us are doing their PPS or watching movies on the beach - and how often (a mobile scenario)?

In my lab, the 100s of screens I see are used on average 10-14 hrs/day at way under 100 nits - except for a few particularly bright desks or folks with vision issues. The vast majority of my time is in front of screens calibrated at 40-70 nits... at 150 I can barely stand a few tens of minutes.

Confusion continued! HDR *is* about max brightness, not contrast. Contrast is a must, correct, but that's not the point. The human eye has very low static dynamic range capability, about 6 bits. So, for an image to be HDR, it must contain patches too bright to be looked at without eye adaptation. Therefore, the actual cinematic HDR standards refer to absolute brightness levels, not contrast ratios. Many photographers get this wrong, as it is outside our current box of thinking.

@migus First you need to differentiate: do you calibrate your screen for print brightness or for working brightness? A workplace should be at roughly 750 lux, and screens in this environment somewhere between 140 and 200 nits.

Second, people often see these high numbers and get confused. For example, the white background behind text or on websites will still be 140 nits; it is not as if all of your content will be that bright. In fact, if you display SDR content on an HDR screen, the content should still be only 140 nits in the brightest part. The 1000 nits are for very small objects on screen, like the sun in a photo, or stars on a black background. HDR will force your eye to adapt to different parts of the scene, the same as in real light. For example, if you have a portrait with the sun in the background, you should not be able to look directly into the sun. All this for more immersion. Contrast ratio is a requirement but not the goal of HDR.

PS: Your music example hits the nail on the head. If you listen to music in a mall, the noise floor doesn't matter all that much, because you can't hear it over the ambient noise. That's why 0.0005 nits vs 0.05 nits doesn't matter if your reflection is already 3 nits.
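The commenter's point about reflections can be put in numbers. A common way to model it (the figures below are illustrative, taken from the comment, not measurements) is to add the reflected ambient luminance to both the white and the black level before taking the ratio:

```python
# Sketch: how reflected room light swamps the panel's native black level.
def effective_contrast(white_nits, black_nits, reflected_nits):
    """Contrast actually seen once reflected ambient light is added in."""
    return (white_nits + reflected_nits) / (black_nits + reflected_nits)

# 600-nit peak, 3 nits of reflected room light (the comment's figure):
oled = effective_contrast(600, 0.0005, 3)   # superb native black
lcd  = effective_contrast(600, 0.05, 3)     # ordinary LCD black
print(round(oled), round(lcd))  # -> 201 198
```

With 3 nits of reflection, a 100x better native black buys almost nothing: both panels land around 200:1 in that room, which is the commenter's argument in a nutshell.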

Think of HDR as another marketing BS term to push the latest products onto customers. No TV or monitor will magically make existing content suddenly look great. Maybe some games, but that's it. I'd honestly prefer it if OLED tech were finally sorted out and brought to the masses. It's taking way too long.

Good to see some benchmarking of monitors in regards to dynamic range. Maybe there could be a VESA-certified star rating on monitors, or a simple tick-box matrix rating, so the potential buyer has a quick guide to what they are getting. For quite a while in my profession I have had to sift through many a spec sheet for medical imaging monitors. Claims of 1000cd/m2 were made, but the monitors had to run at 600cd/m2, or even 400 if used all day. Refresh rates were important as we sometimes use dynamic imaging at up to 60fps, which really puts a monitor to the test. I assume refresh rates are becoming more important due to video, time-lapse... Bit depth is another story, but there's not enough space here to expand on it. So I guess what I am suggesting is a rating on the monitor using the VESA standard which might equate to your needs and use. Something I would like to see in the medical imaging profession. Good article DPR!

Interesting in that 'standards' could help cut through variations in assorted marketing 'emphasis' ;-)

However, given the longstanding recommendation to keep screen brightnesses well down if editing for print, I guess we should brace for the inevitable increase in "Why are my prints too dark" posts ;-)

I'm lucky enough to test high-end printers for Canon and Epson, and making great prints doesn't benefit from super-bright monitors - wider gamuts than sRGB, for sure. Print technologies massively expanding the dynamic range in print are not imminent.

As a working photographer not using video, screens having such brightness capabilities are of minimal use (for now). That may change but as it stands I'm minded to regard the new standards as being more relevant to general PC users and gamers than photographers

My other lingering concern is that such standards make it easier to give spurious performance ratings to monitors that have no real relevance for many photographers.

Having just read the article, I looked up the numbers for my brand new, budget, 4K I.P.S. monitor which arrived just a few days ago. I was stunned to see the manufacturers only claim 300 nits as I have the brightness turned down to 59% to make it usable. I can't imagine how you'd even use 600 nits, let alone 1,000.

Looking at test charts, the colour accuracy/contrast straight out of the box is so good I haven't bothered calibrating it yet! Things are definitely getting better in the world of monitors.

Not every photo is edited with print in mind! If you’re taking a photo from the screen to a print, you already need to take that into account. What’s interesting to me is that we seem to be entering an era where screens are becoming the display formats of choice, rather than prints. When dynamic range and color gamut of electronic displays exceed those of ink on paper, we enter a realm where we can see ourselves saying, “Yes, well, you really have to see it on a screen to experience it as the artist really intended.”

That's the thing - people are now obsessing over brightness and keep forgetting that the best phones have ~600 nits at 100% APL because they are supposed to be used outside. If you crank the brightness all the way up in a room, it's ridiculously bright.

Yeah, max brightness matters more for outdoor displays than anything else. If you’re editing photos in a studio at home it’s not nearly so important. As any photographer knows, it’s way brighter outside than in even if you can see perfectly well.

Here again, people are confused about the standard. 10/12-bit HDR means that the brightest content displays 4x/16x brighter than normal white. For a display dimmed to 50% of 300 nits for SDR content, that's 600/2400 nits usable for explosion, lightning flash or specular highlight effects. An entire scene does NOT become brighter with an HDR monitor!

Of course, this requires double encoding (HDR/SDR) of movies and photos to display right. Another thing not normally done, so we are still at an infancy stage.
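Taking the comment's simplified model at face value - extra bits used purely as highlight headroom, x4 for 10-bit and x16 for 12-bit over normal white; this is the commenter's back-of-envelope reasoning, not the actual PQ/HDR10 transfer math - the arithmetic works out as claimed:

```python
# Commenter's model: SDR "normal white" at 50% of a 300-nit display,
# with 10-/12-bit headroom reserved for speculars above that.
sdr_white = 0.5 * 300          # 150 nits of "normal white"
print(sdr_white * 4)           # 10-bit headroom -> 600.0 nits
print(sdr_white * 16)          # 12-bit headroom -> 2400.0 nits
```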

All good points, but I should repeat that my comments were primarily aimed at 'normal' photographic editing and printing, where 120 cd/m2 would be bright, and as pointed out, the quality of black makes a more significant difference.

As yet (emphasis deliberate) HDR display has no place in my day to day work as a working photographer. When it does (and makes sense from a business POV) I'll be ready ;-)

As mentioned already, that's HDR for display, which is a very different thing to what is more commonly meant when people talk of HDR here on DPR. I am not planning to follow the tawdry side of the force just yet ;-)

Keith, as you pointed out, we are talking about two entirely different worlds here. Classic photography copes with 6-bit DR prints and poor blacks. HDR monitors try to come closer to nature and to force the eye to use dynamic brightness adaptation, i.e. to provide a more immersive experience.

Ultimately, 360-3D HDR immersion isn't photography anymore, even with still images, because an image composition can't be experienced as art anymore. Well, maybe as art, yes, but as an art installation rather than photography.

Therefore, it is to be expected that HDR will have completely separate meanings in the two worlds.

My favorite book discussing the implications of Mandelbrot's research (and similar science and math), and the place of shape, complexity and structure - physical, biological and matho-imaginary-geometrical - in, well... life, the universe and everything (thanks, Douglas Adams), is "Chaos" by James Gleick. It should be on the reading list for the human race.

I guess this is a good thing. But, I have to wonder why there are so many "standards" these days. Just too many different formats to do same thing. Why does blu-ray need 3 different codecs for video and even more options for audio? In the future it seems there will be a lot of different color spaces and HDR implementations possible. Who will benefit from this? Why not standardize just one or two possibilities and stick with it for 10 years or so.

Yes, but it’s just an extension of an age-old problem. Two images will only look exactly the same if they’re viewed on screens with identical specifications, settings, calibration, and external lighting. Since most screens aren’t even calibrated to begin with, in practice images look different from display to display. Having a high-end, well-calibrated display to edit your photos on just means you have a good starting point for a best-case viewing scenario.

Even if you’re making prints, the lighting under which they’re viewed can change the look of the photo significantly. It’s really an intractable problem unless you’re in a gallery setting or something and have full control over all variables.

I'm talking about color - inkjet printers today already can print colors well outside of sRGB. So having a P3 gamut display is already useful - you can edit in colors that will show up in the print, that you wouldn't have edited in on a standard sRGB monitor.

Rishi, sure but sRGB was never a print standard. Please let me know (in pure terms of printing) if you really think HDR monitors will make people (or indeed clients!) more or less happy with what can be printed onto flat, bendy or cardish things.

Agreed, but the requirements for OLED might be quite different. In terms of contrast, color reproduction and bit depth it would not be a challenge at all; making an OLED screen that could not match them would be hard.

Only the peak brightness would be challenging, but if you are using them in dim light they may still look more 'HDR' than an LCD screen anyway.

As I understand it, Otto, an OLED screen wouldn't be able to hit the sustained brightness specs required by VESA (at 100% APL) without literally melting the glass on the screen, which is why this is an LCD-only standard.

IIRC LG's old panels were ~400 nits peak and ~130 at 100% APL. New models claim up to 1000 nits peak, so likely ~300 at most at 100%. That still isn't good enough for this standard, but I would rather be able to actually light any pattern at lower brightness than have 1000 nits with only 100-odd zones on an 8MP display (so no stars or black lines, etc). YMMV

@Roland that's the weirder part - currently manufacturers are using QD only for better light source behind the LCD, but longer term plan is to use it instead of LCD layer. How this would apply then is anybody's guess...

"Plus, there's some really exciting LCD tech (quantum dots, nanoIPS) with insane static contrast ratios coming down the pike."

Please don't spread false information. Quantum dots and Nano IPS as they exist now don't do anything on their own for contrast ratios. IPS still hasn't reached 2000:1.

In the larger display market (around 20"+ panels), TVs are at least a year ahead of computer monitors in terms of technology, so we can just look there for the latest buzzwords, since there are already reviews. Neither Samsung (QLED) nor LG (nanoIPS), who created these marketing terms, even advertises them as improving contrast ratios, so there's no reason for consumers to think that.

Self/electro-emissive QD-LED, which hasn't been implemented in any device yet, would improve contrast, since it would be more similar to OLED in terms of how each individual pixel can be turned on and off. Current quantum dot tech produces more accurate colors, increases gamut, and is more power efficient than older LED backlighting (it uses blue LED backlighting rather than white, plus some more stuff). Quantum dots + VA does have a high contrast ratio, but that's due to the VA panel rather than the quantum dots. VA easily gets over 3000:1 static contrast, and we've seen over 5000:1 recently on TVs, but this is largely attributable to the increased brightness rather than the use of QLED. And did you know that Samsung even decided to put QLED on some of their TN monitors, which have even worse static contrast by design?

NanoIPS, as LG calls it (another marketing term), is probably the same as Nano Cell IPS on their TVs; it's just so new that it hasn't been tested yet. Take a look at the marketing for Nano Cell IPS and even LG says it's just a derivative of quantum dots. In case you didn't know already, quantum dots are nanometer-sized semiconductor particles, so that's probably where LG got the "Nano" term for both Nano Cell on their TVs and NanoIPS on their monitors. LG mostly advertises Nano Cell as improving viewing angles, color accuracy and gamut, which is generally what you get from quantum dots too. The viewing angles come largely from the IPS tech: while IPS contrast is worse than VA's, its viewing angles are much better. Now take a look at the reviews for the LG SJ9500 (their highest-end 2017 TV with Nano Cell) and even then it barely cracks 1500:1 static contrast, which is far away from even low-end VA panels.

There is local dimming in TVs that helps a bit with contrast, but it generally doesn't work well for small changes within the scene, and is mainly used for viewing pleasure rather than accuracy since it does cause halos and blooms among other things, not to mention the increased cost to display production. As it is now, the only way to get good static contrast on displays is OLED, self-emissive QD-LED, microLED, VA, or some other new tech that we probably won't see on commercial display for quite a while.

@bybblyboo: " Neither Samsung (QLED) nor LG (nanoIPS) who created these marketing terms even advertise them as improving contrast ratios."

Ok, but to get the HDR600 badge you have to be able to hit around 3500:1 sustained static contrast on an IPS panel, they must be doing something interesting, and I don't think it's local dimming on a computer monitor.

Furthermore, Panasonic, Eizo and Canon have all shown dual-layer IPS panels that hit incredible contrast ratios. They're pricey now, but I wouldn't be surprised to see them become more affordable with time.

Samsung 'QLED' is a misnomer, they shouldn't have called it that until they actually implemented self-emissive quantum dots, but Nanosys is working on perfecting that technology so we'll eventually see it.

Point being - these technologies are coming, and it's good to start thinking about what it means for your workflow.

Worse yet, this standard allows for local dimming when calculating static contrast. If that's not cheating I don't know what is. Essentially it allows you to have a poor actual contrast for nearby pixels, but have a great one for pixels on opposite parts of the screen back-lit by different sets of LEDs (one set turned off) and still call it great contrast... Not unlike using low quality JPEG full of artifacts for B&W lineart and saying it's great.

@otto k Good point to bring up. Edge-lit local dimming (the cheaper option that will be used in most HDR600+ rated monitors) generally only does large zones (vertical or horizontal; having both improves the zoning a bit), which means you can only change the brightness of large portions of the screen at a time, and it's generally useless for actual use. Full-array local dimming can do more zones, but even at up to 70 or so you still get odd bright spots, halos and blooms, and the more complex system adds thickness and cost to the whole display. Local dimming is still NOT a fix for a low contrast ratio, since the dimming algorithms cannot match your content perfectly uniformly. Also, as you said, accounting for local dimming is not the proper way to measure static contrast, since local dimming is more of a bad dynamic/active contrast implementation that causes uniformity issues.

@Rishi Sanyal 1/7 I overlooked this, but there is somewhat of a contrast ratio defined by the VESA DisplayHDR spec. I say somewhat because the method used to test the contrast is extremely bad and easily cheated. For normal TVs, contrast is tested with a black/white checkerboard pattern across the whole display, which means that outside of FALD systems with many zones, you generally can't cheat the contrast (meaning turning off the black boxes for "infinite" contrast). Here's a link to how Rtings (and just about everyone) tests for contrast with the checkerboard pattern: https://www.rtings.com/tv/tests/picture-quality/contrast-ratio Furthermore, for a more detailed understanding of the testing procedures for the VESA HDR standards, go from the DisplayHDR site to a box.com folder and there's a PDF called "DisplayHDR_CTS_v1.0". My following explanations will stem from that specification.

@Rishi Sanyal 2/7 The VESA criteria for minimum white luminance are all shams, since either just 10% of the screen needs to be lit white, or a full white screen is lit for only 2s, and then the monitor can get the 400/600/1000 cd/m^2 rating for DisplayHDR 400/600/1000 respectively. The full-screen, long-duration (30 minute) test drops those down to 320/350/600, which is what general HDR content would use (videos generally last longer than a couple of seconds and cover more than 10% of the screen). This isn't really important for monitors though, since anything more than ~350 cd/m^2 is probably too bright to look at from that close a range. The min white tests end up being fairly useless, but these are the basis of the 400-1000 naming convention.

@Rishi Sanyal 3/7So, we then go to look at the maximum black level tests. The tunnel test is pointless since it says the panel needs either 4000:1 OR local/global dimming, and OEMs can just add a cheap edge-lit backlight to pass that. The corner box test is a bit more interesting, since both the min white and max black luminance are tested here, which should get us a proper number for contrast right? Nope. If you look at the min white and max black luminance on the chart (corner) and divide those for the contrast, you get 1000:1 contrast for HDR400, 6000:1 for 600, and 12000:1 for 1000. 6000:1 is only seen in the best TV VA panels, and anything higher would be OLED territory. Additionally, Samsung’s HDR600 monitor, the CHG90 is only rated for 3000:1 static contrast (standard VA contrast), so how do monitors get these absurd values? We go back to local dimming.

@Rishi Sanyal 4/7 It is pretty much required to use local dimming for HDR600/1000, since panels just aren't at that level. But there's another catch: the corner test covers only 2.5% at each corner, adding up to a total of 10% of the screen area. The main thing being tested here is the middle, for max black luminance. So, the only way for monitors to achieve 6000:1 contrast is with the aid of local dimming, most notably edge-lit. Because the VESA corner test was designed so poorly, and did not use the standard (hard to cheat) checkerboard pattern to test contrast, OEMs can easily cheat this test even with a cheap 4-8 zone edge-lit backlight. It is guaranteed that any HDR600/1000 monitor will use local dimming to get this, and all they need is a 4-8 zone edge-lit local dimming system that can turn off the whole middle section for the max black luminance rating.

@Rishi Sanyal 5/7This is only useful for the test, and you won't see it applicable in normal use. The only real use case for edge-lit backlighting would be to reduce max black luminance in the case where you’re displaying a non-matching aspect ratio video on the display (say 21:9 movie on 16:9 display) and you have black border that can be turned off with edge-lit dimming. FALD dimming would be better, but again it’s thicker, more complex, and expensive to implement. Any FALD monitor that comes out will surely come with heavy marketing on how many zones of FALD it has. If anyone doesn’t know how horrible edge-lit local dimming is, this CNET article gives some good examples. https://www.cnet.com/news/led-local-dimming-explained/

@Rishi Sanyal 6/7 TLDR: The VESA HDR spec doesn't require any minimum STATIC contrast ratio outside of 1000:1 for HDR400, since HDR600 and 1000 NEED local dimming, and that isn't static. If there were a true static contrast ratio minimum, then VESA would have stated it. Most of the tests are shams in the first place, since the min white luminance is either very brief or only accounts for 10% of the screen at most. The only relevant test is the corner box test, but even that is very easily cheated with cheap edge-lit local dimming, which ends up being useless and looks bad in most cases. The only saving grace would be if the monitor had 100+ zones of FALD, and even that can have uniformity issues.

7/7 One last thing: VESA's HDR spec is mostly useless for pro photographers, since monitors used in this field are generally true 10-bit panels. VESA's HDR only requires 10-bit processing at the highest level, while the panel itself can be 8-bit + FRC (dithering). In the end you still must look at the monitor specifications to make sure it's a native 10-bit panel to get a proper photography monitor. Some might still say that true 10-bit is rare, but most 4K HDR TVs starting from $1K come with 10-bit panels now. The lack of 10-bit in monitors is inexcusable at this point.
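For reference, the contrast arithmetic in part 3/7 of the thread above is just min-white luminance divided by max-black luminance. A quick sketch using the full-screen tier figures from this article (note that the CTS corner-test chart the comment cites uses its own white/black targets, which is how it arrives at 12000:1 for HDR1000 rather than what the full-screen numbers would suggest):

```python
# Contrast ratio is simply white luminance over black luminance.
def contrast(white_nits, black_nits):
    return white_nits / black_nits

print(contrast(400, 0.40))   # DisplayHDR-400 full-screen figures
print(contrast(600, 0.10))   # DisplayHDR-600 full-screen figures
```

The 400-tier figures give 1000:1 (ordinary IPS territory), while the 600-tier figures give 6000:1, which, as the thread argues, is beyond what current IPS or standard VA panels do natively without local dimming.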

This color/HDR/gamut thing is a nightmare... As a little hobbyist photographer, I want to replace my old TN screen and it's impossible to make a choice:

- 10-bit panels: seems important, but when you look into it, you find that you also need a specific video card (FirePro or Quadro) to manage it.
- Wide gamut: the more colors the better? My camera supports the wider Adobe RGB gamut. My Lightroom editor can also display it.

But in the end, those two "must-have-if-you-are-serious" features don't help 95% of the time:

- You share your pictures online: no website manages those standards, let alone browsers or OSes.
- You want to print them: you export limited JPEGs to send to your online print service... great!
- You print them at home with a non-professional printer (a high-end A4 all-in-one in my case): I doubt I can get much improvement from an Adobe RGB workflow.

In the end, for the small enthusiast, we're still better off with a full sRGB workflow that will be recognized in every way the pictures get used.

Don't bother with outdated LCD tech anymore and get an OLED display. The contrast is naturally huge, no issues, no need for local dimming or other stuff like that. The number of colors that can be displayed isn't an issue either.

The only issue is that you'd need a recent OLED panel to meet the nit requirements for maximum brightness, but the latest TVs already do. The technology is already there.

I use my 55" OLED TV as a monitor more and more, and it works quite well.
