I've spent the last couple of weeks investigating the HDR output of various games. When it comes to SDR video, you may be familiar with RGB values 0-255, with 0 representing black and 255 representing white (if you've ever used a colour picker in MS Paint or a word processor, you may have seen this type of number). HDR10/Dolby Vision is a little different, not just because it uses a scale of 0-1023, but because each of these data values represents not just black to white (or colour), but also a measure of luminance in nits, which is the intensity of the light (how bright it is).

Unlike previous video formats, these values are defined and absolute. A value of 0 will always represent no light at all (total black), a value of 1023 will always represent 10,000 nits of luminance, and a value of 769 will always represent 1000 nits. So if a modern HDR TV is fed these values, it should output exactly the amount of light described by the value given. HDR10 and Dolby Vision both use this system, and can be referred to as PQ-based HDR.

Now, as it stands there aren't many TVs that get to the heady heights of 10,000 nits; you are lucky if you can get one that goes above 1500 at the moment. When the signal being received goes beyond the hardware capability of the display, the TV chooses how to handle it: most manufacturers simply clip the white values above a level chosen by them. They may also choose a soft roll-off and try to make the shift into the clipped values less obvious. To help with this, when content is mastered/produced for HDR10 and Dolby Vision, some additional information about the image content is specified in the form of metadata. This metadata usually says what the most intensely bright value that will be seen in the game (or movie) is, and what the average luminance is across all of the content. These values are defined by the display the content was mastered on.
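To make the absolute nature of those code values concrete, here is a minimal Python sketch of the PQ (SMPTE ST 2084) decoding curve that HDR10 and Dolby Vision are built on. The constants are the published ST 2084 values; the function name is my own.

```python
# PQ (SMPTE ST 2084) EOTF constants, as published in the standard
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_code_to_nits(code: int, bits: int = 10) -> float:
    """Decode a PQ code value (0-1023 for 10-bit) to absolute luminance in nits."""
    e = code / (2 ** bits - 1)        # normalise to 0.0-1.0
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_code_to_nits(0))     # 0.0  (total black)
print(pq_code_to_nits(769))   # ~1000 nits
print(pq_code_to_nits(1023))  # 10000.0 (format maximum)
```

Note how non-linear the curve is: three quarters of the code range is spent below 1000 nits, which is why PQ can describe both deep shadow detail and extreme highlights in only 10 bits.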
Most UHD content currently is being mastered for 1000-nit or 4000-nit screens. The purpose of this metadata is so that an SDR image (or something in between SDR and HDR) can be derived from the original HDR content in the event that the content is viewed on a display that does not reach the peak luminance of the display the content was mastered on. So if your movie has been mastered on a 1000-nit reference display and you are using an OLED screen with a 650-nit max output, the TV can use this metadata to decide how best to display the information that can't otherwise be shown due to hardware limitations. Once you are using a display that meets or exceeds the peak brightness of the content, the metadata becomes irrelevant.

So with this in mind, I've been looking at how games have been mastered: what options they present to the user to adjust the image, what these options actually do, and what the relationship is between these things and how the HDR looks. Videogames have a big advantage over movies in that the image is generated in real time, so it can be adjusted at will.

Due to the nature of HDR content, this is actually really easy to measure: all we need is an un-tonemapped screenshot or video capture, and from this we can look at the code values that have been used in various parts of the image. We can see whether the game is actually outputting anything that is black (or whether a cinematic grade has been applied with raised blacks), and we can also see the very brightest value that the game is going to use to represent something like the sun. So I've been looking at the make-up of various games on Xbox (which allows for HDR screenshot output) to try to understand what the in-game adjustments actually do and how I should be using them to ensure that I get the best from my display.
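Manufacturers' actual roll-off curves are proprietary, but the idea can be sketched with a toy linear knee: values below a chosen fraction of the display's peak pass through untouched, and the rest of the content range (known from the metadata) is compressed into the remaining headroom. This is purely illustrative, not any specific TV's algorithm.

```python
def display_tone_map(nits: float, content_peak: float, display_peak: float,
                     knee_frac: float = 0.75) -> float:
    """Toy tone mapper: if the display can show everything, just clip;
    otherwise compress [knee, content_peak] linearly into [knee, display_peak]."""
    if content_peak <= display_peak:
        return min(nits, display_peak)          # display covers the content: no mapping needed
    knee = knee_frac * display_peak             # start of the soft roll-off
    if nits <= knee:
        return nits                             # low/mid tones pass through untouched
    t = (nits - knee) / (content_peak - knee)   # 0..1 across the compressed range
    return knee + min(t, 1.0) * (display_peak - knee)

# 1000-nit mastered content on a 650-nit panel (the example from the text)
print(display_tone_map(400.0, 1000.0, 650.0))   # 400.0  (unchanged)
print(display_tone_map(1000.0, 1000.0, 650.0))  # 650.0  (content peak lands at panel peak)
```

A real display would use a perceptually smoother curve than this straight line, but the trade-off is the same: the metadata tells it how much range it has to squeeze, and the knee decides how much of the image is left untouched.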
The goal of HDR is to transport more information to a display, so that more intense light can be displayed in the parts of the image where it is required; typically you'll see these brightest points in specular highlights, explosions and the sun. Let's have a look at a few really good examples of in-game HDR. They all have slightly different settings and different approaches to how the tone mapping is performed.

In order to better visualize the output of the game in a non-HDR10/SDR format, I've created a method of producing maps of the luminance. Using this scale you can quickly get an overview of what is dark, what is light and what is really intensely light. The vast bulk of what we see is going to sit between 0-150 nits; anything above that is the "extra" luminance that HDR offers.

Star Wars: Battlefront 2

Actually, all of the HDR-compatible Frostbite games I looked at (Battlefield 1, Mass Effect) use the same setup. Metadata output will be 10,000 nits, and the actual tone mapping is performed by the game via the HDR slider, with 0 nits being the leftmost value and 10k nits being the rightmost. As we can see, the sun itself is outputting at 10k nits, things that should be totally dark are as they should be, and the specular highlights reflected on the top of the gun are hitting between 4000 and 10,000 nits. The DICE Frostbite games are actually really interesting in that you can turn the HDR slider down to one click from the left, which will give you 100/200 nits depending on which game it is, essentially toning the game to SDR. This gives you a really nice way to see where your fancy new TV is showing off.
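The banding behind luminance maps like these can be sketched very simply: decode each pixel to nits, then bucket it into a handful of brightness bands to be painted with a false-colour palette. The band edges below are my own illustrative choice, with everything above 150 nits counting as the "extra" HDR range.

```python
from bisect import bisect_right

# Illustrative band edges in nits; bands 4 and up are beyond typical SDR range
BAND_EDGES = [1, 10, 50, 150, 400, 1000, 4000]

def luma_band(nits: float) -> int:
    """Band index for one pixel: 0 = near black ... 7 = 4000 nits and above."""
    return bisect_right(BAND_EDGES, nits)

def luma_band_map(nits_rows):
    """Turn a 2-D grid of per-pixel luminances into a grid of band indices,
    ready to be coloured with a false-colour palette."""
    return [[luma_band(v) for v in row] for row in nits_rows]

print(luma_band(0.5))     # 0: shadow detail
print(luma_band(120.0))   # 3: ordinary SDR-range content
print(luma_band(5000.0))  # 7: extreme highlight (sun, specular)
```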
Another fun thing you can do with the DICE games is to move the slider to 0% and literally turn off the lights; you can see just how real-time the lighting is as you tell the engine that the brightest any source of light can be is 0.

Rise of the Tomb Raider

This game has a similar setup: a brightness slider which controls the black point (move it to the lowest point) and then a secondary HDR slider which controls the peak brightness. Tomb Raider has been capped at 4000 nits. Like the Frostbite games, either set the slider to the max to let your TV tone map, or follow the on-screen instructions to try to eyeball the peak brightness and let the game handle the output. RotTR is particularly great as there are loads of specular highlights, not just in the normal places you'd expect to see them in real life, such as on the shiny ice and the twinkles in the snow as they reflect the sun, but also in lower-light conditions on less obviously "shiny" surfaces, such as on this insanely high-resolution boot. It appears that once the output reaches 4000 nits, any level above this jumps straight to 10k nits (which the display will clip anyway, as presumably the metadata is telling the display it's 4000 nits). It's not data that is missing; it can be brought into visibility occasionally, which suggests that it is an artistic decision or part of the process of grading the game for HDR.

Assassin's Creed Origins

AC: Origins is another game that really does HDR well. Like Tomb Raider, it is also capped at 4000 nits (the pink bits) and offers a brightness slider, which should be set as low as possible based upon your viewing environment. It also offers a Max Luminance setting, which is neatly labelled in nits. There is also a "paper white" scale: as well as having sliders that dictate the darkest something can be in game and the brightest it can be, the game gives you a scale that allows you to adjust one of the mid points, namely how bright a piece of paper is. Ubisoft's recommendation for the paper white slider is:

Quote

adjust the value until the paper and hanging cloth in the image almost saturates to white.

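As a hypothetical sketch of what the three AC: Origins controls are doing (this is my own toy model, not Ubisoft's actual pipeline): paper white anchors the midtones by defining how many nits a scene-linear value of 1.0 (a white sheet of paper) maps to, while the max-luminance slider caps the highlights.

```python
def graded_output(scene_linear: float, paper_white: float = 200.0,
                  max_luminance: float = 1000.0) -> float:
    """Hypothetical model: scene-linear 1.0 = diffuse/paper white.
    paper_white scales the midtones; max_luminance caps the highlights."""
    return min(scene_linear * paper_white, max_luminance)

print(graded_output(1.0))                    # 200.0: a sheet of paper at this slider setting
print(graded_output(1.0, paper_white=80.0))  # 80.0: the technically correct dark-room value
print(graded_output(25.0))                   # 1000.0: a bright highlight, capped by max luminance
```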
However, like the brightness slider, the paper white control is there to allow you to adjust the output of the game to match your viewing conditions. If you are in a cinema-like, controlled-lighting environment, technically you would set it to 80 nits; however, as your surrounding light increases you will prefer higher values. Setting games to the technically correct settings also highlights how HDR10/Dolby Vision is too dim for many consumers. You'll also see how developers are still getting to grips with these new technologies: the HUD in AC becomes a little too dark as the main game image becomes calibrated correctly.

Moving on, we can see three Microsoft first-party offerings which are all working with a full 10,000-nit output.

Forza Horizon 3

A really great example of how HDR doesn't need to be used just for crazy bright sun-spots, magic and fires. Cloudy days still have very bright skies; photographers have had to use a few tricks to deal with the contrast issues they cause. Forza really shines here; it's almost enough to make me feel a sensation of cold and drabness when viewed with my own eyes. You can see the sky reaching 4000 or so nits, the tyres and grille being suitably unlit and dark, and then the headlights pumping out a full 10k-nit output. In a night-time environment you can see the game making full use of the darker end of the scale, whilst the explosions and headlights are still illuminating in a realistic fashion. This is also a good example, even in SDR, of how a very dark image which adheres to the correct HDR10 standards will be perceived as being too dark or dim or "crushing blacks". As we can see from the luma map, the detail is all actually there, but human eyes cannot adjust to see such details until they are exposed to low levels of light for 10 minutes or so, when certain chemical reactions occur within the cells of the eyes.
This obviously is an issue for many consumers, who probably aren't in lighting conditions conducive to this happening.

Gears of War 4 / INSECTS

All three games work with two sliders: a brightness slider, which controls black levels but also controls an aspect of contrast, pushing max nit output up to 10,000; and a secondary HDR slider, which allows the max output to be set below the level the first slider reaches. Forza and Gears simply refer to Brightness and HDR, however INSECTS refers to these as HDR Contrast and HDR Brightness.

Now let's look at a few other implementations.

Shadow of Mordor

SoM offers a super-simplified approach: the game is always outputting a 10k-nit maximum and the tone mapping is left entirely to the TV. This is interesting, as we know that the developers have never actually seen the game outputting at 10k nits, as there is no such display available. There is a traditional brightness slider which allows users to specify the black point to their taste/viewing conditions. Here we can see the obvious places to look for 10k nits: the sun and the specular highlights. You can also see that, as it should be, the shadowed side of the player character is as dark as it should be.

Agents of Mayhem

A similar approach here, except the game is capped at 1000 nits and has obviously been graded/toned with this in mind, as it's a fairly achievable output from a consumer display. I don't think it's a coincidence that they have made the game with this in mind, and it's actually one of the games that has a really great-looking HDR output. Again, like other games, adjust the slider to the left to improve black level, although this does appear to have an effect on

DEUS EX: Mankind Divided

Lots of raised blacks in this, perhaps as an artistic choice, but also a bug which causes grading to totally fail if the in-game brightness slider goes below a value of 35%. This looks like the result of some kind of flawed curve adjustment.
40-45% will give you 1000-nit output without raising the blacks too much.

Final Fantasy XV

From one Square Enix meh, to a Square Enix wow. A fixed 1000-nit max output and a simple brightness slider to drop black levels. Really fantastic grading throughout and in various lighting conditions. Even the title screen has 2D elements optimised for HDR.

Monster Hunter World

Much like Deus Ex, Monster Hunter World appears to operate within 4000 nits; however, also much like Deus Ex, when HDR is enabled the game appears to have severe black-level problems. At the default brightness setting, this is what we are getting: all mids and highlights. Where are the shadows? So with a quick and dirty level amendment, we can remove the extra HDR luminance from the data and take a look at the histogram. If we compare this to an in-game SDR shot taken just moments later, we can see the significant shift between the SDR and the HDR toning. Contrast and black levels are totally out of sorts in HDR. This can be remedied slightly by dropping the brightness down as low as it goes, but there is not enough of a change to make it right. This appears as if it is at least partly caused by some kind of eye adaptation that is occurring.

Whew! That was a lot of images! So this explores one side of how games are made; what it doesn't explore is the metadata side of what the games are outputting. It wouldn't surprise me if there were titles that had mismatched metadata, but because the metadata is static, as soon as you make any image adjustments it would technically become wrong anyway.
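The "quick and dirty level amendment" used for the Monster Hunter World comparison can be approximated like this: clip everything above SDR reference white, rescale to 8-bit codes and histogram the result, so an HDR capture can be compared like-for-like against an SDR screenshot. The 100-nit reference white and the function itself are my assumptions, not the exact process used for the images above.

```python
from collections import Counter

def sdr_preview_histogram(nits_values, sdr_white: float = 100.0) -> Counter:
    """Discard luminance above SDR reference white, rescale to 0-255 codes and
    count occurrences, approximating the histogram comparison described above."""
    codes = [round(min(v, sdr_white) / sdr_white * 255) for v in nits_values]
    return Counter(codes)

hist = sdr_preview_histogram([0.0, 25.0, 100.0, 4000.0])
print(hist)  # both the 100-nit and 4000-nit pixels end up counted at code 255
```

On a healthy capture you would expect a decent population of low codes; the Monster Hunter World shot instead piles everything into the mids and highs, which is exactly the black-level problem described.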
