Posted
by
kdawson
on Thursday October 12, 2006 @05:50PM
from the details-details dept.

An anonymous reader writes, "We are seeing more and more about high dynamic range (HDR) images, where the photographer brackets the exposures and then combines the shots to increase the dynamic range of the photo. The next step is monitors that can display the wider dynamic range these images offer and come closer to matching the capabilities of the ol' Mark I eyeball. The company that seems furthest along with this is Brightside Technologies. Here is a detailed review of the Brightside tech." With a price tag of $49K for a 37" monitor (with a contrast ratio of 200,000:1), HDR isn't exactly ready for the living room yet.
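The bracketing-and-merging step the blurb describes can be sketched in a few lines. This is a minimal illustration assuming a linear camera response; real pipelines (e.g. the Debevec-Malik method) first recover the camera's response curve, and all function names here are illustrative, not from any particular library.

```python
# Minimal sketch of HDR radiance recovery from bracketed exposures.
# Assumes a linear sensor response; pixel values are 0-255 and exposure
# times are in seconds. Names here are illustrative only.

def hat_weight(z, z_min=0, z_max=255):
    """Weight mid-range pixel values highest; clipped values lowest."""
    mid = (z_min + z_max) / 2
    return (z - z_min) if z <= mid else (z_max - z)

def merge_exposures(pixels, times):
    """Estimate scene radiance for one pixel from N bracketed shots.

    pixels: the same pixel's 0-255 value across the exposures
    times:  the matching exposure times in seconds
    """
    num = den = 0.0
    for z, t in zip(pixels, times):
        w = hat_weight(z)
        num += w * (z / t)   # this exposure's radiance estimate
        den += w
    return num / den if den else 0.0

# The same pixel shot at 1/100 s, 1/10 s and 1 s:
radiance = merge_exposures([8, 80, 200], [0.01, 0.1, 1.0])
```

The hat-shaped weight downweights underexposed and blown-out samples, so each part of the scene's range is estimated mostly from the exposure that captured it best.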

Of course, one of the other principal arenas where monitors like this are valuable is medical imaging. One of the serious shortcomings in the migration of radiology to digital formats is the reduced quality of the images as compared to film. The dynamic range of film is simply so much greater than can be achieved with standard CRTs or LCD monitors that there is a real danger of missing very subtle changes in X-rays, for example. While it's true that image processing can make up for some of the difference, digital still can't quite compete with film for many purposes, including data density.

The thing to me about the 'HDR' images produced by that technique is that they look far more 'unreal' than normal photos. They have this 'hyperreal' effect that reminds me of postcards from, erm... I guess the 1940s/1950s, the ones that had some hand retouching done to them, or a foil look.

They just look a little silly to me, and that's a result of having an image with more information in it than the medium displaying it can handle.

Now, a display that can ACTUALLY show the full range of an HDR image? THAT I'm interested in.

Excellent point. Truth be told, I'd much rather see color depth tackled first. Monitors have gotten better, but for film-level work none of them display full color resolution. Frustrating that the software will handle 48-bit color, three channels of 16, but the monitors won't. It mostly becomes an issue when you are working with a lot of gradient images, skies and such: you get banding that isn't in the actual image file. Then again, if you're doing TV, who cares. They call it NTSC, Never The Same Color, for a reason.
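The banding complaint above is easy to demonstrate numerically: a subtle sky-like gradient only has access to a handful of 8-bit codes, while at 16 bits per channel it gets thousands. A small sketch (the 5% gradient span is just an illustrative choice):

```python
# Why 8 bits per channel bands on smooth gradients while 16 bits
# per channel does not: count the distinct quantized codes available
# to a gentle gradient at each bit depth.

def quantize(x, bits):
    """Map x in [0.0, 1.0] to an integer code of the given bit depth."""
    levels = (1 << bits) - 1
    return round(x * levels)

def distinct_levels(lo, hi, bits, samples=10_000):
    """How many distinct codes a gradient from lo to hi actually uses."""
    codes = {quantize(lo + (hi - lo) * i / samples, bits)
             for i in range(samples + 1)}
    return len(codes)

# A gentle gradient spanning 5% of full brightness:
eight = distinct_levels(0.50, 0.55, bits=8)    # only ~13 steps: visible bands
sixteen = distinct_levels(0.50, 0.55, bits=16) # thousands of steps: smooth
```

With only about a dozen codes to span the whole sky, each step is a visible stripe; at 16 bits the steps fall well below what the eye can distinguish.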

Don't be an idiot. You don't have to have "a black hole in the display" for a pixel to have effectively zero brightness; you just have to have it not generate any light (excluding blackbody radiation, which is negligible in the visible spectrum at room temperature). One of the photos in TFA is of the monitor displaying a black screen in a dark room; you can't tell it from the surroundings. The pixels can be completely switched off (strictly speaking, the backlight switches off in zones of a few pixels rather than per pixel), giving a contrast ratio of (max brightness)/0 -- hence the divide-by-zero error, as the grandparent said.
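The divide-by-zero point can be made concrete. A tiny sketch, with made-up luminance figures chosen only for illustration (not manufacturer specs):

```python
# Specified contrast is emitted light from the brightest pixel over
# emitted light from the darkest. If a backlight zone can switch fully
# off, the denominator is 0 cd/m^2 and the ratio is unbounded.

def contrast_ratio(white_cd_m2, black_cd_m2):
    """Specified (emitted-light-only) contrast; ambient light excluded."""
    if black_cd_m2 == 0:
        return float("inf")   # the grandparent's divide-by-zero case
    return white_cd_m2 / black_cd_m2

lcd = contrast_ratio(500, 0.5)       # an always-lit LCD: 1000:1
hdr_panel = contrast_ratio(3000, 0)  # zone fully off: infinite on paper
```

Which is why quoted figures like 200,000:1 are really measurement-floor numbers; with the emitter truly off, the ratio has no finite value.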

No. Think about it: unless you're really pressing your nose right up to the screen, for a monitor to display a reflection of the image on the face of whoever's looking at it, it would have to radiate at a single angle (probably perpendicular) only. You wouldn't be able to see the whole screen, only a few pixels per eye at any one time. Ever stood in front of a projector screen and looked at the projector? Like that. It would be utterly useless as a monitor.

N.B. if you have something like the left side of the screen one colour and the right side a different one, you may well be able to see that by looking at your face, but that's more due to the fact that your face isn't flat; the left side slopes backwards from centre to edge, and vice versa for the right side. You certainly wouldn't be able to see detail.

> If the room isn't entirely dark (and a room with an HDR display in it isn't perfectly dark if you're not watching only renditions of /dev/zero), then the reflectiveness of the display surface limits the contrast, unless it's a black hole, as I mentioned before

Nope. The specified contrast is the ratio of EMITTED, RADIATED light from a bright pixel to EMITTED, RADIATED light from a dark pixel. Certainly, ambient light will reduce the effective contrast in reality, but the definition of specified contrast ratio assumes no ambient or reflected light. Obviously it has to be that way: otherwise the contrast ratio would be meaningless unless you specified everything from the amount of ambient light to the colour of the walls along with it.

There is no such thing as a 0.0 - 1.0 visual range. The human visual system is floating point, pretty much literally. You have your exponent, which is how well your eyes are adapted to the light, how dilated your pupil is, and so on. You have your mantissa, which is the relative intensity within your current visual field. Physiologically, we have roughly 28 bits of exponent and about 10 bits of mantissa. So proper HDR is floating point. But we're not quite there yet.
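The floating-point analogy can be made literal with Python's `math.frexp`, which splits any number into a mantissa and a power-of-two exponent. The luminance figures below are rough textbook ballparks, used only to show the shape of the idea:

```python
# The comment's floating-point analogy, made literal: any luminance
# splits into a mantissa in [0.5, 1.0) and a power-of-two exponent,
# the exponent playing the role of the eye's adaptation state.
import math

def split_luminance(cd_per_m2):
    """Decompose a luminance into (mantissa, exponent), frexp-style."""
    mantissa, exponent = math.frexp(cd_per_m2)
    return mantissa, exponent

# Starlight (~0.001 cd/m^2) and sunlit snow (~10,000 cd/m^2) differ by
# 23 powers of two in the exponent, yet each mantissa stays in [0.5, 1.0):
m1, e1 = split_luminance(0.001)
m2, e2 = split_luminance(10_000.0)
span_in_stops = e2 - e1   # rough count of photographic stops between them
```

Adaptation moves the "exponent" slowly (think walking out of a cinema), while the "mantissa" is the contrast you resolve within a single adapted view.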

In both audio and video, this whole idea of quantizing a 0.0 - 1.0 interval is a compromise wrought by insufficient numerical resolution. It has nothing to do with physics or perception or anything else. Once you realize that, you should also realize that the idea of "going outside" the 0.0 - 1.0 range is absurd. You don't go outside the range, you expand the range so as to better approximate the incredible human senses. As long as we're using fixed point image formats and digital video standards, there will always be a range, and we'll always be inside the 0.0 - 1.0 range, and it will always be a compromise.
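A short sketch of that point: fixed-point encoding always clamps the scene into 0.0 - 1.0 and clips at the ends, so "more range" means re-scaling what maps to 1.0, not stepping outside it. The white-point numbers are illustrative, not from any standard:

```python
# Fixed-point video maps scene values into a 0.0-1.0 range and clips;
# "expanding the range" means choosing a brighter white point so more
# of the scene fits inside before clipping occurs.

def encode(value, white_point, bits=8):
    """Map a scene value to an integer code; values past white clip."""
    normalized = min(max(value / white_point, 0.0), 1.0)  # clamp to 0..1
    return round(normalized * ((1 << bits) - 1))

# With a white point of 100, a 500-unit highlight clips to the max code:
sdr = encode(500, white_point=100)      # 255: highlight detail lost
# "Expanding the range" (a 1000-unit white point) keeps the highlight:
hdr = encode(500, white_point=1000)     # 128: still below clipping
```

Either way the code values stay inside 0.0 - 1.0; what changes is how much of the real scene that interval represents, and how many bits you then need so the steps stay invisible.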

Audio professionals have worked out their terminology far better than graphics guys have. Audio guys talk about dB, decibels. The reference point is 0 dB (full scale), which is as loud as your amp will go. When you add more bits, you're adding more quiet, not more loud. If you want more loud, you buy a bigger amp. Each additional bit gives you about 6 dB more quiet, and you'd better hope your equipment has a low enough noise floor that you can hear all that fresh new quiet.

So what are you saying? What's the difference between HDR and 48 bit color? To use an analogy to audio, you seem to be saying that HDR is about more loud, and 48 bit color is about more quiet. But as you go on to point out, they're really just the same thing. No matter what, you've got a clipping level (the maximum luminance of your output device), you've got a noise floor (the minimum luminance of your output device), and hopefully you've got enough quantization levels in between for perceptual linearity. That's why HDR and color depth are joined at the hip. You can't get one without the other. There is no meaningful distinction.
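The "joined at the hip" claim can be put in numbers: a display's contrast ratio fixes the distance between its clipping level and its noise floor, and that distance dictates how many quantization steps you need to cross it smoothly. A rough sketch, where the steps-per-stop budget is a hand-wavy perceptual assumption, not a standard:

```python
# Contrast ratio <-> bit depth: the wider the range between clipping
# level and noise floor, the more codes you need for smooth gradation.
import math

def stops(contrast_ratio):
    """Contrast ratio expressed as photographic stops (doublings)."""
    return math.log2(contrast_ratio)

def min_bits(contrast_ratio, steps_per_stop=64):
    """Rough bit depth for banding-free LINEAR coding of the range.

    steps_per_stop is an illustrative perceptual budget, not a spec:
    with linear codes, resolving that many steps inside the darkest
    stop needs contrast_ratio * steps_per_stop total levels.
    """
    return math.ceil(math.log2(contrast_ratio * steps_per_stop))

typical_lcd = stops(1_000)     # ~10 stops of range
brightside = stops(200_000)    # ~17.6 stops for the 200,000:1 panel
linear_bits = min_bits(200_000)  # ~24 bits/channel if coded linearly
```

Which is the parent's point exactly: extend the range (more "loud" and more "quiet") and the required bit depth grows with it, so HDR and color depth really can't be separated.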