My monitor-calibration software now has a toggle for calibrating to L*. Are there any useful comments or tutorials on why to use 2.2, Native, or L*?

Most modern LCD panels are (more or less) factory calibrated to gamma 2.2, so it's rather useless and inconvenient to calibrate them to L*. AFAIK it's only reasonable when you use ECI RGB v.2 as your editing space, and an LCD with a 12-14 bit programmable LUT that can be hardware calibrated to an L* TRC.

I have an Eizo ColorEdge CG21 and use the dedicated Eizo ColorNavigator software with a GMB i1Pro device. My Eizo has that extra USB cable that I have to connect when running a cal/profile. (I forget exactly what that does; something about being able to cal in 10 or 12 bits?)


Yes - something like that. The CG21 can be hardware calibrated, so you can calibrate it to any TRC you like.

But there's still no point in calibrating the panel to L*, unless you render your images to an L*-based editing space like ECI RGB v.2.

So 2.2 or Native is the preferred choice for digital photography in an AdobeRGB working space?

AdobeRGB is based on gamma 2.2, and the Tonal Response Curve (TRC) of your panel is programmable - so yes, gamma 2.2 is definitely the best choice. I'm not sure about the CG21, but newer Eizo CG-series panels come with a factory gamma 2.2 calibration certificate, so as a matter of fact gamma 2.2 is also their native TRC.


Useless trivia dep't: IIRC, gamma 2.2 became a standard (in TV sets) simply because it corresponded to the native response of a CRT display to voltage input, as viewed in a darkened room. But L*, or about gamma 3.0, would have been a lot better for our uses in viewing, editing, and printing photos. This is because it would allocate the 8 (or 16) RGB bits evenly in perceived brightness, so that every increment in RGB levels would yield a consistent step in brightness. The curves tools etc. in Photoshop would then have similar responses to changes in the shadows vs. changes in the highlights. (With 2.2, the highlight bits represent much larger brightness steps, and are therefore more critical in use, than the shadow bits.)
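The bit-allocation point can be made concrete with a little arithmetic. Below is a small numeric sketch (my own illustration, not from any poster) that computes the CIE L* distance between adjacent 8-bit codes under a pure gamma 2.2 encoding and under an L* encoding; under L*, the perceptual step is the same everywhere by construction, while under gamma 2.2 it varies across the tonal range:

```python
# Compare how evenly 8-bit codes are spaced in perceived lightness (CIE L*)
# under a gamma 2.2 encoding versus an L* encoding.

def lstar_from_linear(y):
    """CIE lightness L* (0-100) from linear relative luminance y (0-1)."""
    return 116 * y ** (1 / 3) - 16 if y > 216 / 24389 else (24389 / 27) * y

def gamma22_code_to_lstar(code):
    """8-bit code -> L*, assuming a pure gamma 2.2 transfer curve."""
    return lstar_from_linear((code / 255) ** 2.2)

def lstar_code_to_lstar(code):
    """8-bit code -> L*, under an L* encoding: codes are linear in L*."""
    return code / 255 * 100

# Probe a deep shadow, a shadow, a midtone, and a highlight code.
for c in (5, 20, 128, 250):
    step_g22 = gamma22_code_to_lstar(c + 1) - gamma22_code_to_lstar(c)
    step_l = lstar_code_to_lstar(c + 1) - lstar_code_to_lstar(c)
    print(f"code {c:3d}: dL* per step  gamma2.2={step_g22:.3f}  L*={step_l:.3f}")
```

The L* column is a constant 100/255 ≈ 0.392 per code by definition, while the gamma 2.2 column shows how the per-code perceptual step changes between shadows and highlights.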

One of the posts to the ColorSync list, from Mr. Borg of Adobe, presents some interesting questions that, as far as I know, the L* crowd hasn't answered:

Quote

L* is great if you're making copies. However, in most other scenarios, L* out is vastly different from L* in. And when L* out is different from L* in, an L* encoding is very inappropriate as illustrated below.

Let me provide an example for video. Let's say you have a Macbeth chart. On set, the six gray patches would measure around L* 96, 81, 66, 51, 36, 21.

Assuming the camera is Rec.709 compliant, using a 16-235 digital encoding, and the camera is set for the exposure of the Macbeth chart, the video RGB values would be 224,183,145,109,76,46.

On a reference HD TV monitor they should reproduce at L* 95.5, 78.7, 62.2, 45.8, 29.6, 13.6. If, say, 2% flare is present on the monitor (for example at home), the projected values would be different again, here: 96.3, 79.9, 63.8, 48.4, 34.1, 22.5.

As you can see, L* out is clearly not the same as L* in. Except for copiers, a system gamma greater than 1 is a required feature for image reproduction systems aiming to please human eyes. For example, film still photography has a much higher system gamma than video.

Now, if you want an L* encoding for the video, which set of values would you use: 96, 81, 66, 51, 36, 21 or 95.5, 78.7, 62.2, 45.8, 29.6, 13.6? Either is wrong, when used in the wrong context. If I need to restore the scene colorimetry for visual effects work, I need 96, 81, 66, 51, 36, 21. If I need to re-encode the HD TV monitor image for another device, say a DVD, I need 95.5, 78.7, 62.2, 45.8, 29.6, 13.6.

In this context, using an L* encoding would be utterly confusing due to the lack of common values for the same patches. (Like using US Dollars in Canada.) Video solves this by not encoding in L*. (Admittedly, video encoding is still somewhat confusing. Ask Charles Poynton.)

When cameras, video encoders, DVDs, computer displays, TV monitors, DLPs, printers, etc., are not used for making exact copies, but rather for the more common purpose of pleasing rendering, the L* encoding is inappropriate as it will be a main source of confusion.

Are you planning to encode CMYK in L*, too?

Lars
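Lars's camera-side numbers can be reproduced from the standard formulas. This is my own reconstruction (a sketch, assuming the CIE L* inverse and the Rec.709 OETF with 16-235 "legal range" 8-bit quantization), not code from the thread:

```python
# Reproduce the Macbeth gray patch encoding from Lars's example:
# scene L* -> linear luminance -> Rec.709 OETF -> 16-235 8-bit codes.

def linear_from_lstar(lstar):
    """Linear relative luminance Y (0-1) from CIE L* (0-100)."""
    return ((lstar + 16) / 116) ** 3 if lstar > 8 else lstar * 27 / 24389

def rec709_oetf(y):
    """Rec.709 opto-electronic transfer function (camera encoding)."""
    return 4.5 * y if y < 0.018 else 1.099 * y ** 0.45 - 0.099

def to_legal_8bit(v):
    """Quantize a 0-1 video signal to the 16-235 legal range."""
    return round(16 + 219 * v)

patches = [96, 81, 66, 51, 36, 21]  # Macbeth grays, scene-side L*
codes = [to_legal_8bit(rec709_oetf(linear_from_lstar(L))) for L in patches]
print(codes)  # -> [224, 183, 145, 109, 76, 46], matching Lars's figures
```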

Chris Murphy has concerns about an L*-based workflow regarding numerical accuracy with 8-bit data. He explains his position here:

No real answers. When I do the gradient test in Photoshop, my monitor renders the tonal changes the smoothest with absolutely no banding in the shadows (when I run the cal/profile at L*). If it makes my monitor display well, what's the harm?


In this test, are you assigning the display profile to the gradient, or letting Photoshop use the profile for compensation?


When you do the gradient test, you assign the display profile to the gradient, so the monitor is L* calibrated and the image is in an L*-based color space. But when you're working with AdobeRGB images, they are rendered to a gamma 2.2 based color space. To simulate the result, you may assign AdobeRGB to the gradient and check whether it's still so smooth.


That's not the way the test was designed to work, as I understand it. A normal image in a normal RGB working space is always translated (behind the scenes) to the monitor space so you can see it. Hence it will always suffer from some degree of banding or other gamut compression byproducts. The test gradient is not a normal image; it is designed to bypass Photoshop's translation from an RGB working space into the monitor space. IOW, a representation of a monitor at its absolute best.

When I run a monitor cal/profile at 2.2 and do the same test, assigning that 2.2 profile to the test image, it still looks very nice, but there is a bit of a hiccup/band in the 1/8 tone area that goes away when the monitor is operated in (and the test image assigned to) the L* profile. Just seems like L* makes the monitor run a bit smoother IMO.

When I get back to the studio tomorrow, I'll leave the monitor in its L* profile, do the gradient test, leave it in my AdobeRGB working space and take a close look.

Quote from: msbc

What would be the recommendation for calibration when the working space is ProPhotoRGB, which is gamma 1.8?

In color space terms, the gamma has to do with how edits get applied to the file with respect to shadows/mids/highlights, so that's not the same application of gamma, based on what I read here: http://www.adobe.com/digitalimag/pdfs/phscs2ip_colspace.pdf , pg. 5. (I'm sure Andrew will jump in here; he wrote the article!)


When displaying a gamma 2.2 encoded image on an L* calibrated display, Photoshop must "translate it behind the scenes" to the L* monitor space - and since it's only 8 bits per channel, such an operation will always introduce some small banding. It's not a big deal, but it illustrates that there's really no benefit in calibrating the monitor to L* when working with gamma 2.2 encoded images - you're only losing a few shades, and images in non-color-managed applications start to look odd.
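The shade loss described here is easy to demonstrate. A minimal sketch (my own illustration; it models the behind-the-scenes conversion as a straight 8-bit re-encode from gamma 2.2 to an L* monitor space):

```python
# Round-trip every 8-bit gamma 2.2 code through an 8-bit L* encoding and
# count the distinct output levels -- some shades inevitably merge.

def gamma22_to_linear(code):
    """8-bit gamma 2.2 code -> linear relative luminance (0-1)."""
    return (code / 255) ** 2.2

def lstar_from_linear(y):
    """CIE lightness L* (0-100) from linear relative luminance y (0-1)."""
    return 116 * y ** (1 / 3) - 16 if y > 216 / 24389 else (24389 / 27) * y

# Re-encode each source code as an 8-bit L* monitor code (L* scaled to 0-255).
out = {round(lstar_from_linear(gamma22_to_linear(c)) / 100 * 255) for c in range(256)}
print(f"{len(out)} distinct levels survive out of 256")
```

Fewer than 256 distinct levels survive the conversion, which is exactly the "losing a few shades" effect; at 10 or more bits of display LUT precision the collisions largely disappear.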


Using your example, then, any image in a color space other than 2.2 will suffer. What about images in ColorMatchRGB or ProPhotoRGB? Adobe certainly isn't telling ACR and LR users to cal/profile their monitors to 1.8 for better viewing, are they?

It is my understanding that the gamma of the image's color space is not the same application of gamma as the one a monitor is calibrated/profiled to. Two entirely separate things, IIRC. My understanding from reading Andrew's numerous tutorials and postings is that the selection of the monitor's gamma has to do with how optimally the monitor will perform. New LCDs were designed to run at approximately 2.2, so cal/profiling at 2.2 will render smoother-looking tonal gradients, regardless of what color space the image is in.

I am aware that on my Mac, calibrating my monitor to a gamma higher than 1.8 will render graphics in non-color-managed apps darker; in fact the entire GUI is a tad darker as a result, but no big deal.

The "gamma" (TRC) of the display and the working space are indeed separate. One's describing the tones applied through the display, the other edits applied through the document.

What Lars is suggesting is that for an L* workflow to be of benefit, we need to encode that way from start to finish, which isn't likely and sometimes not desirable. That we have a disconnect in TRC (Raw is linear, ColorMatch/ProPhoto are 1.8, others 2.2) isn't really an issue. In the case of ColorMatch RGB, its encoding TRC was designed for 1.8 due to the use of a PressView at the time and the color space it hit when calibrated: ColorMatch RGB. ColorMatch uses 1.8 because there is less quantization on the way to CMYK. Since our eyes are closer to 2.2 (luminance response) yet presses have dot gain, a source space that is a little lighter reduces the quantization when you correct for press gain.