First of all, light boxes are not sufficiently uniform to keep RGB values within 1 point across the entire box.
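For concreteness, here's a minimal sketch of what such a uniformity check might look like: sample the bare lightbox at several spots in a photo and compare the per-channel spread against a 1-point tolerance. The sample values below are invented for illustration; a real check would average patches read from an actual frame.

```python
# Hypothetical RGB averages sampled at six spots on a photo of a
# bare lightbox (invented numbers for illustration).
samples = [
    (248, 249, 250), (250, 250, 251), (247, 248, 250),
    (251, 250, 252), (249, 249, 251), (246, 247, 249),
]

def channel_spreads(samples):
    """Max minus min for each of R, G, B across all samples."""
    return [max(s[c] for s in samples) - min(s[c] for s in samples)
            for c in range(3)]

spreads = channel_spreads(samples)
uniform = all(d <= 1 for d in spreads)  # "within 1 RGB point"?
print(spreads, uniform)
```

With these made-up readings the red channel varies by 5 points across the box, so the 1-point test fails, which is the kind of non-uniformity being described.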

Second, a transparency shot on a lightbox is not the same as a reflective target shot under whatever lighting you're trying to profile for. Reflective and transmissive density measurements are made differently, so a measurement of a transmissive target is not directly applicable to reflective subjects, which are what photography is mostly about.

That's totally wrong. 99+% of the time you're capturing light reflected off of trees, water, grass, oxygen molecules in the air (which gives the sky its blue color), clouds, and other objects. Transmissive light only comes into play when shooting backlit translucent objects, which aren't all that common in landscape work. Blue sky is reflected light; it has to be when you see blue sky in front of you and the sun is behind you.

Jonathan, that's factually wrong on the level of "the world is flat": the light from the Sun is not reflected off the sky.

The sky is blue because of an effect named "refraction", not "reflection". And it's not caused by oxygen molecules alone; it's an effect of the entire atmosphere, which is mostly nitrogen.

The article you cite discusses two types of scattering: resonant and non-resonant. Non-resonant scattering is what causes refraction; it's a simple delay phenomenon: photon in, photon out, no change in polarization, a limited change in direction of travel, and a small phase delay. Resonant scattering is quite different: one high-energy photon can be broken into multiple lower-energy photons, the re-emitted photon(s) can travel in any direction, and there is no phase relationship whatsoever between the original photon and the emitted photon(s).

Non-resonant scattering is what causes refraction; resonant scattering is what makes the sky blue, and it has much more in common with reflection than with refraction. It can take light originating behind you, change its direction, and send it into your eye, just as when light from behind you passes over your shoulder, strikes a wedding dress, and reflects back to your eye. In both cases, light from behind you interacts with something in front of you, changes direction, and enters your eye. The standard term for such an interaction is "reflection". It may be caused by resonant scattering, but it's still a form of reflection.
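Whatever label you attach to the mechanism, the scattering that colors skylight is strongly wavelength-dependent: for particles much smaller than the wavelength, scattered intensity goes as 1/λ⁴ (the Rayleigh relation). A back-of-envelope sketch, ignoring the solar spectrum and the eye's response:

```python
# Rayleigh scattering: intensity scales as 1 / wavelength^4, so
# short (blue) wavelengths dominate skylight. Wavelengths in nm.
def scatter_ratio(lam_a_nm, lam_b_nm):
    """How much more strongly lam_a scatters than lam_b."""
    return (lam_b_nm / lam_a_nm) ** 4

# Blue (~450 nm) vs red (~650 nm): blue scatters roughly 4.4x more.
ratio = scatter_ratio(450.0, 650.0)
print(round(ratio, 2))
```

That factor of roughly 4.4 between blue and red is the whole story of the sky's color in one number, whichever word one prefers for the interaction itself.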

BTW, oxygen is better at resonant scattering than nitrogen; if you ever have the opportunity to see liquid oxygen and liquid nitrogen side by side, you'll see that the liquid oxygen has a slight blue tint, while liquid nitrogen appears completely clear. So while oxygen is not the primary ingredient in the atmosphere, it is primarily responsible for blue skies.

All in all, your statements are closer to a "world is flat" magnitude of error than mine.

It seems, from what I've been reading on the internet, that the main problem with profiling your camera is producing a uniformly lit test subject.

It also seems that this would be a fairly easy problem to solve: if the colour checker were transparent, it could simply be placed on a light box. That would give uniform lighting without much hassle, wouldn't it?

If the transparent colour chart were made up of gels printed on your own printer, then calibration of your entire system would come full circle: the colours coming out of your printer would be an exact match for what the camera is seeing.

Maybe I've misunderstood something about the process; they say a little knowledge is a dangerous thing.

I had considered that this difference in light might make a difference, though I don't understand the mechanics of how. I read that the purpose of profiling your camera is to address the camera's imbalanced colour perception and not its luminosity, so I figured that as long as the colours being illuminated were correct, it wouldn't matter whether they were reflected or transmitted.

Unless I'm mistaken, it isn't the lighting but the idiosyncrasies of the camera's colour perception that are addressed when the camera is profiled.

You're mistaken; it's both. The lighting is inseparable from color perception. If you shine a green light on a white surface, it looks green. Unless you can precisely define how green the light is (aka white balance), there's no way to distinguish whether the surface is white or truly green. There's also the issue of determining how light or dark a shade of green or white the surface should be rendered.
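The green-light ambiguity can be put in numbers. This sketch (with invented reflectance and illuminant values) shows that a green light on a white surface and a white light on a green surface produce identical raw data, and that only knowledge of the illuminant lets you divide it back out:

```python
# A camera records reflectance x illuminant per channel, so the
# scene and the light are entangled in the raw data.
# All values are illustrative, normalised 0..1 per channel.
def raw_rgb(reflectance, illuminant):
    return tuple(r * i for r, i in zip(reflectance, illuminant))

white_surface = (1.0, 1.0, 1.0)
green_surface = (0.5, 1.0, 0.5)
green_light   = (0.5, 1.0, 0.5)
white_light   = (1.0, 1.0, 1.0)

a = raw_rgb(white_surface, green_light)  # green light on white surface
b = raw_rgb(green_surface, white_light)  # white light on green surface
print(a == b)  # identical raw data: the camera alone can't tell

# Only if the illuminant is known (white balance) can reflectance
# be recovered by dividing it back out:
recovered = tuple(v / i for v, i in zip(a, green_light))
print(recovered)  # the surface was white after all
```

This is why the lighting can't be separated from colour perception: without pinning down the illuminant, both interpretations fit the same data.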

Quote

Also, for landscape photographers, half their frame will usually be transmissive light. I don't see why it would be a necessity to profile using reflective light.

Hi Jonathan

You seem to be missing exactly what I'm referring to when I mention the idiosyncrasies of a particular camera's colour perception (which have nothing to do with colour balance). Of course, if you shine a green light on a white surface, the camera will record a green surface, but no two cameras will record the same green. This is my point, and the reason why it's not necessary to profile the camera for different light sources: when profiling, you're just ironing out minute discrepancies from the manufacturing process.

Actually, it is necessary. Look up "metamerism" and you'll see that even when accounting for white balance, colors can change when the lighting changes. In the same way that Epson UltraChrome prints undergo slight hue shifts in varying lighting conditions even after white balance is accounted for, the color filters in digital cameras alter their color response slightly under different types of lighting. That's why ACR uses two internal camera profiles and interpolates between them depending on the white balance setting. Camera color filters are designed to minimize metamerism, but like everything else in photography, nothing is perfect. Color perception is tied to the color balance of the lighting; you can't separate them.
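The dual-profile interpolation mentioned above can be sketched roughly as follows, assuming a DNG-style scheme that blends two 3x3 color matrices by inverse correlated color temperature (mired). The profile temperatures and matrix values here are made up; only the blending mechanism is the point.

```python
# Two hypothetical camera color matrices, one calibrated under a
# low-CCT (tungsten-ish) illuminant and one under a high-CCT
# (daylight-ish) illuminant. All numbers are invented.
PROFILE_A_K, PROFILE_B_K = 2850.0, 6500.0
MATRIX_A = [[1.20, -0.15, -0.05], [-0.10, 1.10, 0.00], [0.00, -0.20, 1.20]]
MATRIX_B = [[1.05, -0.05, 0.00], [-0.05, 1.05, 0.00], [0.00, -0.10, 1.10]]

def interpolated_matrix(cct_kelvin):
    """Blend the two profile matrices by mired (1e6 / kelvin)."""
    m = 1e6 / cct_kelvin
    m_a, m_b = 1e6 / PROFILE_A_K, 1e6 / PROFILE_B_K
    t = (m_a - m) / (m_a - m_b)   # 0 at profile A, 1 at profile B
    t = min(1.0, max(0.0, t))     # clamp outside the calibrated range
    return [[(1 - t) * a + t * b for a, b in zip(ra, rb)]
            for ra, rb in zip(MATRIX_A, MATRIX_B)]

print(interpolated_matrix(2850.0) == MATRIX_A)  # True at endpoint
print(interpolated_matrix(6500.0) == MATRIX_B)  # True at endpoint
```

Interpolating in mired rather than kelvin mirrors how perceptually even steps in illuminant color are spaced, which is why that domain is the conventional choice for this kind of blend.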

I really think we're talking about different things. I'm not denying the importance of having the test subject properly colour balanced; I'm suggesting that if two identical cameras with identical colour balance shot the same scene, there would be different results, and it's this difference that profiling attempts to correct. If I thought we were on the same page, I'd be perfectly happy to admit I could be talking out my arse, as I've got no experience with any of this, just what I've read so far.
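To make the "two identical cameras, different results" idea concrete, here is a rough sketch of the kind of correction a profile encodes: a 3x3 matrix mapping one camera's patch readings onto reference values. Real profiling tools use many patches and a least-squares fit; three independent patches allow an exact solve, which is enough to show the shape of it. All numbers are invented.

```python
# Fit a 3x3 correction matrix M so that M @ C = R, where the columns
# of C are one camera's readings of three patches and the columns of
# R are the reference values for those patches.

def mat_inv3(m):
    """Invert a 3x3 matrix via the adjugate (no external libraries)."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [[e * i - f * h, c * h - b * i, b * f - c * e],
           [f * g - d * i, a * i - c * g, c * d - a * f],
           [d * h - e * g, b * g - a * h, a * e - b * d]]
    return [[x / det for x in row] for row in adj]

def mat_mul3(p, q):
    return [[sum(p[r][k] * q[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

# Invented data: this camera's filters leak a little between channels.
C = [[0.90, 0.10, 0.05], [0.05, 0.85, 0.10], [0.02, 0.05, 0.95]]
R = [[1.00, 0.00, 0.00], [0.00, 1.00, 0.00], [0.00, 0.00, 1.00]]

M = mat_mul3(R, mat_inv3(C))       # this camera's correction
corrected = mat_mul3(M, C)         # should reproduce the references
```

A second "identical" camera would yield a slightly different C, and hence a slightly different M: the per-unit manufacturing discrepancy is exactly what the matrix absorbs.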