To give a bit more perspective, here are a couple of pics I took yesterday with the setup I described above (although these were taken without a tripod; click the images for higher resolution).

I'm pretty happy with how these turned out, but I'm sure there's room for improvement in getting the best representation of what I'm seeing with my eyes vs. what the camera is capturing (this is real close, btw).

Think about it... you're looking at an image taken with a camera whose white balance and exposure behavior you have no idea about, then you are displaying it on a monitor which may or may not be calibrated or as accurate as the camera or the display you are photographing. Then you are posting it so other people can look at a much lower-resolution copy of that photo on their uncalibrated monitors with whatever video cards they have... and saying "This is what I see!" See how futile this is? Screen shots are fun and can be useful to show obvious problems with a display like banding, posterization, bad polarizers, collapsed light tunnels, etc., but as far as color correctness, sharpness, or anything else dealing with "WOW, IT IS SO GOOD"... forget it!

With respect, I get the point you are making. And I cannot argue with it. I believe that you are not getting mine. Or perhaps you don't agree with the premise, which is perfectly fine and acceptable to me.

The idea of establishing best practices is predicated on the fact that there exists a challenge or set of challenges that cannot be overcome.

The goal then, is to establish a set of fundamentals that, when followed, seek to minimize those challenges.

I'm not trying to propose that it's possible for anyone to convey with a photograph, with a degree of accuracy suitable for replacing the original, what they are able to see with their eyes in a physical medium.

In addition to camera settings, there's also room lighting and camera position (relative to the display). I can't see much usefulness unless you're comparing either different settings or different content on the same display unit. And in that case you don't really need best practices; you just have to make sure to repeat your own conditions.

Of course, the use case that airscapes mentions, showing obvious problems with a display, is a very valid one. For that it's whatever works; if the picture shows the problem well enough for others to recognize, it's good enough.

Adjusting settings according to personal preference is not calibration.

Let's say you take a photo of an apple with 100 consumer digital cameras (or even pro cameras mixed in). You will get 100 different results even if the cameras are PRECISELY positioned in exactly the same location.

There's variable #1.

You then post the photo on AVS and it is viewed by 100 people on 100 different computer displays. Every computer display looks different (for reasons mentioned in another post... the display itself, the video subsystem, etc.).

That's variable #2.

So now you have posted 100 photos of the same apple that all look different to the 100 people viewing the images, so you now have 10,000 variations of that original apple that all look different. Obviously, you can eliminate huge and obvious errors, but you can't get around the fact that nobody is seeing exactly what you see.
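The combinatorics here can be sketched in a few lines. This is purely illustrative: the per-channel shifts standing in for each camera's and each display's color behavior are arbitrary made-up numbers, not real device profiles, just to show how 100 cameras times 100 displays yields 10,000 camera/display renderings of the same apple.

```python
# Illustrative sketch only: model each camera and each display as a small,
# device-specific color shift applied to the apple's "true" RGB color.
# The shift formulas are arbitrary placeholders, not real device data.

true_apple = (180, 30, 40)  # the apple's "real" RGB color

# 100 hypothetical cameras, each with its own white-balance/exposure shift
cameras = [lambda c, k=k: tuple(min(255, ch + k % 7 - 3) for ch in c)
           for k in range(100)]

# 100 hypothetical displays, each rendering colors slightly differently
displays = [lambda c, k=k: tuple(max(0, ch - k % 5 + 2) for ch in c)
            for k in range(100)]

# Every photo viewed on every display: 100 x 100 = 10,000 renderings
renderings = {(i, j): disp(cam(true_apple))
              for i, cam in enumerate(cameras)
              for j, disp in enumerate(displays)}

print(len(renderings))  # 10000 camera/display combinations
```

Even in this toy model, no viewer's rendering matches the original color exactly, which is the point: the errors compound at each stage rather than cancel out.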

There is some sort of cult fascination on AVS with people posting screen shots... very useful if there is an obvious problem they are looking for help with. But when the photos are posted to show how great some TV looks, all I can do is shake my head. It's like posting photos of a car and expecting people to know how it feels behind the wheel just from seeing a photo. You CANNOT replicate the "in room" viewing of a TV via a photo, no matter how hard you try. I've seen cases where posted photos were intended to show the difference before and after calibration, and frankly, the "before" photo looked better on a calibrated video monitor. So I couldn't tell from the photos whether the photos were bad or whether the calibration itself was good or bad.

FURTHERMORE... pause on a DVR shows a degraded version of pre-degraded images. It is one of the worst possible sources. Pause on a Blu-ray is FAR better, but still doesn't fully represent what the images really look like while the movie is playing. Frame-to-frame variations in TV and Blu-ray become part of the viewing experience, and typically, images will look sharper over a few seconds of a movie than any single frame will look. For cable or satellite, the difference is even more pronounced because the huge amount of compression in the images creates block and edge artifacts you can't remove. Processing in cameras tends to make the artifacts in TV images, especially the edge artifacts, even more visible in the photo than they are in person.

Screen shots are really an exercise in futility, and the time spent doing it could be better used doing something else.