I am just starting to get my head around my 5D III after a 60D. I have been to a few photography courses and have LR4.1 and PSE10. I shoot raw & L JPEG. My confusion is that some people recommend sRGB while others recommend Adobe RGB. The 5D III manual recommends sRGB, but the LR books are divided. I only shoot for my own enjoyment at this stage, but may join a club later when I get a little better. Any thoughts very much appreciated.

I shoot RAW, so as was just pointed out, I can select whatever colorspace I like after the fact. But I always use sRGB. Years ago I took some test shots and printed them myself in both Adobe RGB and sRGB. The images had lots of green foliage. The human eye is most sensitive to green, and if memory serves, green shows the largest gain in colorspace range when you compare sRGB to Adobe RGB, so if Adobe RGB was going to be an improvement, this would show it. End result: the images looked slightly different, but just barely, and I couldn't say I liked the Adobe RGB versions more. Stick with sRGB. You don't want to start fooling with Adobe RGB unless you have lots of time and money to spend. You don't want to start moving sliders around, adjusting an image, when you can't really see what you're doing, so you'd need a new monitor. You probably don't want to see what those cost, and that's only the beginning of the fun.

It's actually a fallacy that Adobe vs sRGB is only about the greens. People base that on a single 2D slice of the 3D gamuts and all they see is a giant chunk of green added.

Crazy saturated, intense greens are actually somewhat rare to come across in nature, so it's the reds, purples, oranges and yellows where you'd see the most difference between, say, ProPhoto RGB and sRGB on a wide-gamut monitor. Try to make a deep red rose or a deep purple petunia look realistic in sRGB and it just can't be done; the same goes for many flowers. Use ProPhoto RGB and a wide-gamut monitor and suddenly they look vastly more like real life. Shoot a sunset and in sRGB some bright, saturated cloud bands disappear, but they pop right back out at you on a wide gamut.

No, it's not. Look at the 3D colorspace map and then look at a CIE chart and understand it. Besides, I never said it was ALL about the green, just that green shows the most improvement, and that any green improvement would be the most noticeable anyway because the human eye is far and away most sensitive to green. It's theorized that it's due to us looking at, and hiding in, foliage from predators since the dawn of man, but that is another topic altogether. This is one of those simple matters that can be solved with five dollars' worth of prints, but nobody wants to do it. Also, could you do me a solid and stop posting 3-4 times in a row?

I'm currently doing a course on colour management for digital photography, and the instructor told us to always shoot Adobe RGB.

As I understand it, the reason is that because the Adobe RGB colour space is larger than sRGB, you will have more colours to work with in post, even if you then convert your final output to sRGB.

Or what curtisnull said while I was obviously typing too slowly...
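If you want to see the "larger space" point as numbers rather than as a gamut diagram, here's a minimal Python sketch using the published linear-RGB-to-XYZ (D65) matrices for the two spaces; everything is in linear values, ignoring gamma. A fully saturated Adobe RGB green comes out with a negative red channel in sRGB, which is exactly what "outside the sRGB gamut" means:

```python
# Minimal sketch: convert a colour from Adobe RGB to sRGB via XYZ.
# The 3x3 matrices are the published linear-RGB -> XYZ (D65)
# conversions for each space; all values here are linear (no gamma).

ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def matvec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def adobe_to_srgb_linear(rgb):
    """Linear Adobe RGB -> linear sRGB, with no gamut clipping."""
    return matvec(XYZ_TO_SRGB, matvec(ADOBE_TO_XYZ, rgb))

pure_adobe_green = [0.0, 1.0, 0.0]
srgb = adobe_to_srgb_linear(pure_adobe_green)
print(srgb)  # red channel comes out around -0.4: outside sRGB
print(any(c < 0 or c > 1 for c in srgb))  # True -> out of gamut
```

Neutral tones round-trip cleanly (feed in [0.5, 0.5, 0.5] and you get [0.5, 0.5, 0.5] back to within rounding), so the difference really is confined to the saturated colours.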

As far as I am concerned, it doesn't matter which colour space you choose in camera while shooting RAW, as RAW has no colour space embedded; it uses the full spectrum of the sensor. You then have to choose what colour space will be used for editing. It does matter while shooting JPGs, of course, though in theory no one edits JPGs in the professional world. Adobe RGB is the larger space and is good for editing, but it also depends what monitor you work with. CMYK is the smallest and is the final print space, so working in Adobe RGB gives you a kind of margin when editing, and with a wide-gamut screen you will be able to see more colour tones in specific areas. I suggest reading "Real World Color Management, 2nd Edition" - imho it's the best and most comprehensive book on the market concerning the colour topic. Highly recommended :)

You forget that much of the additional chunk of the space is hyper-bright, saturated greens, and your average natural scene doesn't have much of that. Shoot some glow-in-the-dark green clothes and crayons, or some deep green emerald-colored minerals, and yeah - but that stuff is not nearly so common as flower, sunset/sunrise, fall foliage, bright red clothing, and evening lighting shots, etc. I've compared tons of images and it's not the green where you see the most difference, by any means.

Yes, we are more sensitive to green in the way you mention, but that is irrelevant to this.

Take some shots of red roses, deep purple petunias and sunsets, view them on a wide-gamut monitor, flip between sRGB and ProPhoto RGB, and tell me you don't see a noticeable difference - one far larger than the change you see in the greens in most shots.

Shoot in Adobe RGB then dumb it down to sRGB yourself if you need to for printing.

Actually, if shooting in Adobe RGB matters, you've already dumbed it down a lot, because that means you're shooting JPG. If you're shooting RAW, color space is irrelevant - you can set it later.

You are right, I set mine to Adobe RGB but use raw, so it really made no difference. I use Lightroom 4, which works in a ProPhoto RGB gamut that is even wider. I can soft proof to my printer/paper profile and bring the colors into gamut as required.
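"Bringing the colors into gamut" can mean anything from a careful rendering intent down to a blunt per-channel clamp. Lightroom's soft proofing goes through ICC profiles and rendering intents; the clamp below is only the crudest possible stand-in, shown just to make the idea concrete, using the published linear-RGB-to-XYZ (D65) matrices and linear values throughout:

```python
# Crude sketch of gamut clipping: a saturated Adobe RGB red converts
# to a linear sRGB red channel well above 1.0, and the bluntest way
# to "bring it into gamut" is to clamp each channel to [0, 1].
# Real soft proofing uses ICC profiles and rendering intents instead.

ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def matvec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def clamp01(v):
    """Per-channel clamp to [0, 1] - the crudest gamut mapping."""
    return [min(1.0, max(0.0, c)) for c in v]

adobe_red = [1.0, 0.0, 0.0]  # fully saturated Adobe RGB red
srgb = matvec(XYZ_TO_SRGB, matvec(ADOBE_TO_XYZ, adobe_red))
print(srgb)           # red channel lands around 1.4: out of sRGB gamut
print(clamp01(srgb))  # what a naive "convert and clip" would deliver
```

A real colour engine would trade some accuracy in nearby in-gamut colours to preserve the gradations in the clipped region, which is why the rendering intent matters when you soft proof.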

You know, this has me thinking (a dangerous pastime, I know...). I've often made the argument that the in-camera JPG settings do matter even if you shoot RAW, indirectly, because the in-camera settings are applied to the JPG preview image that's reviewed on the LCD and used to generate the histograms. So, to the extent that you make exposure decisions based on the preview image, histograms, or blinking highlight alert, those JPG settings matter.

I wonder: what is the gamut of the camera's LCD? Would sRGB vs. Adobe RGB make a difference in color channel saturation, in the histogram, or in the highlight alert calls?

I received my $2750 5D MK III from Adorama yesterday, but haven't bothered changing the gamut setting and likely won't. I've not yet set up custom file naming either. I want to get at least some of my lenses AFMA'd for a shoot coming up Saturday, but time seems hard to find.

I think you just made a really good argument for leaving the camera set to sRGB. It's the smaller color space so if you don't see any clipping in the tiny, questionably precise histogram in sRGB, you damn well won't have any clipping in the image when processing the RAW file.

That is why many set Adobe RGB with lowered contrast and slightly lowered saturation in a neutral profile when shooting RAW, to make the JPG histogram a bit closer to the RAW data while still making the image look somewhat normal and not crazy flat and hard to judge.

Nah it just means you crippled RAW even more than you had to (although if all you ever care about is final sRGB output I suppose not).

I don't see how this would cripple the RAW output. The setting only affects the preview, so at worst you take a picture, look at the sRGB-based histogram and decide the exposure was good, when it might have been possible to push the exposure a little bit more.

Now look at the converse situation. You set the camera to Adobe RGB, take a picture, look at the histogram and see that it is at the very limit of the camera's dynamic range - any more exposure and you would have clipping. Now you import the RAW and, when you output to sRGB, find that there is clipping.

Honestly, the margin we're talking about (if not imagined) must be tiny.
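For what it's worth, that margin can be estimated for a worst-case colour. Ignoring gamma and the camera's actual JPEG rendering, the sketch below scales a saturated Adobe RGB red up (simulated exposure, linear values) until the largest channel in each space reaches 1.0; the sRGB side "clips" roughly half a stop earlier. This is back-of-envelope arithmetic with the published conversion matrices, not a claim about any particular camera's histogram:

```python
import math

# Back-of-envelope: for one worst-case colour (a saturated Adobe RGB
# red), how much sooner would an sRGB histogram report clipping than
# an Adobe RGB one? Headroom = how far exposure can be scaled before
# the largest linear channel hits 1.0 in each space.

ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def matvec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

adobe_red = [1.0, 0.0, 0.0]
srgb_red = matvec(XYZ_TO_SRGB, matvec(ADOBE_TO_XYZ, adobe_red))

adobe_headroom = 1.0 / max(adobe_red)  # clips at exactly 1.0x exposure
srgb_headroom = 1.0 / max(srgb_red)    # clips earlier (max channel ~1.4)
stops_earlier = math.log2(adobe_headroom / srgb_headroom)
print(round(stops_earlier, 2))  # ~0.48 stops, for this colour only
```

For ordinary, less saturated colours the gap shrinks toward zero, which fits the "tiny margin" intuition above for most real scenes.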

I do not notice any difference in the preview for any color setting. I have a Pacific Blue Tang in my saltwater tank, and when I take shots of it in sRGB the color is way off - it looks purplish, not blue. In Adobe RGB it's blue. But the tang still shows up purplish in the preview either way. Clipping is a good question, though.