The specs for SD-DVD (BT.601) and Blu-ray (BT.709) state that the YCbCr range is 16 to 235.

PS3 games and the XMB use RGB, but I have no idea what the native range is. Is it Limited (16 to 235) or Full (0 to 255)?

My Sony XBR4 has a RGB Limited/Full setting on it as well. Folks in the "PlayStation Area" of AVSForum seem to think that the native output is RGB Limited for Games and the XMB. Others agree with me that the nVIDIA RSX GPU is outputting your typical *computer* range of 0 to 255 by default.


There isn't a native range for either YPbPr or RGB. The two ranges you are describing are generally referred to as "PC levels" and "video levels." The important thing is that the display device is expecting what you are feeding it. TVs normally use "video levels" (16-235).

Without going into a lot of detail, you want to use limited on both the TV and PS3. All DVD/Blu-Ray will use those levels, and they would have to be re-mapped otherwise. I'd be willing to bet that the XMB, games, etc. can use one just as easily as the other because the graphics are generally created on the fly, instead of being stored on disc.
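For anyone wondering what that remapping actually looks like, here's a rough sketch in Python. This is purely illustrative; the consoles' internal math may differ, but the endpoints are what matter:

```python
def pc_to_video(v: int) -> int:
    """Map a full-range 'PC level' code (0-255) into video range (16-235)."""
    return round(16 + v * 219 / 255)

def video_to_pc(v: int) -> int:
    """Map a video-range code (16-235) back out to full range (0-255)."""
    return round((v - 16) * 255 / 219)

# Reference black and white line up in both directions:
# pc_to_video(0) -> 16, pc_to_video(255) -> 235
# video_to_pc(16) -> 0, video_to_pc(235) -> 255
```

The key point: as long as source and display agree on which range is in use, black and white land in the same place either way.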

The whole notion of what levels a given game should be viewed at is entirely down to the creators of the game. If they are not idiots, they will realise that a video console will be connected to a video display, and when creating their imagery they should be appraising the end result on a display calibrated for video levels.

The fact that you get so many games with additional brightness and gamma sliders is fairly indicative of the pretty pathetic record games companies have in maintaining standards. As the majority of games creators have little concern for how their imagery is displayed, the best advice is to set your display wherever you are happy with it.

Remember when King Kong came out for the 360? Lots of people complained about the overly dark image; Ubisoft even claimed the 360 had a problem with its video output. In fact, Ubisoft created the game in a PC-level display environment without any consideration for it ending up on a video console.

Now if most companies have only a glimmer of understanding when it comes to an issue as straightforward as video vs. PC levels, how much consideration do you think they give to colour standards? That's right: zilch.

Yeah...I did call Sony and after being put on hold more times than I could count, I was told that the XMB uses RGB Limited and games use whatever the developer uses.

Good luck finding that out though since no info about RGB levels is made public as far as I know.

Generally speaking, I've found the higher-profile 360 games (Halo 3, Mass Effect) seem to know about video levels. Ubisoft seem to generally screw this up: Far Cry on the consoles is painfully using PC levels. Oblivion looks to be using PC levels too, but at least they have included a brightness slider (which they wouldn't need if they knew what they were doing; when's the last time you saw a DVD or BD/HD DVD with a brightness slider on the menu screen?).

I actually expect games artists to know literally nothing about this sort of stuff these days. I've even had one or two suggest it's not an issue and standards don't matter: they soon get to grips with it when I throw their CG back at them a few times because they can't tell me what colourspace it is. I'm not going to waste my time figuring out what they intended it to look like when they can't even tell themselves.


I used to think that games were made to take into account that most gamers have no idea about a calibrated set... so the game was made "darker" than it really should be to compensate for things like "vivid" modes and so forth.

When I really started looking, I found the same thing as you explain here. It looks like it is just all over the place.

What if we ever get x.v.Color (aka xvYCC) sources for YCbCr? Will that require 0 to 255?

You don't think that some games may look better using the full rendered 0-to-255 RGB range?

It would almost seem to me that the best route would be to toggle on RGB Full for games and RGB Limited for movies.

xvYCC takes advantage of 0-255 by using the values that are normally outside "video" range to expand the gamut, as far as I can understand it. When xvYCC games/apps become available, I would expect that the change to 0-255 would be handled intrinsically since xvYCC seems to require it. All of that is speculation, I haven't really read up on it that much.

Regarding games rendering in 0-255. I doubt you would see that much difference, but I haven't really tested it. If you want to try it and report back, then set your PS3 to Full and the A3000 HDMI to full. You should be able to keep the same calibration as long as the PS3 and A3000 are set to the same thing. I doubt you are going to see any appreciable difference if both are set correctly, it will just map one set of levels to the other.


I thought xvYCC was a YCbCr feature (aka video content)... not an RGB feature (games). The only sources that I know of currently are certain Sony camcorders. Now Deep Color is an HDMI 1.3 feature that will work in YCbCr and RGB. I have heard that the PS3's Folding@Home uses RGB Deep Color, but I have no proof.

And yes, I have tried RGB FULL on both my XBR4 and the PS3. Looks exactly the same as RGB LIMITED on both the XBR4 and PS3 to me. (no need to re-calibrate)


xvYCC and so forth wouldn't necessarily be restricted to RGB or YCbCr. They are just different ways of representing the same signal, in simple terms.

Quote:

And yes, I have tried RGB FULL on both my XBR4 and the PS3. Looks exactly the same as RGB LIMITED on both the XBR4 and PS3 to me. (no need to re-calibrate)

I think there is an Auto setting on most Sonys for HDMI range. If Auto detects the range correctly, then you should see pretty much the same image.

Try telling the PS3 to output limited and force your XBR4 to full, and vice versa. You should see a change then, most noticeably in shadows and dark areas.

I am thinking the RGB AUTO setting on the XBR4 is not working...or the PS3 is not truly sending a 0-255 signal in the XMB when setting RGB to FULL on the PS3.

I found the same thing with my PS3 and A3000. I honestly don't know how it is expected to work, unless there is something in the HDMI protocol that allows metadata about the signal range to be sent. I would imagine it would be pretty hard to guess at it from the data being passed.

That is another reason I suggest Limited. Full is just asking for problems, with no real benefit. Most people think they are getting some extra deep blacks or something, but they just misunderstand what is going on.

I was under the impression that my picture was dramatically better when I switched to Full, until I started researching just what exactly these settings do. My current TV does not support Full RGB; however, when I initially set it to Full, the picture did seem to have deeper blacks and more contrast. Upon further tweaking and testing, I found that it was just crushing the blacks on my screen, and all detail was lost in darker areas. This was especially annoying when gaming.

I have recently tweaked my set with Limited selected, and I now have a picture that rivals what I thought I was initially getting with Full: nice contrast and decent blacks, without the loss in detail. So in my opinion, if your TV doesn't support Full RGB, use Limited and calibrate your set. Since your TV does, I cannot help in the matter, but hopefully this will help those who may read this topic with TVs that don't support Full RGB.


I did some quick measurements with HCFR and an i1 pro to see exactly what was going on between the different modes on the PS3 and 360 (both HDMI to my 70 XBR2 using AVS HD discs).

The only control I altered was the brightness control. Both systems are calibrated for standard/limited, and I mainly wanted to get an idea of what an unknowledgeable person would be looking at if they didn't calibrate after changing to expanded/full RGB. You can see why many people would think expanded/full RGB looks "better", since the low-end gamma is so dark. It would make everything look richer. The expanded settings on the 360 require the contrast control to be set much lower than usual to avoid the massive run out of red that occurred. This effect was not nearly as pronounced with the PS3.

.chc files attached.

Nice work. Someone should write up a detailed explanation of the video related settings for both consoles and include information like this. If that was made a sticky, then we wouldn't have to answer the same questions about this stuff a billion times.

The PS3 graphs have the contrast set conservatively, while the 360 graphs have the contrast set high like many people do. If you like to keep your contrast set high, then pay attention to the 360 graphs, because you will see that expanded/full RGB has a dramatic effect on gamma and color temperature, especially at the high end. The PS3 graphs would have looked very similar had contrast been set higher.

The expanded/full RGB graphs with crushed blacks are what most people are seeing if they don't use test patterns to set their brightness control. As you can see, it causes a darker gamma, which is most pronounced at the low end. This will have the appearance of giving richer colors and a deeper picture. It comes at the expense of losing detail around black (severe black crush). This deeper and richer picture is why many people declare expanded/full RGB to look "better". However, if you actually measure the TV with test patterns, it's very obviously an incorrect picture.

If you use expanded/full RGB with the correct black level setting, but do not lower your contrast, you will get a very blown out and discolored high end. It ranges anywhere from slightly blown out at the high end (see the PS3 graph) all the way to extremely discolored with a totally broken gamma at the high end (see the 360 chart).

Limited/standard with black levels set correctly makes for a very easy time setting your brightness and contrast control because it's not using the extreme ends of the video levels. You can get a reasonable picture very easily just by using your eyes to set the contrast and brightness controls. With expanded/full RGB you have to use test patterns to set the levels correctly or you will end up with a very incorrect image (though many might find it pleasing and actually prefer it to a "correct" image).

One thing to keep in mind is that if you are using a plasma or CRT-based display, there is a good chance that you may be badly overdriving the phosphors using expanded/full RGB if your brightness is set correctly and you do not have your contrast setting fairly low. Where a digital set like mine simply runs out of a color (red in most cases) when overdriven, a phosphor-based display will just try to display it by going brighter and brighter, which may lead to a shorter phosphor life and possibly uneven phosphor wear (burn-in). If I had a plasma or CRT and wanted to use expanded/full RGB, I'd be very sure to use test patterns to set my contrast/brightness controls correctly and be very conservative on how high my contrast was set. Video games have a lot of static images and could be a potential threat for burn-in.

If you use expanded/full RGB and calibrate it correctly, it will result in a picture that is near identical to limited/standard, because in the end the picture can only be as good as your TV is capable of. Both black and white will be mapped to the same points. The only advantage that expanded/full RGB could give you is having more shades of color for RGB sources, which may or may not actually be noticeable.

What about the effect of Super White? I believe that without Super White, BTB and WTW will not show up in calibration discs? What is the best setting for Super White if we are going to calibrate the sets?

In terms of calibration, it helps to give you some reference when setting black level and white level, but it isn't required to set either one.

In terms of processing (this is the important one), it prevents hard clipping at 16 and 235. You never want a clipped waveform as you are processing it.

In terms of PQ, BTB doesn't really offer anything because you shouldn't be able to see it anyway if your brightness is calibrated correctly. However, it has been found that commercial DVDs often times use the WTW region for highlights, so there is actual video material in the WTW region and it is best to calibrate and show that material when possible.
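A toy sketch of the hard-clipping point (illustrative Python only, not how any player actually implements it): with Super White off, everything above 235 collapses to 235 and the WTW highlight gradation is destroyed.

```python
def hard_clip(v: int) -> int:
    """Clamp a code value into the nominal 16-235 video range."""
    return max(16, min(235, v))

# A hypothetical WTW highlight ramp on a commercial disc:
highlights = [236, 240, 245, 250]

# With Super White off (hard clip), every step collapses to 235:
print([hard_clip(v) for v in highlights])  # [235, 235, 235, 235]

# With Super White on, these values pass through untouched and the
# display can show the extra highlight detail.
```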

Games and the XMB are rendered in RGB. I believe it is natively in the 0-255 PC range (which you would expect, as it is an nVidia graphics card in there). People have also reported seeing additional posterisation when sending Video levels, as there are fewer steps of gradation available.

The levels are remapped automatically whether you are in Full or Limited RGB. By that, I mean that whether it is in limited or full RGB, you will always see information below/above 16/235 in the photo viewer etc.

BD/DVD is natively YCC, 16-235. If you set BD/DVD output to YCC rather than RGB or Auto, the Full/Limited RGB setting will have no effect whatsoever.

Super White enables/disables the display of BTB/WTW information with BD/DVD playback in YCC. If you play back BD/DVD in RGB, the PS3 will always clip it.

Many displays do not support Full RGB, or do not work with it correctly when in Auto mode. With some displays (Pioneers for example) forcing them into PC levels will stop them auto-switching into YCC for BD/DVD meaning that you have to either use RGB for everything (prioritising game image quality) or use limited RGB for games and YCC for BD/DVD. (prioritising BD/DVD IQ)

To test whether your display supports Full RGB or properly switches into it automatically, first set the system to Limited RGB and bring up a black-level test pattern with near-black bars numbered 1, 2, 3, 4.

Adjust the brightness control on your display notch-by-notch until the 1 bar is just visible; one notch lower and it would disappear.

Now, switch your PS3 into Full RGB mode. Is 1,2,3,4 still visible or are you now looking at a black screen?

If 1,2,3,4 is visible, your display properly supports auto-switching into PC levels. If not, see if you have an option in the menus to switch between PC/Video levels (often labelled something like 'black level high/low'). Switching to PC levels should restore the 1,2,3,4 on your display.

The 1 or possibly even 2 may disappear so you might need to adjust brightness a couple of notches to get it to display properly again.

If it all disappears and you have to make significant changes to brightness (e.g. 10, 15, 20 notches, assuming a scale of 1-100) then it is not working correctly.

If you can't see 1,2,3,4 at all after switching the PS3 into Full RGB and don't have a black level option on your screen, it does not support it and you should set it back to Limited RGB.

Personally, I feel that it is a bit of a misnomer calling it Full or Limited RGB; people are more likely to turn on Full RGB because they think they'll be getting a better image. It should really just be referred to as PC or Video levels.

Sperron, those results look like your display either renders PC/Video levels drastically differently from each other (which would be quite unexpected) or, more likely, it does not support PC levels.

That is what I would expect to see from a display that does not support Full RGB, but doesn't have a hard clip at 16/235 allowing the information to be brought out with large changes in the brightness/contrast controls.

I'd say that my TV's behavior is representative of most HDTVs in the marketplace (at this time). As far as I have read around here, only some Samsung owners have reported observing autoswitching levels in their sets (noticed by the fact that their YCbCr calibration makes limited RGB look "washed out"). My TV does not support autoswitching between the two ranges, nor does it have a full RGB mode as some of the newer XBR LCDs have. If I use test patterns to set my levels along with HCFR and a probe, my TV supports PC levels just fine (just not automatically). If the manufacturers make a concerted push towards autoswitching between PC and video levels, then the info I posted would start to become obsolete.

Using expanded results in a sizable increase in light output over what you would normally get using the same exact contrast settings with standard (this is part of why people think expanded looks "better"). Using expanded is like taking your picture/contrast control and cranking it up. The 360 footlambert results would have been much higher, but expanded actually maxed out red, green and blue on my set, which limited the light output.

As I have said above, it looks like your display does not handle PC levels properly, hence the increased light output and terrible gamma response. Your results are exactly what I would expect to see from a screen that only accepts video levels but is being sent PC levels.

If your display handles PC and Video levels correctly, the above two situations should look virtually identical. The only difference that you may or may not see, is improved gradations when sending PC levels.

PS3 outputting Video levels / display expecting PC levels: washed-out picture. Lowering brightness and increasing contrast may allow you to set black/white correctly; however, you will almost certainly not have the proper gamma response.

PS3 outputting PC levels / display expecting Video levels: crushed shadows/highlights and an apparent increase in contrast. Depending on how the display handles this, you may see an increase in light output, and information below 16/above 235 may or may not be hard clipped. By that, I mean that you may be able to bring this detail back by increasing brightness/decreasing contrast, but this typically results in brightness compression and a skewed gamma response.
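The second mismatch can be sketched numerically. Assume an idealised display calibrated for video levels, where code 16 is black and 235 is white (a simplification; real displays behave less cleanly):

```python
def video_display_output(code: int) -> float:
    """Light output (0.0 = black, 1.0 = white) of an idealised display
    calibrated for video levels: 16 maps to black, 235 to white."""
    return (max(16, min(235, code)) - 16) / 219

# Feed it a PC-level (0-255) source:
shadow_detail = [0, 5, 12, 16]
print([video_display_output(c) for c in shadow_detail])
# [0.0, 0.0, 0.0, 0.0] -- four distinct shadow shades crushed to one black

highlight_detail = [235, 240, 250, 255]
print([video_display_output(c) for c in highlight_detail])
# [1.0, 1.0, 1.0, 1.0] -- four distinct highlight shades blown to one white
```

This is why the picture appears higher-contrast: the in-between codes get stretched over the full light range while the extremes are simply lost.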

It does not matter whether you are using PC or Video levels; what matters is that you are using the correct levels for your display. Just because you may be able to send the wrong levels and then 'fix' it with the brightness/contrast controls does not mean you should.

Typically, if your display does not offer an option in the menu to change between PC/Video levels, it is only expecting video levels. (unless it is a monitor where the opposite is true)

Quote:

Originally Posted by andrewfee

As I have said above, it looks like your display does not handle PC levels properly, hence the increased light output and terrible gamma response. Your results are exactly what I would expect to see from a screen that only accepts video levels but is being sent PC levels.

This is definitely incorrect. Calibration is about tailoring your display to the incoming signal. Just because my video level calibration is incorrect for PC levels doesn't mean that my "display does not handle PC levels properly". There are so many aspects of a TV that must be "fixed" that it's silly to even make such a statement (color decoder, color temperature, gamma, etc...).

This isn't even taking into account that all commercially released "video level" content on DVD, HD DVD and Blu-ray uses levels 16-254 for picture information. Alluringreality's tests linked here. Whiter than white is extensively used by "video level" material. A strict calibration for "video levels" leaves you with discolored highlights above 100% white.

Quote:

Originally Posted by andrewfee

It does not matter whether you are using PC or Video levels, what matters is that you are using the correct levels for your display. Just because you may be able to send the wrong levels and then ‘fix’ it with the brightness/contrast control does not mean you should.

This is like saying that if, taking your TV out of the box, the default brightness and contrast settings aren't correct for either video or PC levels, you are "fixing" it by adjusting those controls (which you seem to cast doubt upon the wisdom of even correctly adjusting). By your definition, an HDTV supports neither video nor PC levels and we just kludge a "fix" to get them working correctly. The real situation is that unless you have one of those rare TVs that auto-adjusts for different levels, it's merely a calibration issue.

Also if your TV does autosense PC levels, does your comment mean that you shouldn't calibrate the PC levels by using your "brightness/contrast control" as you imply is unwise?

Sorry, I don't think you're understanding the difference between PC/Video levels. It is not simply a brightness/contrast calibration issue.

If your display wants video levels, that's what you send it. If it wants PC levels, then that's what you send it. If you have the option on the display (e.g. a PC/Video levels toggle in the menus), you generally want to go with PC levels, at least from the games consoles, but not always.

With XMB/Game content, the PS3 (and 360 for that matter) will not clip shadow/highlight details when outputting Video levels; they simply remap the rendered 0-255 range to 16-235. What this means is that you will see the same information in either; you just lose some brightness steps in the middle.
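The "lose some brightness steps" point can be sketched numerically. Assuming a simple round-to-nearest remap (the consoles' actual implementation may differ), nothing is clipped, but 256 rendered shades collapse into 220 output codes, which is where the extra posterisation comes from:

```python
# Remap the full 0-255 render range into 16-235, as when the console
# outputs video levels:
remapped = sorted({round(16 + v * 219 / 255) for v in range(256)})

print(len(remapped))               # 220 distinct steps instead of 256
print(remapped[0], remapped[-1])   # 16 235 -- black and white survive intact
```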

You will not gain anything by sending PC levels to a display expecting Video levels and 'fixing' it with the brightness/contrast controls.

You have already posted data which shows exactly why you should not attempt to force a display into 'working' with PC levels if it expects Video levels. (comparison screenshots not shown)

As for BD/DVD content, which is a separate issue, you must be running in YCC with Super White on to view BTB/WTW information. Viewing in RGB automatically clips it whether you are using PC or Video levels so this is irrelevant.

Quote:

Originally Posted by andrewfee

Sorry, I don't think you're understanding the difference between PC/Video levels. It is not simply a brightness/contrast calibration issue.

I fully understand the difference.

Quote:

Originally Posted by andrewfee

You will not gain anything by sending a display expecting Video levels PC levels and ‘fixing’ it with the brightness/contrast controls.

You will gain exactly what you'd gain sending it to a TV expecting PC levels. You gain the "additional brightness steps in the middle". All the brightness and contrast controls do is map "black" and "white" in the source signal to the "black" and "white" of your display. All a TV that autosenses PC levels does is offset the current contrast and brightness settings by a fixed amount. I'm not sure what you think it's doing.

Quote:

Originally Posted by andrewfee

You have already posted data which shows exactly why you should not attempt to force a display into ‘working’ with PC levels if it expects Video levels:

I can calibrate for PC levels and the resulting gamma graph will be identical to the video level graph. All I have shown is that if you don't properly calibrate for the source signal, you get inaccurate results.

Games are rendered natively at PC levels on PS3, Xbox 360 and PC, and get converted to whatever the console is outputting. If games are a primary concern, then the console is best set to output RGB at PC Levels.

If you have the console set to output video levels, then when you play a video game the range is compressed so 0 becomes 16 and 255 becomes 235, and the levels in between are remapped. There is nothing <16 or >235, so those levels are wasted.

With the right combination of settings, the PS3 can output games at RGB PC Levels and Video at YCbCr Video Levels.