No matter what the configuration, I could not get black to look correct, and all the colors were washed out.

I finally found out why. The Nvidia driver detects 1080p COMPUTER MONITORS, capable of RGB 0-255, as "HDTVs" and ONLY uses RGB 16-235 with them!

This means that no matter how well you attempt to calibrate a full RGB capable monitor, your color/black level will be totally screwed up!

I am currently using a registry edit fix, which sucks, because it's absurd that everyone with Titans in Surround etc. has paid all this money and has to manually edit their registry to get the right color! I found out about this, and I am SERIOUSLY pissed off!

There's a post from 2009 with the BS fix I had to use! There are posts all over the internet about this, and AMD has a toggle in their driver control panel specifically to fix it. Nvidia has one that only applies to videos, and DOES NOT correct this.

Setting a custom refresh rate/resolution fixes it, but that gets reset most of the time when you play a game, and causes issues and crashes.

Being the Nvidia fanboy that I am, I am even more pissed off about this image-destroying issue that goes unfixed! Everyone noticed how broken CrossFire was, and that was only known about since 2011. This goes even further back, and it's as simple as them adding a few lines of code to their driver installer!

If you have an HDMI-connected (or DVI to HDMI) Nvidia setup, please try setting a custom resolution of 1920x1080 at 59Hz, make sure the desktop color is RGB (NOT YCbCr 4:4:4, which is compressed!), and watch the black become black. If you calibrated with the wrong black level, your colors will be weird too, since you likely have contrast set too high in an attempt to compensate.
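If you want a quick way to eyeball black crush while running that test, here's a minimal sketch (not from the thread, just a convenience) that writes a grayscale PGM test pattern with near-black bars. On a correctly configured full-range setup, the bars above level 0 should be faintly distinguishable from the level-0 bar; with limited range forced on a full-range monitor, the whole strip tends to look uniformly washed out.

```python
# Minimal sketch: write a grayscale PGM test pattern with near-black
# bars at levels 0, 4, 8, ..., 28. Open it full-screen in an image
# viewer and check whether the bars above 0 are distinguishable.

def write_black_level_pattern(path="black_level_test.pgm",
                              bar_width=120, height=200):
    levels = list(range(0, 32, 4))        # 8 bars: 0, 4, ..., 28
    width = bar_width * len(levels)
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (width, height))  # binary PGM header
        row = bytearray()
        for level in levels:
            row.extend([level] * bar_width)
        f.write(bytes(row) * height)      # repeat the bar row for each line
    return path, width, height
```

The PGM format is used here only because it needs no imaging library; any viewer that opens it without its own color management will do.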

You can do the registry edit BS like I did, but I want a REAL FIX.

With the CrossFire issues being fixed, thanks to the articles on here and PCPer, I hope that this issue can get enough attention for Nvidia to do the simple work required to fix it! It's way easier than making multi-GPU work, which they've done!

I'm running a Hannspree HF199H (1440x900) and a Dell 1905FP (1280x1024, set to portrait mode) from a 650ti boost. The Hannspree's only digital input is HDMI, and it's connected with an HDMI to DVI cable. It can accept a 1080p signal despite not supporting the resolution directly. Windows 7 Pro, 64-bit. Nvidia driver version 331.65.

Colors on the Hannspree are normal at 1440x900, but if I switch to 1920x1080 it does indeed reduce the dynamic range by a small but noticeable amount. Setting a custom resolution of 1920x1080 at 61Hz restores the black and white levels to where they were at 1440x900.

Edit: I just checked in Antichamber, and switching to 1920x1080 in the game also causes this issue.

It does more than reduce your dynamic range. It literally defaults to a different color space entirely, which means you will NEVER get black or accurate color, no matter what the image is showing! Even an OLED monitor will display gray instead of black with this issue.

I have a GTX 560 Ti and I thought that my display was not working properly, since the colors seemed a bit off even after calibration. I will try the fix when I can. Does it make a huge difference? If it does, I might actually give up the idea of upgrading the monitor.

Edit: Why do you guys have only 2 posts each? Is this some kind of attempt of trolling nvidia?

nVidia video drivers FAIL. Disclaimer: All answers and suggestions are provided by an enthusiastic amateur and are therefore without warranty, either explicit or implicit. Basically, you use my suggestions at your own risk.

No, I am actually a huge Nvidia fanboy, which is why I'm pissed. I also have commented on articles, but I've never posted in the forums. I have never had a reason to post in the Tech Report forums, but I read Tech Report and PCPer, along with Tek Syndicate for my tech news.

Look around the internet. Do searches for Nvidia HDMI full RGB, and you'll see that this is a serious issue that's been going on for years.

The reason it's huge is simple. Most monitors that people have are expecting RGB 0-255, which is what you want for desktop use and games. However, Nvidia's drivers default to limited RGB, which is 16-235, when they detect a "TV" resolution connected with an HDMI cable.

Using the wrong color space leads to blacks being gray and colors being washed out. If you do the simple test, by creating a custom resolution with a 59Hz refresh rate, you'll be able to see the difference (assuming your monitor is set up properly for RGB 0-255).
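For anyone wondering what "limited range" actually does to the numbers, this is the standard 8-bit video-level mapping, shown as a small Python sketch:

```python
# Sketch of the standard full-range <-> limited-range ("video level")
# mapping for 8-bit values: full range 0-255 is squeezed into 16-235,
# so black (0) becomes 16 and white (255) becomes 235. A display
# expecting full range then shows 16 as dark gray instead of black.

def full_to_limited(v):
    """Map a full-range value (0-255) to limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map a limited-range value (16-235) back to full range, clipping."""
    return min(255, max(0, round((v - 16) * 255 / 219)))
```

Black (0) lands at 16 and white (255) at 235, which is exactly why a full-range monitor fed a limited-range signal shows black as dark gray and loses contrast at the top end too.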

Basically, if you go into settings and you see your resolution listed under the Ultra HD, HDTV, SD section instead of PC, your color is screwed up!

Intel also has this problem but I don't know how to fix it, and I don't care. This issue probably affects tens or hundreds of thousands of people. How many people have a 1080p monitor hooked up to an Nvidia GPU through HDMI?

Last edited by BlackDove on Fri Nov 15, 2013 8:25 am, edited 3 times in total.

Arclight wrote:Why do you guys have only 2 posts each? Is this some kind of attempt of trolling nvidia?

I've mostly just posted comments on news items so far. If you look at my join date you'll see I clearly didn't register just before posting in this thread.

As for trolling Nvidia, I don't think this is nearly as big a deal as BlackDove seems to. Not using HDMI to connect the monitor, running a different refresh rate or resolution, and editing a .inf file before installing the drivers will all make this problem go away. It does exist though, and there should be an option on the control panel to turn this setting off.

Creating the custom resolution doesn't REALLY work, because you have to change your monitor's ideal resolution, and it will revert back (when launching a game, for example).

The .inf file is a pain in the ass. You literally have to manually edit 46 lines in a file to get the right color on your monitor!

Is this only via HDMI? Or does it affect anything with HDMI in the chain, i.e. DVI or DP to HDMI converter will also trigger this behavior?

I just bought some U2413 monitors and K600 video cards specifically to get them color calibrated for work. I'd hate to think that we get them calibrated and then the users think they are too "small" and go to an HDTV resolution so stuff looks "bigger" and then the color calibration is jacked up. I'm using DP 1.2 for 10 bit per channel color support and went with nvidia since they typically have better support in the Adobe ecosystem.

Can you guys post some high-quality pictures of the monitor before and after the .inf file fix? I'm really interested in knowing if it actually makes a huge difference or not.

imgur.com would be an OK site to upload the images and, of course, it's free.

I have a Dell S2240L (which is now $60 more than I spent on it - wtf?) which is connected to my GTX 460 1GB via HDMI (because it only has HDMI inputs). I assume because they use the same driver it affects Windows 8 and 8.1 as well?

I don't do any color-critical work, and I'm happy with how it looks (compared to the 19" Samsung 1440x900 TN panel I replaced), so I'm not likely to do anything about it but I'm curious if I'm missing out.

I do not understand what I do. For what I want to do, I do not do. But what I hate, I do.

BlackDove: You should be able to run 1080p at 75Hz over HDMI. I keep my monitors at 75Hz and nearly every game I've tried runs at that refresh rate (it's noticeably smoother than 60Hz BTW). The ones that don't usually let you set the refresh rate manually. It may be that games are looking for a refresh rate of at least 60Hz, so they'll change it if you have it set lower, but not if you run a higher refresh rate.

And yes, I know that's beside the point, which is that they should fix this.

Scrotos: Nvidia's fix is to paste "HKR,,SetDefaultFullRGBRangeOnHDMI,%REG_DWORD%,1" in a bunch of places in a .inf file, so it would seem to be an HDMI thing. I'm using a DVI to HDMI cable and I see the issue, so I'd guess anything you plug into an HDMI port on the monitor will trigger this "feature".
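Since that line reportedly has to be pasted into dozens of places, the edit could in principle be scripted. Here's a hypothetical sketch of that idea; the section-name matching is an assumption on my part, so inspect your actual .inf first, and remember that editing the .inf breaks the driver's signature, which has to be dealt with separately:

```python
# Hypothetical sketch: append the SetDefaultFullRGBRangeOnHDMI registry
# line to every [...AddReg] section of a copy of the driver .inf.
# The assumption that every relevant section header contains "AddReg"
# may not hold for your driver version -- verify against the real file.

FIX_LINE = "HKR,,SetDefaultFullRGBRangeOnHDMI,%REG_DWORD%,1\n"

def patch_inf(src_path, dst_path):
    patched = 0
    with open(src_path, "r", encoding="utf-8", errors="ignore") as f:
        lines = f.readlines()
    out = []
    for line in lines:
        out.append(line)
        # Insert the fix right after each AddReg section header.
        stripped = line.strip()
        if stripped.startswith("[") and "AddReg" in stripped:
            out.append(FIX_LINE)
            patched += 1
    with open(dst_path, "w", encoding="utf-8") as f:
        f.writelines(out)
    return patched
```

Run it on a copy of the .inf, diff the output against the original, and only then install; a driver whose .inf no longer matches its catalog will be rejected unless signature enforcement is disabled, as noted later in the thread.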

Arclight: Go into the Nvidia control panel, then "adjust video color settings" then the Advanced tab, and switch between full and limited dynamic range when playing a video. It's the same as that, but for the entire display.

I remember this annoying me back in the day, but I seem to remember fixing it without too much drama.

Nvidia definitely has issues with video overlay colorspace, though. The grey blacks caused by 16-235 instead of 0-255 annoyed me so much that I used to force MPC to render in software just to avoid Nvidia's hardware overlay altogether.

Some people ask me why I have always enclosed my signature in spoiler tags; There is a good reason for that, but I can't elaborate without giving away the plot twist.

steelcity_ballin wrote:My 560TI does HDMI-out and my Monitors can accept it, but why not just use DVI?

My monitor only has HDMI and VGA inputs. Even using a DVI -> HDMI cable, it's apparently treating it like HDMI, because I still get sound through my monitor even over DVI.

That's really odd! I wonder why they'd include an old standard like VGA and a new standard like HDMI, but not DVI. Worse, if you use an adapter for DVI (supposing you had one), it'll just convert the signal from analog, as I understand it, so it'll still look like crap.

BlackDove wrote:The reason it's huge is simple. Most monitors that people have are expecting RGB 0-255, which is what you want for desktop use and games. However, Nvidia's drivers default to limited RGB, which is 16-235, when they detect a "TV" resolution connected with an HDMI cable.

So this is only limited to HDMI? I can't think of many full RGB monitors that ONLY offer HDMI input. Use DVI or DP. Problem solved.

DVI and HDMI can be adapted to each other without any analog conversion. DVI and HDMI use the same signalling for the actual picture; HDMI just adds setup for audio, and its DRM is mandatory instead of optional.

I don't see how this is an Nvidia bug. The colour space is dictated by the EDID that is sent from the monitor via HDMI. Most monitors that are 1080p and have HDMI present TV modelines when hooked up over HDMI, to comply with CEA/EIA standards. A refresh of 59.94Hz is presented by most monitors as a TV mode and utilizes those standards and that colour space. At 60Hz it utilizes the PC modeline and as such gives you your full 0-255 colour gamut.

This, again, is NOT a bug but expected behavior given the EDID information that is sent to the video card.
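For what it's worth, the 60Hz vs 59.94Hz split described above comes from the CEA-861 1080p timing: a 2200x1125 total raster driven at a 148.5MHz pixel clock, with the NTSC-legacy "TV" rate being that clock divided by 1.001. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope sketch of the 60Hz vs 59.94Hz distinction.
# CEA-861's 1080p timing uses a 2200x1125 total raster at a 148.5 MHz
# pixel clock; the NTSC-legacy rate divides that clock by 1.001.
# A driver that matches an EDID mode against this CEA timing (rather
# than a PC modeline) is what flips the output to "TV" rules.

H_TOTAL, V_TOTAL = 2200, 1125      # CEA-861 1080p total raster
PIXEL_CLOCK_HZ = 148_500_000       # 148.5 MHz

def refresh_hz(pixel_clock_hz):
    return pixel_clock_hz / (H_TOTAL * V_TOTAL)

pc_rate = refresh_hz(PIXEL_CLOCK_HZ)            # exactly 60.0
tv_rate = refresh_hz(PIXEL_CLOCK_HZ / 1.001)    # ~59.94
```

Which is why a custom 59Hz or 61Hz mode dodges the detection: it no longer matches the CEA timing exactly, so the driver treats it as a PC modeline.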

OK, so out of morbid curiosity I did the .inf edits. There were 50 of them to make, and then I couldn't install the driver because of signature enforcement. I temporarily disabled signature enforcement in Windows 8.1 Pro and installed. Rebooted normally and the driver still seems to be functioning. Can't tell a difference, though - it looks the same to me. Probably because I have a relatively cheap IPS monitor that doesn't support full RGB.

I switched back to a DVI cable; now how do I find out if the RGB is set to the full range? Do I have to reinstall the driver?

Edit: I feel like I'm taking crazy pills; now the background of every second post in this thread appears to me as light blue-ish. Is it just me? This was not the case half an hour ago when I had HDMI.

I know it's counterintuitive posting a screenshot, but I'm fairly convinced that these two tones are actually used for the forum posts: http://i.imgur.com/6nRF4dF.png

Is there anybody else who didn't (doesn't) notice this while using an HDMI cable with an Nvidia card and a monitor running at 1080p with a 60Hz refresh rate?

Last edited by Arclight on Fri Nov 15, 2013 1:44 pm, edited 2 times in total.

I knew there was a reason ATI cards always seemed to have better colors/picture for gaming and especially video watching... hmmm, no wonder: when I selected the 0-255 SD/HD shader in MPC-HC, the blacks popped and the overall picture looked better on my 55" plasma, for watching videos at least.

I don't know if I'm imagining things, but there seems to be a nice improvement in colors... anybody else?

Arclight wrote:Edit: I feel like i'm taking crazy pills, now the background of every two posts in this thread appears to me as being light blue-ish. Is it just me? This was not the case half an hour ago when i had HDMI.

You mean you never noticed the ever so slight colour change in every second post before?

Yeah, it's been like that for a very long time. On a TN panel you didn't always see it because of the angle you looked at the screen, but on a CRT or my current monitor it's there.

morphine - I figured that was the case. I wonder if the teal I use as the main color in my Windows color scheme is more teal-ish or something (though I suspect it's a placebo), but I dunno. I guess I'll wait for another driver update, install without the mod, and see what happens.

BlackDove wrote:The reason it's huge is simple. Most monitors that people have are expecting RGB 0-255, which is what you want for desktop use and games. However, Nvidia's drivers default to limited RGB, which is 16-235, when they detect a "TV" resolution connected with an HDMI cable.

So this is only limited to HDMI? I can't think of many full RGB monitors that ONLY offer HDMI input. Use DVI or DP. Problem solved.

I'm not positive on this, since I don't have a DisplayPort monitor to test, but I believe that 1080p over DisplayPort is also affected.

I don't have a DVI input, or I wouldn't be so pissed.

And pretty much any monitor with only an HDMI input can display full 0-255 RGB. They may be 6-bit+FRC TN panels and not $1-40,000 IPS panels, but they're still totally capable of RGB 0-255!