Okay, so recently I built a high-end rig. For those who are curious, my original post is here: http://techreport.com/forums/viewtopic.php?f=33&t=84708&p=1143785#p1143785. Anyway, I am doing HDMI out from an HDMI port on my EVGA GeForce GTX 670 FTW to my 1080p LED Viore TV monitor. Now the issue, and I find it hard to describe, so bear with me. Previously I did Mini DisplayPort out from my Macbook Pro to my TV using a Mini DisplayPort to DVI adapter connected to a DVI to HDMI cable connected to the TV (complicated, I know, but it worked for me). My Macbook Pro had Intel HD 4000 built in, and the picture quality was absolutely stunning. I ran Windows off of it to play games and the picture quality was good. Everything looked vibrant. Recently, I built a gaming rig to play high-end games that I own on Steam. I finished the rig and plugged it into my TV.

Installed the drivers, and the first thing I noticed was that my GPU reported a native resolution of 1280 x 720 (720p). I thought it was an odd quirk but I ignored it. After using it for a while, I noticed something. The display bugged my eyes; it had never done that before when I did HDMI out from my Macbook Pro. The video quality was amazing on the Macbook Pro. With the new rig, the 1920 x 1080, 60 Hz, 32-bit setting I had (which was the same when I did HDMI on the Macbook Pro) looked different. The contrast was off (very subtly), text appeared bluer (or off-black) at certain angles, and everything appeared fuzzier. I notice this especially when I am using my browser: the tabs look lower resolution and the entire screen just doesn't look as good. I don't really know how to describe this. It's just an intuitive sense that something is really off. To make sure that my vision wasn't going bad, I hooked my Macbook Pro up to my HDMI 2 port and did a comparison (videos and pictures below). The contrast is absolutely stunning. I tried making a video, but I don't think anyone can really see what I see unless they are actually here. The best examples I can give you are the shots with the Task Manager and the parts involving Firefox. Firefox on my Macbook Pro looked vibrant, whereas the Firefox browser on my rig looked grey and washed out. Check it out.

All in all, I tried to get the best picture for the rig, whereas for the Macbook Pro, I didn't try that hard. Even so, the picture quality on the Macbook Pro was far superior. This is on the same display, just switched between HDMI 1 and HDMI 2 (which is my Macbook Pro). I've searched around the internet and found that my TV should be set to PC mode so that it can detect that a PC is connected. My TV monitor has no such setting, and I assumed I never needed it since it worked fine with my Macbook Pro. However, since connecting my TV monitor to the Macbook Pro and the rig, I've discovered one thing: the Macbook Pro's power state has no effect on the monitor. If the Macbook Pro is off, the monitor is unaffected; if it's turned on while connected, again nothing happens. In fact, it acts just as its name implies, as a TV monitor. However, when the rig is plugged in and I turn it on while the monitor is off, the monitor turns on (the reverse is not true). I conclude, then, that the rig has some control over how the monitor functions, which differentiates it from the Macbook Pro. This is odd.

I should also mention that my Macbook Pro could not do HDMI audio out (possibly because I purchased a cheap Mini DisplayPort to DVI connector). The display looks the same whether I connect it using an HDMI to HDMI or HDMI to DVI cable. I haven't tried VGA yet, but I probably will soon, though I'm not sure my monitor supports it (will update later, either here or in the comments). My TV monitor settings did not change between switches. My NVIDIA settings, especially display and color, are at their defaults. My reported native resolution is 1280 x 720, which I upped to 1920 x 1080 for 1080p. I've never encountered this before. Previously, to get my Macbook Pro to output 1080p, I merely had to close its built-in display so it would switch to the native resolution of the TV monitor. Here, Windows explicitly tells me that 1280 x 720 is my GPU's native resolution. I'm not sure if that has anything to do with it. Another thing I should note: whenever I move from program to program (say, from Steam not maximized to the desktop by minimizing Steam), the contrast changes, albeit very subtly.

I am also including a video for people who do better with motion visuals. The video follows this format: 1) me comparing the taskbars, the rig first and then the Macbook Pro; 2) me comparing the Task Manager, again rig first, at different angles to highlight the weird coloration; 3) me showing the contrast issue on the rig; 4) me changing the background on the rig to match the one on the Macbook Pro and highlighting the contrast issue on the rig versus the Macbook Pro (don't be fooled, the contrast issue is not present on the Macbook Pro despite what the video may show; that's just my shaky camera work and an overhead lamp). http://www.youtube.com/watch?v=8t6uItuQI4M&feature=youtu.be

I don't quite understand your question. The 670 doesn't come with monitor drivers. It has one GPU driver that you download from the EVGA page (http://www.evga.com/support/download/default.aspx). My monitor is an LED 1080p TV that you just plug and play. My television is calibrated independently of the computer. I can do so using its TV remote to adjust contrast, brightness, color, sharpness, etc.

It looks like you're perhaps not getting 1:1 pixel mapping between the desktop and the display. Have a look in the Nvidia control panel settings to see whether any overscan/underscan correction is being applied. You can find this under "Display", then "Adjust desktop size and position". Experiment with whether changing this setting makes things look better or worse. I had a similar issue with my parents' new desktop computer. Using an AOC monitor which only has an HDMI connection, the AMD display drivers were automatically applying a small overscan correction that resulted in fuzzy-looking pixels.

Alternatively, if it's mainly just the text that looks wrong, you might get some improvement by running the ClearType Text Tuner utility (just type "ClearType" into the Start menu search bar and it should be the first thing that appears).

It'll be the TV doing "helpful things" with the inputs. Go to your inputs menu: you can configure inputs in different ways on a lot of televisions. Certainly the Samsung at home, as well as the Toshibas and Panasonics here, all do different things with DVI/HDMI ports depending on how you set them up in the OSD.

Sometimes the setting will be listed by device ("game console/PC/DVD/camera"); other times it'll be more cryptic, such as PC/DVI/HDMI (referring specifically to an HDMI port). Those settings actually do horrible things like edge enhancement, fake contrast boosting, saturation tweaks, pixel smoothing, and noise reduction. To turn all that junk off you normally want PC, but you'll probably just have to play with the settings.

So many TVs lack a clearly-labelled "1:1 pixel mapping with all enhancements off" option. They usually have it, but you have to fart about in sub-menus turning off AMR, sharpness, game-console settings, etc.

It is quite possibly some combination of everything people have said above. If the video output of the card is not set to run at the native resolution of the TV you will get some fuzziness, especially on text. Furthermore, depending on the TV's settings it may be trying to "help" when it sees the non-native input resolution, and could actually be making matters worse.

Whilst you are right that it is often more than just one factor, from the screenshots provided I am fairly confident that it's mapping 1:1.

That just leaves image enhancements which are largely the TV's domain and not the graphics card.

My HDTV is just a cheap off-brand TV I bought last winter. It handles HDMI, Xbox 360, and TV well, but it doesn't have the advanced features to switch to a PC input; it just takes whatever signal comes. My question here is why my Macbook Pro's HDMI 1080p output looks better than my supposedly superior gaming rig with a dedicated graphics card. Being curious, I went into my BIOS and switched on the integrated graphics. I installed the Intel HD drivers and switched to an HDMI port on my motherboard. The visuals were practically indistinguishable. Both my Macbook Pro and my gaming rig have Intel HD 4000, but the display quality was inferior on the latter. Was it because my Macbook Pro uses DisplayPort? Maybe it has something to do with scaling (as in, the Macbook Pro does 1080p automatically since the 720p built-in display was switched off). Is there any way for me to change the native resolution my GPU reports? If I could do that, I could fix the 1:1 pixel scaling issue.

EDIT: Okay, so I looked around the internet and found this article: http://www.howtogeek.com/119117/htg-explains-why-using-your-monitors-native-resolution-is-important/. It makes a lot of sense of why I am having this issue. My only question is why my GPU, a top-of-the-line graphics card, reports such a low native resolution. I doubt it has anything to do with the TV, as my TV lacks enhancement features of any kind. Is it possible for me to change the native resolution so that it scales perfectly to 1080p?

Bad cable is a distinct possibility. I have two cables, one HDMI to HDMI and another HDMI to DVI. The latter I previously connected to a DVI to Mini DisplayPort adapter to hook up my Macbook Pro and do HDMI out to my HDTV. I know for a fact that the HDMI to DVI cable is perfectly fine, since it was the same one I used to connect my Macbook Pro for the screenshots and the comparison with the gaming rig running on the HDMI to HDMI cable. I've used both, and connected both of them to the DVI and HDMI ports on my GPU and motherboard. The visual quality is the same. I can conclude that it has nothing to do with the cables, at least not with the HDMI to DVI one. Here's my desktop running at the "native resolution" of 1280 x 720 (quick note: my LED TV is 1080p, so its native resolution should be 1920 x 1080; I could prove this by hooking my Macbook Pro into it with the built-in display turned off, and it would switch to the TV's native resolution): http://i.imgur.com/F5Q9C.jpg. As you can see, the TV overscans my desktop. This was not the case after I installed Windows but before the drivers. The desktop appeared, well, windowed, with black areas between the desktop and the edge of the display. It only fills the screen after the drivers were installed for the GPU. The same thing happens during bootup at my BIOS screen; the BIOS menu is also windowed. Not sure if it's a BIOS setting.

EDIT: I just consulted my TV's manual. It says that the native resolution is 1920 x 1080: http://www.viore.com/pdf/manual/LED22VF50.pdf. If that's the case, why do Windows and Nvidia report my native resolution as 1280 x 720?

It definitely looks like your TV is applying some post-processing to the image. If the TV's menus don't offer a method to disable these "features", you may have to edit the monitor EDID data to get it to display properly. See this post for more info.

My OS reports my monitor as a Mitac MTC26T42, and after googling that, I came across a post with a similar issue.

As per your post, my circumstances are similar. My TV occasionally accepts audio over HDMI, though that tends to happen after a fresh install of my GPU drivers; I can switch it off using my HD VDeck software. The circumstances described in that post sound eerily close to mine. Basically, I'm having a hard time understanding what I am supposed to do with the Monitor Manager.

EDIT: I've tried following some of the steps that were outlined. However, I don't know what my monitor's GUID is. There are 8 entries in the registry under my monitor's name. How do I know which one is which? I ran Monitor Manager and it tells me that the native resolution is 1280 x 720. Again, that is wrong, because my monitor's native resolution is 1920 x 1080. That is the resolution my Macbook Pro and my Xbox run at when they are connected.
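For anyone else stuck at this step: Windows keeps the EDID it captured for each monitor instance under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY, so one way to tell the 8 entries apart is to dump them all and compare. Here's a minimal sketch in Python, assuming Windows and that standard registry layout; it's only an illustration of where to look, not part of the procedure from the linked post:

```python
# Enumerate the EDID blobs Windows has stored for each monitor instance.
# Run from an elevated prompt if you hit access errors.
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def iter_subkeys(key):
    """Yield the names of all subkeys of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for monitor_id in iter_subkeys(display):
        with winreg.OpenKey(display, monitor_id) as mon:
            for instance in iter_subkeys(mon):
                try:
                    with winreg.OpenKey(mon, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # instance without a stored EDID
                # Byte 126 of the base EDID block is the number of extension
                # blocks; a plain DVI monitor advertises 0 here.
                ext = edid[126] if len(edid) > 126 else "?"
                print(monitor_id, instance, "EDID bytes:", len(edid),
                      "extension blocks:", ext)
```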

EDIT: So I did what Crazybus suggested in his link. It took me a while because the instructions were all over the place. I used Monitor Asset Manager to make an .inf file, used Phoenix to extract my checksum, then went into the .inf file and changed the last two values to 0x00 and 0x9B. I installed it through Device Manager by loading the .inf from disk, and restarted. I can only say one thing: the difference is like night and day. Here's a photo: http://i.imgur.com/eBGnR.jpg. As you can see, the resolution has improved dramatically and the fuzziness has gone away. The only problem is that now my display is too bright, the contrast is too high, and I can no longer change some of the settings for my monitor. All in all, your solution worked. I'm surprised Nvidia hasn't found a fix for this after all this time.
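For reference, the reason the checksum byte has to be patched at all: every 128-byte EDID block must sum to 0 mod 256, so changing any byte (such as the extension count) invalidates the final byte. A small sketch of the rule; the sample block below is fabricated purely for illustration:

```python
def edid_checksum(block127: bytes) -> int:
    """Return the checksum byte for the first 127 bytes of an EDID block,
    chosen so the full 128 bytes sum to 0 mod 256."""
    assert len(block127) == 127
    return (256 - sum(block127)) % 256

# Fabricated example: zero block with only the mandatory 8-byte EDID header.
block = bytearray(127)
block[0:8] = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
print(hex(edid_checksum(bytes(block))))
```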

Yup. Didn't do squat. After trying what Crazybus suggested, I was able to fix my issue (check the previous post). The only difference is that Nvidia now shows my connection as DVI rather than HDMI, even though I'm using an HDMI to HDMI cable. Also, it doesn't offer 1080p as a preset resolution, only 1080i, so I had to make a custom one with progressive scan.

Now that you've fixed the fuzziness, you can right-click the desktop, go into the Nvidia control panel, and lower your brightness, contrast, and gamma to your liking, I would imagine. Also run the ClearType tuner, as I said in your other post. Good luck.

All 3 of my PCs are connected to TVs: a 37" and a 47", both Vizios, and a 55" Panasonic VT30 plasma. The only problem I ever had was that they would not fill the whole screen, so I would go into the ATI Catalyst control panel and adjust the scaling; same for Nvidia.

Mine's not anywhere near as sophisticated or advanced as a Samsung. It's an off-brand (Viore) LED TV I got from Walmart for $150.

But it does have subtle differences between the HDMI ports. It picks which audio input to use based on which HDMI port is in use. When HDMI 1 is being used, it takes audio from the analog "computer audio" jack. With HDMI 2 in use, it takes audio from the SPDIF and/or component audio inputs. That's why I suggested changing ports: who knows what other subtle differences there might be that aren't documented in the sorry excuse for a manual that it has.

Okay, so more bad news. I turned my rig on this morning after powering it down last night, only to discover that the settings had reverted. This occurred because I was connected using the HDMI to HDMI cable instead of the HDMI to DVI. I had switched after installing the .inf file because, when my rig was connected using the HDMI to DVI cable, I could only do 59 Hz; to get 60 Hz at 1080p, I had to create my own custom resolution (weird, I know). I switched to HDMI to HDMI, and after a restart the picture was fuzzy again. Nvidia still reports the connection as DVI when it clearly isn't. I did the original steps over again with no result. I then switched back to the HDMI to DVI cable, redid the steps, and now everything is back to normal. It also reverts every time I disconnect the TV from the rig. Anybody got any idea about this? I would prefer to use my HDMI to HDMI cable, as it is smaller, lighter, and less cumbersome than my HDMI to DVI one.
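As an aside on the 59 Hz vs. 60 Hz oddity: refresh rate is just the pixel clock divided by the total (active plus blanking) pixel count per frame. The standard CEA 1080p timing is 2200 x 1125 total pixels, and there are two standard pixel clocks: 148.5 MHz gives exactly 60 Hz, while the NTSC-friendly 148.5/1.001 MHz clock gives the 59.94 Hz that Windows rounds down and lists as "59 Hz". A quick sketch of the arithmetic:

```python
def refresh_hz(pixel_clock_hz: int, h_total: int, v_total: int) -> float:
    """Refresh rate implied by a video timing: clock / pixels per frame."""
    return pixel_clock_hz / (h_total * v_total)

print(refresh_hz(148_500_000, 2200, 1125))  # 60.0 Hz exactly
print(refresh_hz(148_351_648, 2200, 1125))  # ~59.94 Hz, shown as "59 Hz"
```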

As for the audio ports, my HDMI 1 probably handles HDMI audio and HDMI 2 does SPDIF. I'm not sure, because I send all of my audio to an audio jack hooked up to my Altec speakers.

EDIT: Okay, so I was able to successfully install the .inf file for the HDMI to HDMI connection. Not sure whether it will revert if I disconnect it. Also, my entire display looks greyer now too. Can anyone tell me more about this?

Sorry that the instructions in my previous post weren't terribly clear. The reason the nVidia control panel is reporting the monitor as being connected over DVI is that by editing the monitor's inf file, you're culling the extra HDMI data that would normally be sent to the TV. Some TVs only associate computer input with VGA and DVI and apply unwanted post-processing to all HDMI input signals.
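To make "culling the extra HDMI data" concrete: the HDMI and audio capabilities live in a CEA-861 extension block appended after the 128-byte base EDID, and byte 126 of the base block advertises how many extension blocks follow. A hedged sketch of the idea in Python; this illustrates the principle, not the exact .inf procedure from the earlier post:

```python
def strip_extensions(edid: bytes) -> bytes:
    """Keep only the base EDID block so the display reads as plain DVI."""
    base = bytearray(edid[:128])               # drop any extension blocks
    base[126] = 0                              # declare zero extensions
    base[127] = (256 - sum(base[:127])) % 256  # recompute the checksum
    return bytes(base)
```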

Your display may look greyer because the TV is no longer applying contrast-enhancing post-processing to the image. You may need to fiddle with the TV controls to get the best picture quality. I recommend http://www.lagom.nl/lcd-test/ for basic calibration and testing.

Cool, thanks. What about the settings reverting after a reboot? Every time I install the .inf file, it creates a new entry under Generic PnP Monitor. Is there any way for me to prevent that from happening?

Also, my Assassin's Creed games (Brotherhood and Revelations, the ones I've tried to run so far) don't work anymore. Every time I load them up, I get a white flashing box in the top left corner against a black background. I can't use Task Manager to kill it, because every time I try to access Task Manager, the flashing screen loads in front of it, preventing me from using it at all. My only solution is to log out. This didn't happen before I installed the .inf file. Curiously, AC3, which I got through a redemption code with my purchase of the GTX 670, launches and runs just fine, though it runs independently of Steam.

EDIT: Never mind, my display is not reverting anymore. It does, though, when it comes out of sleep mode or in other specific cases, like when AC Brotherhood fails to load and I have to log out. Being curious, I went into Device Manager and uninstalled the driver for my monitor so it would revert back to a generic PnP monitor. Doing so fixed the launching issue with my Assassin's Creed games, but my display still looks like crap. I can do the same with my HDMI to HDMI and my HDMI to DVI cable and the results are the same. Like you said, my TV may be applying some kind of post-processing to my HDMI signal. So why does it do that with my HDMI to DVI cable as well? (Quick note: I can get audio to run over that cable to my TV, despite what I've read on the internet telling me that DVI can't carry an audio signal.) Why doesn't it do that to my Macbook Pro when I had it connected using a Mini DisplayPort to DVI connector? Is there an alternative method?

Quick update: I was able to get AC Brotherhood and AC Revelations working, but it involved a clever workaround. Whenever I try running the games, they show a white flashing box in the top left corner against a black background; the games won't load, and I can't close them because they load on top of Task Manager. My only recourse is to log out and log back in. I decided to run the games at 640 x 480, and they run. I can go back later and turn that off so my display doesn't default to that low resolution whenever I load the games. In game, I can adjust the resolution to something higher if I want, but here's the kicker: I can't push it past 1920 x 1080 at 30 Hz. I can't get the 60 Hz that I previously had. I have no idea why this is the case. Fixing one issue has broken a ton of others.

IMPORTANT UPDATE: Okay, so after many hours of wrangling with settings, I finally came upon an answer. I did some Google searching into the matter and discovered the root cause of this issue: Nvidia. See, every TV outputs EDID (extended display identification data). Nvidia, in their infinite wisdom, decided to program their GPUs to take that data and "enhance" it, forcing the reported native resolution to standard definition or 720p. That's why my rig reported the TV as having a 1280 x 720 native resolution (for the full text, go here or here). I was finally able to fix the issue by following the instructions of a brilliant user here and using his program to force my GPU to accept the new custom EDID information. I restarted and everything was fixed. Previously, some of my games couldn't run (due to refresh rate issues, etc.); now they all run perfectly.
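For the curious, the native resolution itself is encoded in the EDID the TV sends: the first 18-byte detailed timing descriptor, starting at byte 54 of the base block, is the preferred mode, and a 1080p panel should carry 1920 x 1080 there regardless of what the driver decides to expose. A minimal decoding sketch (it pairs with the registry-dump sketch earlier in the thread):

```python
def preferred_mode(edid: bytes):
    """Decode the preferred mode from the first detailed timing descriptor."""
    dtd = edid[54:54 + 18]
    # Bytes 0-1: pixel clock in 10 kHz units, little-endian; 0 means this
    # descriptor is not a timing descriptor at all.
    pixel_clock_10khz = dtd[0] | (dtd[1] << 8)
    if pixel_clock_10khz == 0:
        return None
    # Active pixel counts are split: low 8 bits in one byte, high 4 bits
    # in the upper nibble of a shared byte.
    h_active = ((dtd[4] >> 4) << 8) | dtd[2]
    v_active = ((dtd[7] >> 4) << 8) | dtd[5]
    return h_active, v_active, pixel_clock_10khz * 10_000  # px, px, Hz

# e.g. feed it the EDID bytes read from the registry in the earlier sketch
```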

For anybody with a similar issue, which is a lot of people since Nvidia refuses to fix or even acknowledge it, I suggest you try the program that ToastyX provided. Don't worry, it's clean. I ran MSE, Spybot, and Malwarebytes on it.

Odd, my GTX 460 was able to go through my Pioneer receiver and then to a Panasonic plasma with no funny business. I get 1080p (almost, need to change overscan mode on the TV) no questions asked.

Odd indeed. It may just be endemic to certain monitors or GPUs. Again, my TV monitor is from Viore. Their website is almost entirely lacking in bells and whistles; you should check it out, it's like going back in time to the days of AltaVista. It could be that your TV is more advanced and has a built-in function that automatically recognizes it is connected to a PC and presents itself as a monitor. Who knows. Either way, this may be useful for people with similar issues in the future.

I believe with all my heart it's that wonky TV that just does not want to play nice with your Nvidia card.

My 560 Tis played nice with all 3 of my Vizios and my Panasonic from day 1.

Hats off to you for getting it to work correctly. Those are the kinds of problems that drive everyone crazy. Thank god for Google search, right? Google and YouTube have fixed so many things in my life and others'. They are a fantastic asset to fall back on; you can find answers to everyday household problems, cars, tools, PCs, pretty much anything you need help with.

Also, Google Chrome is the best browser, and since they own YouTube too, it's really great to be able to log in with your Google ID on any device and have all your bookmarks, favorite channels, and recently watched YouTube videos. I also use Gmail instead of, say, Thunderbird, since Outlook vanished from Windows.

Bit off topic, but yeah, I like Chrome. I prefer Firefox, mostly because of the much broader customization. Outlook is utterly broken: no unified inbox, messy interface, no running in the background. Seriously, why don't they fix this? Thunderbird is nice, but it's missing some things that I really miss from Apple Mail, like cloud integration, superior handling of HTML, cleaner aesthetics, etc. But it's functional. Google (and web search engines in general) are great tools for learning. I know as much about computers as I do now, without blowing thousands of dollars on crappy tech repair services like Geek Squad or paying someone to teach me, because of Google and its power to unify the web.

Now back on topic: it could be my TV. Again, my TV isn't from a mainstream, famous TV manufacturer; I'm sure those monitors receive broader and more comprehensive support. Most of the complaints about this issue that I've run into were about TV monitors that are either really old, made for niche markets by a mainstream manufacturer, or from brands like Westinghouse (if I spelled that correctly). My gripe here is not with the GPU; it has performed splendidly. My gripe is with Nvidia and their horrible software. If ATI/AMD can fix this compatibility issue with a simple driver update, why can't Nvidia? Why are they ignoring this issue and leaving behind the thousands of others who weren't fortunate enough to find those resources? It took me literally 5 minutes to create the custom resolution in the app, and the restart took 30 seconds. The problem was fixed in less time than a commercial break on Sunday football. How can they ignore this?

Glad to hear you got it figured out. I'll have to check out that program. It's strange, because the instructions I gave you should have done the same thing (disable the extra HDMI data), but then again, I've only had to test it on a couple of monitors. I don't think this is an actual nVidia bug, but it would be nice if they gave you an option in the control panel to force a DVI signal rather than an HDMI signal. Most video cards will output an HDMI signal to an HDTV even through a DVI-to-HDMI cable, so switching cables doesn't help any. The big problem is that some TVs don't offer a way to disable over/underscan, sharpening, etc. On my Samsung TV, to get a clear picture you have to both use the HDMI1/DVI input AND change the input name in the TV menu to one of the "PC" options. Still, I'm currently using VGA, since that's the only way I can get the TV to sleep properly.

Definitely check it out. It works like a charm. It doesn't involve any complicated messing around with the registry, since it does that for you, and the solution is cleaner, as it doesn't break compatibility with some games. Your fix was a nice workaround, though my graphics card still reported my monitor as having a lower native resolution, so it might revert whenever I put my computer to sleep. This fix corrects that issue permanently and with minimal fuss. Have you tried using HDMI? Whenever I wake my monitor from sleep, it automatically turns on the display. There are other perks like this too that I don't remember off the top of my head, but it offers more integration with the display.