Previously I have tried connecting my graphics card over HDMI to either my 32" monitor or my 60" Sony TV. Each time, the desktop is projected larger than the physical screen, so the edges get cut off. I have worked around this by dropping the resolution below the TV's native resolution and adjusting game settings to match, but it is always clunky and I end up abandoning it. The 32" monitor's native resolution is 1366x768 (the Sony is 1080i, being an older model, but sitting ~5' away isn't the same experience). The NVIDIA control panel recognizes this resolution, yet I can only frame the desktop properly if I cut it down to something like 1260x720. How can I get it to frame the desktop correctly over an HDMI cable? Note that both my laptop and desktop have this problem; the desktop also has a VGA out, which does not exhibit the issue. My laptop is newer and has a better card, and I would like to use it on a larger display if I could resolve this issue, but it only has an HDMI out.

Now with the 4.1 Graphics changes I am interested in trying this again. I now have a Dell XPS15 laptop with an i5-560 with a 420M graphics card and 8GB-1333 RAM, so I don't think graphics throughput should be an issue, but I still suffer from this HDMI resolution problem.

If it matters, the desktop (3 years old) has some kind of Turion dual-core, 8GB-800 RAM, and a 2650-XT graphics card.

Since you used the word "monitor," it didn't even occur to me that at 32" it was obviously a TV.

My flat screens behave properly when I plug in an analog VGA cable, since the set then knows for sure that it is a computer connection. If I hook my computer up with an HDMI cable, overscan kicks in: the TV seems to think I am watching TV or movies and decides I don't need the edges of the picture, where the action is probably never going to happen. On wide-format media, it helps hide the black bars at the top and bottom, at the expense of signal fidelity. Most videophiles are pretty annoyed that it comes enabled by default on just about every reasonable TV these days.
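As a sanity check, the figures in the question are consistent with a typical overscan crop of a few percent per edge. A quick back-of-the-envelope sketch (the 1366x768 native and ~1260x720 usable numbers come from the question; actual overscan varies by set):

```python
def overscan_per_edge(native, usable):
    """Percent of the picture a TV crops from each edge, given the
    native span and the largest span that still fits on screen."""
    return (native - usable) / native / 2 * 100

# Figures from the question: 1366x768 native, ~1260x720 usable
print(overscan_per_edge(1366, 1260))  # roughly 3.9% per edge horizontally
print(overscan_per_edge(768, 720))    # roughly 3.1% per edge vertically
```

That 3-4% per edge is right in the range TVs traditionally hide, which is why a slightly smaller custom resolution "fixes" the framing at the cost of a non-native, blurrier picture.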

Would that also cause the "fuzzy" text issue? I have a new laptop plugged into my 52" TV (pretty sure it's a rear projection) and can't find a good resolution to use. Movies play fine, but surfing the internet is painful: the resolution is so coarse (like 720p @ 60Hz) that text is blurry, and the only way to read it is to make my browser display at 200%.

720p is pretty low resolution for reading text. My monitor here has ~900 lines, my one at home has 1200, etc. That's a substantial resolution difference, so it's not surprising that text looks less readable. You might see if you can tweak subpixel rendering: go play with the Windows ClearType Tuner PowerToy, as it can change the strategy used for subpixel rendering. (It's like antialiasing that operates on the individual R, G, and B subpixels of your screen.)
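To put a number on why 720p across a big screen reads so poorly, pixel density is the relevant figure. A quick sketch (the 19"/1440x900 comparison monitor is an assumed example for illustration, not a size mentioned in the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a screen of the given pixel dimensions
    and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 720p stretched across a 52" TV vs. an assumed 19" 1440x900 desktop monitor
print(round(ppi(1280, 720, 52)))   # -> 28
print(round(ppi(1440, 900, 19)))   # -> 89
```

At under 30 ppi, each pixel is several millimeters wide, so no amount of tuning will make small text crisp; subpixel rendering only softens the jaggedness.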