I've tried this on two different TVs now and the VGA signal is clearly better than the HDMI at the same resolution. Using HDMI gives a very grainy image. It should be noted that I am using a DVI-to-HDMI adapter and an HDMI cable to run to the TV. Could that be the culprit?

HDMI and DVI are signal-compatible, so that adapter is essentially just a passive pass-through; in itself it isn't the issue.

Generally it is highly unlikely that a "bad" cable would give you a grainy picture. It is a serial digital stream, so like any other it is either going to work or not, or perhaps produce blocks and areas where the picture breaks up or tears, just like digital TV when the signal is poor. But a uniform degradation of the picture is not likely to come from the cable: with VGA, sure, but not HDMI/DVI.

I suppose the video chipset could be systematically corrupting a bit or two for a pattern of pixels, which might give that effect, but changing the card should sort that out. A driver problem? Seems unlikely. Is this NVIDIA?

The native resolution suggestion is an interesting one: the TV is likely using very different circuitry to scale a digital image than an analogue one, and digital images are often much more difficult to scale well (cheaply!). It's definitely a good idea to find the real native resolution of the panel. Read the specs very carefully, because they will often state 1080, but the detailed specs may reveal that this just means the set will accept a 1080 frame and scale it down to a native 1280x768 panel. Make sure your xorg configuration matches the panel; nvidia-settings has some useful data on scaling and native sizes.
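As a minimal sketch of matching the panel in xorg: assuming the detailed specs (or the EDID info shown in nvidia-settings) say the panel's true native mode is 1280x768, you could pin the X server to that mode with something like the following in xorg.conf. The identifiers and the 1280x768 mode here are assumptions; substitute the names and resolution from your own setup.

```
Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Monitor    "Monitor0"
    SubSection "Display"
        # Pin the panel's true native mode so the TV's internal scaler
        # stays out of the path. 1280x768 is an assumed example; use
        # your panel's real native resolution from the specs or EDID.
        Modes "1280x768"
    EndSubSection
EndSection
```

With the output running at the panel's native resolution, the TV has nothing to rescale, which removes its (often cheap) digital scaler as a source of grain.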

This may be a ridiculous question, but by any chance is the cable over 50-60 feet?

When an HDMI cable run gets too long, the signal loses strength. This results in picture distortion, "sparklies" (where single pixels drop out of the picture), or no picture at all with only sound. If any of these problems occur, then the cable length is too long and the signal needs to be boosted by additional equipment.