
Yeah, they are both 1920x1080 pixels with 32-bit color precision, so there is no inherent difference in quality. Digital is less susceptible to signal loss or interference, but a solid analog connection can look just as good. Of course, it also depends on how well the particular display handles each input; some displays do better with one than the other.

In most cases you'll see no discernible difference, but you should be aware that some HDTVs limit the display resolution when using a VGA input. My brother's LG, for example, would limit itself to 1024x768.


My understanding is that it will come down to the hardware used. CNET was disappointed when hooking up a PC to the 32" Samsung A450 via HDMI. They stated:

Quote:

With analog sources delivered via the VGA input the Samsung performed very well, resolving every detail of the 1,360x768 signal according to DisplayMate and delivering crisp, sharp text and other on-screen details. Surprisingly, HDMI tested far worse. The set didn't handle all of the detail of the source, and edges looked blockier and softer as a result.

and on a larger Samsung display they noticed no difference between VGA and HDMI.

I'm using DVI/HDMI on mine. I thought about using VGA but prefer a digital feed over an analog one. It seems that VGA is being phased out on newer video cards. You can use converters, but why? Ultimately it depends on the quality of the picture too. I haven't tried the VGA on mine, so I don't know.

I know VGA will outperform HDMI if the display does not do 1:1 pixel mapping; otherwise the TV scales the image to fill the 16:9 panel, which introduces artifacts and produces blurred characters and the like.

With VGA the signal must be converted twice: digital to analog, then back to digital again. That conversion can be of good or poor quality.

I'm not familiar with all the digital tricks used by HDTVs, but it may be that once the digital signal is in the TV, it attempts to manipulate the image in software.

An analogy is using DVI on an LCD and finding that VGA looks better.
People often run an LCD at a non-native resolution. Even though the LCD is digital, it uses scaling algorithms to display images run at non-native resolutions, and this introduces lag, artifacts, etc. This is why some people will run the display in 1:1 mode: the display stays at its native resolution (where it's fastest), and they accept black bars around the image (rough numbers in the sketch below).
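Just to put some numbers on the 1:1 idea, here's my own quick sketch in Python (not from anyone's post; the resolutions are only example values):

Code:

def one_to_one_bars(native_w, native_h, source_w, source_h):
    # Thickness of the black bars on each side when the source is
    # centered on the panel unscaled (1:1 pixel mapping).
    if source_w > native_w or source_h > native_h:
        raise ValueError("source is larger than the panel; 1:1 mapping impossible")
    return (native_w - source_w) // 2, (native_h - source_h) // 2

# e.g. a 1280x720 desktop shown 1:1 on a 1920x1080 panel
lr, tb = one_to_one_bars(1920, 1080, 1280, 720)
print(f"{lr}px bars left/right, {tb}px bars top/bottom")  # 320px and 180px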

VGA can look smoother than DVI, again just talking about computer LCDs here. Single-link DVI tops out around 60 Hz at typical resolutions, while VGA can go higher. Now we have dual-link DVI, which can handle 120 Hz (rough numbers below).
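Here's a back-of-the-envelope sketch of why that is. The ~165 MHz single-link DVI pixel clock limit is the real figure, but the 1.2x blanking overhead is just a rough assumption of mine, not an exact timing standard:

Code:

SINGLE_LINK_DVI_MHZ = 165.0  # single-link TMDS pixel clock ceiling
BLANKING_OVERHEAD = 1.20     # crude allowance for blanking intervals (assumption)

def pixel_clock_mhz(width, height, refresh_hz):
    # Estimated pixel clock a video mode needs, including blanking.
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 120)]:
    clock = pixel_clock_mhz(w, h, hz)
    link = "needs dual-link" if clock > SINGLE_LINK_DVI_MHZ else "single-link is enough"
    print(f"{w}x{h}@{hz}Hz ~ {clock:.0f} MHz -> {link}")
# 1920x1080@60Hz  ~ 149 MHz -> single-link is enough
# 1920x1080@120Hz ~ 299 MHz -> needs dual-link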

Any time you get a worse image using pure digital versus analog, I would suspect some kind of post-processing going on in the TV or LCD.

Quote:

Originally Posted by gunbunnysoulja

On my Sammy HL-S6187w, I very much prefer VGA over HDMI for my HTPC.

I wanted to use HDMI to keep it simple, but now I have to use VGA for video and HDMI for audio.

I compared many times, and VGA was superior every time, particularly with text, with both inputs set to 1:1 and overscan off.

On the other hand, my Panasonic plasma only accepts up to 1366x768 progressive or 1080i through its digital ports, but accepts at least up to 2560x1600 progressive over VGA.

It might accept 2560x1600 over VGA, but your display definitely is not 2560x1600; all it's doing is downscaling your picture. As stated earlier, your best bet is going through the HDMI/DVI ports. I bet your display's native res is 1366x768.
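Quick illustration of how much gets thrown away in that downscale (my own numbers, assuming the panel really is 1366x768 native):

Code:

source_w, source_h = 2560, 1600  # what the card sends over VGA
native_w, native_h = 1366, 768   # assumed native panel resolution

scale = min(native_w / source_w, native_h / source_h)  # aspect-preserving fit
pixels_per_panel_pixel = (source_w * source_h) / (native_w * native_h)
print(f"scale factor ~ {scale:.2f}, about {pixels_per_panel_pixel:.1f} source pixels per panel pixel")
# scale factor ~ 0.48, about 3.9 source pixels per panel pixel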

