There is a difference in quality and in the manufacturing of the cables, but the price ranges are really just exaggerated in my opinion: just capitalism at work. Did you hear about the $1000-a-meter HDMI cable? It was in a ZDNet blog once:

The difference, it seems, is the throughput. While a bit is a bit, the rate at which the bits move (the quantity transmitted in a given timeframe) is what matters. The more bits you try to push through a cable, the more likely you are to get interference, transmission problems (data corruption), and so on.

There are different HDMI standards, from HDMI 1.0 to HDMI 1.4, and they vary in how much data you can get through the cable.

But here's the interesting part: even a cable built to the oldest standard (HDMI 1.0) should be able to run a TV at 1920x1200 at 60Hz with no problem. So as long as a cable really is an HDMI cable, it should be OK. All the other stuff is just extra.
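For the curious, here's a back-of-the-envelope check of that claim. The figures are assumptions from memory, not quoted from a spec sheet: HDMI 1.0's 165 MHz maximum TMDS clock (giving roughly 3.96 Gbit/s of video payload across three channels after 8b/10b encoding), and a ~154 MHz pixel clock for 1920x1200 at 60Hz with reduced blanking.

```python
# Rough bandwidth check. Assumed figures (see text above):
#  - HDMI 1.0: 165 MHz TMDS clock, 3 channels, 8 payload bits per
#    channel per clock after 8b/10b encoding.
hdmi_1_0_video_gbps = 165e6 * 3 * 8 / 1e9

#  - 1920x1200 @ 60 Hz with reduced blanking: ~154 MHz pixel clock
#    at 24 bits per pixel.
needed_gbps = 154e6 * 24 / 1e9

print(f"HDMI 1.0 video capacity: {hdmi_1_0_video_gbps:.2f} Gbit/s")
print(f"1920x1200@60 needs:      {needed_gbps:.2f} Gbit/s")
print("fits" if needed_gbps <= hdmi_1_0_video_gbps else "does not fit")
# -> fits, with a little headroom to spare
```

So even under these rough numbers, a plain HDMI 1.0 cable has the bandwidth for that resolution.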

Who has a 3840×2400 (WQUXGA) TV? I don't. But maybe down the road I might, and then I'll buy a higher rated (and more expensive) HDMI cable.

As for payola, I doubt every home entertainment magazine and website has been paid off. I was going to offer some sort of expectancy effect - the reviewer knows that it is more expensive so he expects it to be better - but I don't really think that's it either.

I think a part of it is simply pragmatism. Put simply, these magazines/shows/websites owe their existence to the existence of high-end stuff. If there's no difference between the $1 cable and the $150 cable, or the $200 amp and the $2000 amp, or the $500 television and the $5000 television, etc, why are they even around? In order to justify their presence, there NEEDS to be "something better" for their readers to aspire to, and for them to showcase in their upcoming issue.

Of course, in all fairness, the flip-side of this argument is that there IS some real, genuine crap out there. And when something is bad enough, the quality of the output actually WILL suffer. But it's difficult to know where that line is, and there are plenty of people willing to take advantage of consumer uncertainty.

HDMI is a standard interface, and by that standard anything bearing the logo should provide a quality image and audio. However, there is nothing stopping a manufacturer from using components above the bare-bones standard to reduce losses caused by the resistance of the wire or the plating of the connector.

I agree with what's been said: a cheapie is just fine for short distances. I have no proof, but I can see how, over long distances, no or poor shielding could lower the signal-to-noise ratio enough to cause data loss. But again, no need for an expensive unit, just a better-shielded one.

Thanks! I think salespeople who misinform consumers are jerks. I don't know why no action is taken against the companies that (ostensibly) promote this kind of conduct.

Quote:

Originally Posted by MAK

There is a difference in quality and in the manufacturing of the cables, but the price ranges are really just exaggerated in my opinion: just capitalism at work. Did you hear about the $1000-a-meter HDMI cable? It was in a ZDNet blog once:

That's hilarious - $1000?! When you refer to quality and manufacturing, are you referring to sturdiness (the cable will last longer) or actual picture and sound quality?

Quote:

Originally Posted by MpG

I think a part of it is simply pragmatism. Put simply, these magazines/shows/websites owe their existence to the existence of high-end stuff. If there's no difference between the $1 cable and the $150 cable, or the $200 amp and the $2000 amp, or the $500 television and the $5000 television, etc, why are they even around? In order to justify their presence, there NEEDS to be "something better" for their readers to aspire to, and for them to showcase in their upcoming issue.

That's a good answer. I never thought of it like that. Do you think these reviewers are consciously, wilfully doing this?

In regards to interference, shouldn't that be a non-issue for digital information? I'm no expert in this field, so please correct me if I'm wrong.

I'd also like to make the distinction between a perceivable difference and a theoretical (for lack of a better word) difference. For almost all products there comes a point where the difference is only theoretical and not perceivable - system memory, for example. Many hardware reviews of super-fast RAM note that you probably won't notice any gains, but benchmarking shows there is in fact a difference. That's fine.

What gets me, and this is probably why I'm dwelling on this issue so much, is that logically there should be absolutely no theoretical difference between HDMI cables, and yet there is such a wide range in pricing, and so many reviewers willing to state the contrary. I can't even think of an analogy for it - this is the only case I can think of where there is no qualitative difference amongst products.

__________________

"CFL receiver Sylvain Girard announced his retirement today. His party will be held at The Keg, right after he and some other players finish their shifts there." - Air Farce

In regards to interference, shouldn't that be a non-issue for digital information? I'm no expert in this field, so please correct me if I'm wrong.

Actually, it does matter, but it takes a lot more degradation and a much worse signal-to-noise ratio to kill a digital signal than an analog one. So digital signals can suffer from interference, but because of their nature they handle it better.

A digital signal is just a stream of bits, ones and zeros. Think of it as a square waveform, or, even more crudely, map the bits to voltages: say 0 volts is the zero bit and 5 volts is the one bit.

Now suppose you send the code (the bits) down a wire. You get 5V and 0V on the line. Now imagine some outside interference in the wire. The interference (plus wire resistance, etc.) causes the voltage to drop and shift, so that instead of a series of five and zero volts at the other end of the cable, you get, say, 3.5V and 0.5V.

That is still OK if you think about it: the higher voltage is obviously the "1" bit and the lower is the "0" bit, so the signal is still understood. But if the degradation continues, eventually you can't tell the "1" bits from the "0" bits... the signal is lost.
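The threshold idea above can be sketched in a few lines of Python. This is a toy model, not real HDMI/TMDS signaling: the 0.7 attenuation factor, the noise level, and the 1.75V decision threshold are made-up illustrative numbers.

```python
import random

def transmit(bits, noise_amplitude):
    """Toy 0V/5V line: attenuate the levels and add uniform noise."""
    return [(5.0 if b else 0.0) * 0.7
            + random.uniform(-noise_amplitude, noise_amplitude)
            for b in bits]

def decode(voltages, threshold=1.75):
    # The receiver just compares against a threshold between the two
    # degraded levels; the exact value doesn't matter as long as the
    # levels stay clearly apart.
    return [1 if v > threshold else 0 for v in voltages]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
# With mild noise, the degraded ~3.5V/0V levels are still easy to
# tell apart, so the original bits come back exactly.
assert decode(transmit(bits, noise_amplitude=1.0)) == bits
```

The point of the sketch is that the receiver only has to decide "high or low", which is why a fair amount of degradation costs nothing at all.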

This is why digital TV is so incredible. In the old days of over-the-air analog TV, a little interference with the transmission gave you banding and snow; if the signal bounced off a surface and you got a strong direct signal plus a very weak reflected one, you got ghosting (ghost images), and so on - all from just a little interference.

With a digital TV signal coming over the airwaves, the receiver is able to decode a signal and present it clearly even if there's a whopping huge interference (compared to analog).

On one channel, my tuner reported something like 60% signal strength, but the picture was still crystal clear. In other cases, though, the degradation is so great that the signal is lost entirely... so you either get a great picture or no picture at all.