Please do not misquote me. I do not believe in exotic cables such as "Jade audio Vermeil gold speaker cable New Retail $2900.00"

Any speaker wire over $20.00/linear ft (not including ends) is getting into "silly money" territory. I personally have been involved in a double blind test and could not hear the difference between the cables I have and cables costing up to 150% more. My ears are perfectly fine; I can hear better than most people I know, with the exception of a friend of mine who is a concert pianist with perfect pitch.

If you are willing to spend $2900 on a set of cables, like I said go for it; I never will. I would much rather spend that money on better speakers.

By the way have you ever popped the top off some of your electronics and looked inside? Most of the wiring is not even copper (unless you are paying big dollars).

Since this question gets asked like 15 times a day, and I usually end up responding to them, I'll make a general post... Sure would be nice to be stickied, but since that won't happen, at least highlight it and keep the URL so you yourself will have an easy time "replying" to the onslaught of questions...

I originally wrote this as a reply to a post, but thought it made more sense standing on its own... So here goes...

"Question: Is there any difference between a cheap (i.e. $10 HDMI cable) and an expensive (i.e. $150 HDMI cable)???"

I have an EE degree. I work as a broadcast engineer. I live and breathe digital and analog signals every day. So yes, you could say I'm qualified to give the answer to this question...

That answer is, "No, an expensive HDMI cable will make NO difference in the quality of your picture OR sound"

I'll give you the more complex reason first, then an analogy... Hopefully one will make sense... If you don't want all the real technical stuff, just skip down to B for a real simple explanation...

A) Wires send electrical signals... Plain and simple. Anything sent over a wire is ultimately just a voltage/current applied to that cable. Let's say we're talking about an analog video signal that's 1 volt peak to peak... In other words, measuring from the LOWEST voltage to the HIGHEST voltage will give a result of 1 volt... With an analog signal you have "slices" of time that are "lines" of signal... It's too complex to go into here, but basically you have a "front porch" and a "setup" level... These help your TV "lock onto" the signal and set its "black level". After that you've got each line of the image (455 half-cycles of the color subcarrier per line). Again I won't go into how chrominance (color information) and luminance (picture or brightness information) are combined, separated, etc... It's too complex for this discussion, but regardless, just know that following that porch you've got all the lines of the picture (and some that don't show up on the picture... these carry closed captioning, test signals, etc...). All of these "lines" of information look like this when you look at them on a scope...

That waveform is all of that information in analog form... In other words, if you look at one VERY SMALL timeslice of that waveform, the EXACT position of the form (i.e. what voltage is present) represents what information is at that position...

Because of this, it's VERY EASY for other radiated signals to get "mixed in" with that information. When this happens, the more "noise" you get mixed into the signal, the more degraded the picture will be... You'll start to get snow, lines, weird colors, etc... Because "information" is getting into the waveform that doesn't belong there...
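That "the voltage IS the information" point can be sketched in a few lines of Python. The 140-IRE-per-volt scale is the rough NTSC convention (140 IRE spanning the 1 V peak-to-peak composite signal); `decode_brightness` is a made-up name, and the noise figure is purely illustrative:

```python
import random

# Illustrative sketch: in an analog signal, the exact voltage IS the information.
# NTSC maps roughly 140 IRE units onto 1 V peak-to-peak, so 1 IRE ~ 1/140 V.
# Any noise induced on the wire shifts the decoded value by the same amount.

IRE_PER_VOLT = 140.0

def decode_brightness(volts):
    """Treat the instantaneous voltage as a brightness level in IRE units."""
    return volts * IRE_PER_VOLT

random.seed(1)
clean = 0.50                      # a mid-gray sample: 70 IRE
noise = random.gauss(0, 0.02)     # ~20 mV of induced noise

print(decode_brightness(clean))           # 70.0 IRE
print(decode_brightness(clean + noise))   # shifted -- the picture visibly changes
```

Every millivolt of noise becomes a real change in brightness or color, which is exactly why analog pictures degrade gradually into snow and smear.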

With digital however, (i.e. the signal sent over an HDMI cable), the information is encoded differently... At its lowest level, it's nothing but a string of bits... In other words, each signal is either ON or OFF... It doesn't care if a particular timeslice is 4.323 volts or 4.927 volts... It's just ON... See on the right side here, the "square wave" pattern?

That's what a digital signal looks like... For each "slice" of the signal, the "bit" is either on (if the signal is high) or off (if it's low)...

Because of that, even if you mix some noise, or even a LOT of noise into the signal, the bit will STILL be on or off... It doesn't matter...
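The point above can be sketched in a few lines of Python. The voltage levels here are illustrative, not actual HDMI/TMDS levels -- the only thing that matters is that the receiver compares each sample against a threshold:

```python
import random

# Sketch: a digital receiver only asks "high or low?", so moderate noise on
# the wire leaves the recovered bits untouched.
# (Levels are illustrative, not actual HDMI/TMDS signaling levels.)

random.seed(42)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
HIGH, LOW, THRESHOLD = 5.0, 0.0, 2.5

# What goes down the cable: ideal levels plus up to +/-1 V of induced noise.
received = [(HIGH if b else LOW) + random.uniform(-1.0, 1.0) for b in bits]

# The receiver's entire job: compare each sample against the threshold.
recovered = [1 if v > THRESHOLD else 0 for v in received]

print(received)           # messy voltages, nowhere near exactly 5.0 or 0.0
print(recovered == bits)  # True -- the data is bit-for-bit identical
```

The received voltages are all over the place, yet the recovered bits are identical to what was sent -- which is why a $10 cable and a $150 cable deliver the same picture.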

Now, for a slightly easier to understand analogy...

B) Think of it this way... Let's say you have a ladder with 200 steps on it... An "analog" signal represents information by WHICH step the person is on at a certain time. As you move further and further away (get "noise" or interference in the signal), it's very easy to start making mistakes... For example, if the person is on the 101st step, you might say he's on the 102nd, or as you get further away, you might start making more and more mistakes... At some point you won't know if the person is on the 13th step or the 50th step....

NOW... In a digital signal, we don't care if he's on the 13th or 14th or 15th step... All we care about is whether he's at the TOP or the BOTTOM... So now, as we back you up further and further (introduce more noise), you might have no idea what STEP he's on, but you'll STILL be able to tell if he's a "1" or a "0"...

THIS is why digital signals aren't affected by cheaper cables, etc... Now eventually if you keep moving further and further back, there may come a point where you can no longer tell if he's up or down... But the good news is, digital signals don't "guess"... If they SEE the signal, they work... If they DON'T, they DON'T.. LOL
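That all-or-nothing "cliff" can be sketched too. This sweeps the noise amplitude upward and counts bit errors: nothing happens at all until the noise can actually reach across the threshold, then errors appear abruptly. Again, the voltage levels are illustrative only:

```python
import random

# Sketch of the digital "cliff": bit errors stay at exactly zero while the
# noise is smaller than the distance to the decision threshold, then appear
# abruptly once it isn't. (Levels are illustrative.)

random.seed(0)
HIGH, LOW, THRESHOLD = 5.0, 0.0, 2.5
bits = [random.randint(0, 1) for _ in range(1000)]

def errors_at(noise_amp):
    """Count wrong bits when noise of +/- noise_amp volts rides on the signal."""
    wrong = 0
    for b in bits:
        v = (HIGH if b else LOW) + random.uniform(-noise_amp, noise_amp)
        wrong += (1 if v > THRESHOLD else 0) != b
    return wrong

for amp in (0.5, 1.5, 2.4, 2.6, 4.0):
    print(amp, errors_at(amp))
# below the 2.5 V margin: zero errors, every time; above it, errors suddenly appear
```

That sudden transition is the "it works or it doesn't" behavior: a cable either delivers a working HDMI link or a visibly broken one, with no in-between "slightly better picture" region.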

So if anyone ever tells you they can "see the difference" between HDMI cables, etc... You can knowingly laugh to yourself and think about how much money the poor soul wasted on something that was pointless.

Now, I've seen others say that they make a difference in audio... ALL audio carried over HDMI is STILL in digital format... So again, since it's a digital signal, it will not make ANY difference at all....

I've also seen various posts in regards to things like "Make sure you get a v1.3 cable"... The various HDMI versions describe the capabilities of the DEVICES on either end of that cable (most of the HDMI versions (other than 1.0 to 1.1) have to do with AUDIO and how many channels / types of audio are carried...). Because of this, the cable itself is NO DIFFERENT... It's just marketing that some companies charge more for a "v1.3" cable than a "v1.1" cable, etc... The cables themselves will work now and WELL into the future for any other HDMI versions that come along....

My first system was $6000 and I connected it all together with a grand total of about $200 of what I was told was "worthy" wire... not good. I switched to an entry-level MIT design and could not believe the increase in sound quality. Sometimes it's about how much of the analog signal actually makes it to the speakers, and the ability of the wire to let "most" of the full signal through. There has to be something to the millions in R&D that go into some of these designs.

But I also think audiophiles should have a quick peek inside their components too. It can help gauge how much one really needs to spend on wire. I've had a few components where the soldering and the wire running to the RCA and balanced outs were so poor that there would be no real point in going crazy on cable. But quite a few higher-end companies hard-wire their RCA and balanced outs to the circuit board, and I believe that makes it worthwhile to apply a better cable to the system.

I sold high-end audio and video systems for twenty-five years. Being able to play with different combinations of equipment is very instructive. Certain components go well together and sound terrific, others less so. In a basic but fine-sounding system comprising a $499 CD player, a $499 integrated amplifier and a $450 pair of speakers we took turns listening to very familiar music while a colleague substituted different interconnect cables. Each staff member chose his own favorite tracks for his turn in the listening chair. We compared only two interconnects at a time. Sometimes cable A was used first, followed by cable B. Other times cable B was in use first. In all cases the person changing the cables would go behind the equipment rack (located behind the listener!) and fiddle around for the same amount of time. Each piece of music was played for two minutes; the interconnect was then changed and the same music was again played for two minutes at the same volume setting.

Under these conditions, with familiar music, I preferred the resulting sound from a particular interconnect on seven out of eight music selections. On the remaining piece of music, both cables sounded similarly enjoyable.

The next day, using the exact same system and interconnects, we played the same eight music selections again at the same volume setting. This time, instead of choosing a preference, we attempted to identify which interconnect cable was in use - cable A or cable B. Two of my colleagues correctly identified which cable was in use six out of eight times. One other listener and I got all eight correct.
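For what it's worth, the odds of those scores arising from pure guessing are easy to work out. A short Python check, assuming each of the 8 trials is an independent coin flip:

```python
from math import comb

# Sanity-check the blind-test scores against pure guessing:
# 8 independent trials, 50/50 chance each, so 2**8 = 256 equally likely outcomes.
trials = 8
total = 2 ** trials

p_eight = comb(trials, 8) / total                       # all 8 correct
p_six_plus = sum(comb(trials, k) for k in (6, 7, 8)) / total  # 6 or more correct

print(f"P(8/8 by chance)   = {p_eight:.4f}")    # 0.0039
print(f"P(>=6/8 by chance) = {p_six_plus:.4f}") # 0.1445
```

So a single 8/8 run happens by luck well under 1% of the time, while 6/8 is not especially remarkable on its own -- one reason we kept repeating these comparisons rather than trusting any single session.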

Over time, similar comparisons were performed on higher performance systems. For the most part it was easier to choose a preference and to correctly identify which interconnect was in use on those higher performance systems than it was on the "basic" system.

Certain caveats may be applied to these listening sessions. They took place in a purpose-built listening room, lightly furnished, with very low ambient background noise. No electrical AC power filters of any kind were used. All of the components including the speakers were set up on low-resonance racks, stands, or spikes as appropriate. No visual or audible clues (or cues) were available to the listener. All of the system components and interconnects that we tested or compared had previously been in use for over one hundred hours.

Our early comparisons pitted the standard RCA cable supplied in the box with the CD player against four different aftermarket cables (one at a time, of course). The standard RCA cable was chosen as the "preferred" sound 6% of the time across all four listeners.

Later, using a high-performance system with all of the components from the same manufacturer, the interconnect cable supplied in the box with the CD player was chosen as the "preferred" sound 75% of the time across all four of us. On the remaining music selections it was felt to be equally pleasing to one particular aftermarket cable.

After numerous unhurried comparisons over several months I took home the interconnect that I felt was the best of the interconnects that I could afford with my employee discount. It replaced another good interconnect that had been in my home system for nearly two years. After a few minutes my girlfriend arrived home. She walked in and said "That sounds nice. What did you change?" I declined to tell her but offered to play her favorite CD. She made tea and then sat down to listen. Halfway through the first track she said the female vocals were "more natural". By the third track she pronounced it to be a definite improvement, saying that the four voices were more easily distinguished and more nuanced and that sibilance was more realistic. When I showed her that the new component was an interconnect cable that would cost me around $400 she said "Buy it. I thought you were going to say it was a new whiz-bang amplifier or something that cost thousands".

I could have saved months of listening if I'd just let the girlfriend listen for five minutes. And, hey, she makes a great cup of tea, too.

I lived in a similar world to the engineer who claims there's no difference in cables. I'm an electronics technician and have done all the bench tests. True, with a scope, spectrum analyzer, SPL meter, etc. you don't 'SEE' the difference. Thank God our ears are quite a bit different as far as sonic response goes. I was a myth believer until a knowledgeable salesman asked me to try a set of Quest interlink. The difference was astounding. Needless to say I converted all my interconnect cables, including guitar, keyboard, multitrack recorder, etc., to Monster Interlink 400. All one has to do is A/B tests. The same goes for speaker cables. If you can't hear the difference, then don't waste your money on audiophile equipment. Don't feel bad though; you'll be in the 99.9% who can't hear any difference. For us poor sods who can... we are willing to spend the extra for the extra performance.