I am one of the believers: I hear differences between analog interconnects, blindfolded, tricked or fully aware, and I have heard and chosen my cables because of those wonderful differences.

However, when you touch digital territory I start to get a bit more suspicious. I have my reservations about S/PDIF connectors: they still transmit un-packeted digital data that might get affected by something in the path. But where it really gets ridiculous is with HDMI, USB and Firewire connectors, which transmit packeted digital data, which means it either arrives 100% or it doesn't arrive at all; there is no chance for distortion, jitter or EMI/RFI to get involved, and if they do get involved and affect the signal by .00000000000001% you will not hear anything anyway. So a super cable is absolutely ridiculous, to the point that manufacturers cannot even blame (with any credibility) the usual bad guys (jitter, metal purity...)

Quality HDMI cables come into play over long runs.. I've heard of sparkles & white dots.. For short runs any quality HDMI cable will do.. I bought an AudioQuest Forest HDMI cable, 3 ft.. They retail for $40.00; I paid $11 new.. I do find they fit more snugly than my cheaper HDMI cables & look better, but as far as PQ, from a quick A/B I couldn't tell a difference.. But the picture is less sharp when I run DVI-to-HDMI from my receiver's output to my TV than when I use an HDMI cable straight from the PS3.

I am also a cable believer, and from what I saw at the last CanJam, so is just about everyone who is prepared to present their equipment.

Your argument has a ring of truth about it, but I feel that with this type of issue what is happening is not fully understood. The real issue, I suspect, is how different types of data stream interact with different DAC chips, i.e. how sensitive different chips are to perturbations in the data stream.

Yes, it does. But from what I understand it's more that it won't work at higher resolutions if the signal is not getting through. So if the cable is not good enough there will be no picture and you have to use a lower resolution. So it still either works or it doesn't.

The point I'm trying to make is that these connectors, when functional, are not subject to data loss or out-of-order data, period. The Transition-Minimized Differential Signaling (TMDS) transmission scheme in HDMI manages this transparently.
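For anyone curious what "transition minimized" actually means, here is a rough Python sketch (my own illustration, not lifted from the HDMI spec text) of the first stage of TMDS encoding: each 8-bit value is re-coded with XOR or XNOR chaining, whichever produces fewer 0↔1 transitions on the wire. The real encoder adds a second DC-balancing stage to reach 10 bits, which is omitted here:

```python
def tmds_stage1(byte):
    """First (transition-minimizing) stage of TMDS encoding:
    8 data bits -> 9 bits via XOR or XNOR chaining, whichever
    yields fewer transitions. DC-balancing stage omitted."""
    d = [(byte >> i) & 1 for i in range(8)]  # bits, LSB first
    ones = sum(d)
    use_xnor = ones > 4 or (ones == 4 and d[0] == 0)
    q = [d[0]]
    for i in range(1, 8):
        bit = q[i - 1] ^ d[i]
        q.append(1 - bit if use_xnor else bit)
    q.append(0 if use_xnor else 1)  # 9th bit records which op was used
    return q

def transitions(bits):
    """Count 0<->1 transitions in a bit sequence."""
    return sum(a != b for a, b in zip(bits, bits[1:]))

raw = [(0b10101010 >> i) & 1 for i in range(8)]  # worst case: alternating bits
enc = tmds_stage1(0b10101010)
print(transitions(raw), "->", transitions(enc))  # 7 -> 4
```

The point of the scheme is purely electrical: fewer transitions means less high-frequency energy on the link, so the cable has an easier job. It is not an error-correction mechanism.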

In USB and Firewire the digital signal gets transmitted over serial connections, one bit at a time; therefore, in order to provide error checking, flow control and device synchronization, the information is organised into packets and frames. It's not an analogue signal yet, nor a single digital stream, so the only thing the cable needs to do is transmit the packets of 1s and 0s intact. If a packet doesn't arrive intact, it simply doesn't get played. That's just how it works.
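To make the packet-integrity idea concrete, here is a small sketch of CRC-based error detection, the mechanism that lets a receiver reject a corrupted packet outright. This uses a generic reflected CRC-16 (USB's data CRC is built on the same 0x8005 polynomial, but the framing details differ, so treat this as illustrative, not as the exact USB implementation):

```python
def crc16(data: bytes, poly=0xA001, init=0xFFFF):
    """Generic reflected CRC-16 over a byte string. USB's data-packet
    CRC uses the same underlying 0x8005 polynomial (0xA001 reflected);
    this sketch skips USB's bit-level framing details."""
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ poly if crc & 1 else crc >> 1
    return crc

packet = b"some audio samples"      # hypothetical payload
sent = crc16(packet)

# Intact packet: checksum matches, the receiver accepts it.
assert crc16(packet) == sent

# Flip a single bit in transit: checksum no longer matches,
# so the receiver can detect (and discard) the packet.
corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]
assert crc16(corrupted) != sent
```

A CRC of this kind detects every single-bit error by construction, which is why a corrupted packet is caught rather than quietly played back as slightly wrong audio.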

All three connection types are affected by cable length, but when they are affected, the signal cuts out.

This would be completely different if we were talking about a CD transport feeding an external DAC, where bits also get transmitted, but as a single digital stream with no packets or frames. In this type of transmission, jitter and other issues can affect the link, and degradation can be audible without any interruption.
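To illustrate why jitter matters in a single-stream link, here is a toy Python model (my own numbers, purely illustrative): reconstruct a sine wave from samples whose clock has a little random timing error, and measure the resulting noise. Every bit is "correct", yet the output still degrades:

```python
import math
import random

# Toy model of clock jitter in a single-stream (S/PDIF-style) link:
# the converter clocks out samples at slightly wrong times. The
# amplitude error is roughly (signal slope) x (timing error), so
# jitter shows up as added noise, never as a dropout.
random.seed(0)
f = 10_000.0        # 10 kHz test tone
fs = 44_100.0       # CD sample rate
jitter_rms = 1e-9   # 1 ns RMS clock jitter (illustrative figure)

errors = []
for n in range(10_000):
    t_ideal = n / fs
    t_real = t_ideal + random.gauss(0.0, jitter_rms)
    errors.append(math.sin(2 * math.pi * f * t_real)
                  - math.sin(2 * math.pi * f * t_ideal))

rms_error = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"RMS error from 1 ns jitter: {rms_error:.2e}")
```

Whether that noise floor is audible is a separate argument, but the model shows the qualitative difference: in a clock-embedded stream, timing error degrades the signal continuously instead of making it fail outright.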

The point you are trying to make is wrong. Certainly with regard to HDMI. Less obviously so with S/PDIF or USB in our context.

HDMI certainly is susceptible to loss. It is in fact PROFOUNDLY AWFUL for it. Blue Jeans Cable have a marvellous shakedown on just why HDMI is a dreadful, dreadful piece of engineering. We needed a 40 ft run when we upgraded our projector setup to HD. Anything over 1080p/24 (1080p/30, 1080p/50, 1080p/60) has horrible display artifacts [NOT cutting out; think of it as like a worsening analogue television or radio signal: snow, noise, blue speckles, sometimes a green or red cast in the colour balance], and that's with the thick-as-your-wrist long-run Blue Jeans HDMI cable. Because our player cannot send DVD as 1080p/24, we're stuck on 1080i/50 for Blu-ray and upscaled DVD until we have a chance to replace the cable.

We had a component cable which carried 1080p/60 over that distance no problem, of course. But Sony doesn't like people using analogue connectors with Blu-ray. Wabs.

I wish I'd got one of those boxes that converts HDMI to 2x CAT6 and back again. The longer the HDMI cable, the greater the signal degradation, until the point where you cannot get a lock between the devices at each end and have to reduce the picture quality to something with a lower bandwidth.

As for S/PDIF: cables play less of a part than the connectors. Most S/PDIF cables use RCA connectors despite the fact that it is mechanically/electrically impossible to achieve a proper 75-ohm connection with the physical construction of an RCA plug. A regular 75-ohm BNC connector and regular 75-ohm cable solve any and all problems here, because they physically meet what the spec calls for.
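The impedance-mismatch point can be put in numbers: the fraction of an incident signal edge reflected at a discontinuity in a transmission line is (Z − Z0)/(Z + Z0). The 35-ohm figure below for an RCA connector is a made-up illustrative value, not a measurement:

```python
def reflection_coefficient(z_load, z0=75.0):
    """Fraction of the incident voltage reflected where a transmission
    line of characteristic impedance z0 meets an impedance z_load."""
    return (z_load - z0) / (z_load + z0)

# A proper 75-ohm BNC connection ideally reflects nothing:
print(reflection_coefficient(75.0))             # 0.0

# Hypothetical figure: if an RCA connector looks like ~35 ohms at
# S/PDIF edge frequencies, a sizeable fraction of each edge bounces
# back down the cable and can smear the transitions at the receiver:
print(round(reflection_coefficient(35.0), 3))   # -0.364
```

Reflections of this kind don't flip bits directly; they distort edge timing at the receiver, which is exactly the jitter-at-the-interface mechanism under discussion.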

With USB, anything which has the USB logo on it is required to conform to the USB spec. However, policing this is very difficult when we are dealing with small companies slamming together whatever they like, with a big bore and audiophile claims. They may well sound audibly different to all of the USB cables that came with your camera or printer, precisely because, in terms of USB transmission, the audiophile cable is not providing the same sound electrical connection. Pleasing-sounding jitter, however you want to phrase it.

Just because a signal is digital, and even if it locks at both ends like HDMI does with its handshake and HDCP protocol, doesn't mean it's either 100% or 0%.

Nice post.. Albeit, I didn't expect it to be that bad at 40 ft.. When people say that with digital you either get it or you don't, it's a half-truth.. For short runs it rings true IMO; for long runs it's a whole other ball game.. Of course Sony doesn't want you to use analog, you know what S/PDIF stands for, right? Sony/Philips Digital Interface.. & wasn't it Sony/Philips that created, or heavily backed, CD? & spread FUD about how much better CD sounds compared to analog/vinyl.

The whole HDMI thing is just a mess, they come out with too many versions too quickly.. 1.3 still has yet to be fully utilized.. Anyone remember 36-bit Deep Color? Now we are on 1.4, which makes our 1.3 receivers obsolete for the most part.. If you want to watch 3D movies with a 1.3-compatible receiver you will need a 3D Blu-ray player with 2 HDMI outputs: you connect one to your display & the other to your receiver.. It's getting silly.. It's going to take years before 1.3 is fully supported, and now they want to push out 1.4 due to the 3D rage.. 3D films are doing well, so they want to capitalize on it. I saw a 3D demo on a Panasonic plasma & it was amazing.. Much more realistic than the Samsung LCD.. But the glasses were uncomfortable.. Plus, the whole time I was watching the demo, all I could think of was gaming, not movies.. Imagine being chased by a 300 lb DL man in Madden 2014 in 3D. The glasses cost $150 too..

There can well be audible differences between digital cables.
The point that is usually brought forward in these discussions is digital = digital.
All we want is to transfer the digital signal, but that doesn't have to be all there is on that cable.
There could well be noise; for instance, USB is infamous for being 'dirty'. I can imagine that some USB cables reject more noise than others, and some USB cables may even act like antennas.
Which is probably why some of my -not meant for audio- USB cables have a ferrite clamp.
So while the digital signal won't be different from one cable to another, noise levels can very well differ, influencing circuits further downstream.

This is a good point, but a common one in discussions of expensive cable performance.

Duggeh, what you said is right and wrong, and so it reinforces my point. The artifacts, blackouts and bad pixels caused by "bad" or long HDMI cables prove the point that when the cable fails, it simply doesn't transmit a packet of data, something that in audio would be equivalent to audible noise, dropouts, pulses, etc.

However, it never has to do with the quality of the signal that does pass, which is 100% perfect, and no super cable can change that! So a good shielded Category 2 HDMI cable that does the job will do it 100% as well as a super platinum cable. And I repeat, this is very different from analogue and single digital streams, where the audio signal can pass through with degradation.

What is the basis for your assertion that Duggeh's display artifacts are from "packet" loss and not signal degradation?

I think point number four on BJC's website explains the signal degradation issue quite well:

You are trying to conflate the kind of signal degradation we are discussing in a video context with HDMI with signal degradation in an audio context concerning digital transmission, by equating the visible display artifact with its aural equivalent.

Your take on signal degradation in audio is absolutely true for all analogue signals, and possible in direct digital streams, but sorry Duggeh, not on packeted digital transfer protocols; it's theoretically not possible. You are confusing what you know with what you don't know.

^^pwnd^^

But yeah, you are completely right. If you have issues with an HDMI cable, they will be blatant. If the signal is going through uninterrupted, that's as good as it gets.
As for noise being transmitted over USB, perhaps some cables are better at rejecting it than others, but I would first look at what you have on each end. Well-designed USB hardware should isolate the USB connection from both noise sources and the audio stage.