We’ll admit it: we’re a little annoyed. A few years ago we sat down and definitively proved that expensive HDMI cables don’t make any difference (see the original article page in this article). We seem to have helped stem the deluge of pointless HDMI cable reviews online, but we haven’t completely won the war yet.

Despite our scientific testing, we’ve seen editorials in hi-fi magazines extolling the virtues of HDMI cables, and staff explaining how they can see and hear differences between digital cables. We’ve even had our work cited by a reader who filed a complaint with the Press Complaints Commission against a magazine over its coverage of digital cables. The magazine’s lawyer’s response was that our original testing didn’t use motion (we had the video paused), and that the results were therefore invalid. It’s a stupid technicality, as paused video outputs as many frames as moving video (the screen is refreshed just as often), but the complaint wasn’t pursued.

As a result, we’ve decided to revisit this subject for the absolutely final time, testing moving video. To make sure we cover absolutely everything, we’ve split this article into two sections: the theory of why HDMI cables don’t make a difference (a FAQ of common questions, including the science behind it), and the testing that proves it, which we’ve put on page two of this article.

FAQ - Why don’t expensive HDMI cables make a difference?

Q. HDMI cables can make a difference, can’t they? Reviewers have noticed that grass is greener and flesh tones are better with some models.

A. This is impossible and suggests that an HDMI cable has intelligence built into it. As a result, the reviewers are at best delusional and, at worst, lying to you. Inside, an HDMI cable has 19 individual wires connected to 19 pins, each designed for a specific job, but these are effectively just bits of metal designed to conduct an electrical signal. It’s important to note that there’s absolutely no processing in the cable and, as far as the wires go, they could be carrying a picture, some audio or anything else.

For an HDMI cable to make flesh tones better, for example, it would have to decode the video signal, process where people are and then tweak the image all before re-encoding it. Think about a cable that can apparently make foliage better – how would it cope with Kermit the frog standing in a green field? Would it make his green tones better, or would it be able to discern the grass and just make that better? Of course, the answer is neither, as cables don’t have processors.

^If HDMI cables can improve foliage, how can they tell the difference between the trees, the blankets and Kermit?

Think of it another way: if HDMI cables can improve the quality of a picture, could a more expensive SATA cable make a Word document better to read? Imagine Dan Brown buying a £2,000 SATA cable and opening up his latest novel to find that it had been transformed from his usual quality into Dickensian beauty. Of course, that’s nonsense: we’re stuck with Dan Brown’s prose, and with HDMI cables that can’t touch image quality.

Q. If HDMI cables don’t have any processing in them, how come you can buy active cables, which can only be plugged in one way?

A. Active cables draw power from the HDMI port to power a signal booster. This helps with longer cable runs (say, more than 5m) or lets you use a thinner HDMI cable. Some of these active cables have to be plugged in one way round to work properly, as plugging them in the ‘wrong’ way round will introduce errors. Note that an active cable doesn’t boost image quality; it cuts down on errors.

Q. So reviewers definitely can’t ever see differences?

A. Differences can be seen in testing, but this is down to the kit used. Every TV will interpret the signal slightly differently, displaying different colours. Every Blu-ray player outputs a slightly different picture, too. This is the reason that calibration is recommended. However, it’s definitely not the cables causing these issues.

Q. You’re saying that all HDMI cables are identical and one can’t be better than another?

A. We’re not. The issue we have is with the word ‘better’. This implies that one HDMI cable can make an improvement over another, which it can’t. With a digital signal, everything is sent as 0s and 1s. You get an error if a 1 is received as a 0, or a 0 is received as a 1. In this way, an HDMI cable can either correctly transmit everything or it can introduce errors. It’s technically possible that a poor HDMI cable will introduce more errors than an expensive one.
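The ‘correct or corrupted, never better’ point can be shown in a couple of lines of Python (our illustration; the byte value and the bit we flip are arbitrary):

```python
# A digital link either delivers the exact bits or introduces bit errors --
# there is no in-between "quality" dimension. The byte and flipped bit here
# are arbitrary, purely for illustration.
original = 0b10110010              # one byte of the signal as sent

perfect = original                 # good cable: bits arrive unchanged
flipped = original ^ 0b00000100    # bit error: one bit inverted in transit

print(perfect == original)         # True  -- identical, not "better"
print(flipped == original)         # False -- an error, not a "worse picture"
```

There’s no operation a passive cable can perform on those bits other than delivering them or corrupting them.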

Q. So errors do mean differences and you can see them, you’re backtracking aren’t you?

A. Certainly not. We freely admit that a digital signal will occasionally suffer an error in transmission, but this has to be put into context. The HDMI standard allows for one error per one billion bits, a figure known as the Bit Error Rate (BER). Assuming that a picture is transmitted using 24-bit colour (8 bits each for red, green and blue) at a resolution of 1,920x1,080 (2,073,600 pixels), that’s a total of 49,766,400 bits per frame. At 24fps, the film standard, that’s 1,194,393,600 bits (roughly 1.2bn bits) per second. In other words, the HDMI standard allows the worst cable to have a single one-bit error, in one pixel of one frame, per second. You’d need incredible eyesight to spot that mistake. Besides, the one-bit error could occur in the HDCP copy protection or the audio track, in which case you wouldn’t even see it. Finally, the video picture has error detection to look out for these kinds of things.
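The arithmetic above is easy to check in a few lines of Python, just reproducing the article’s numbers:

```python
# Bits per frame and per second for 1080p, 24-bit colour video at 24fps,
# set against the bit error rate the HDMI spec allows (1 in 10^9).
width, height = 1920, 1080
bits_per_pixel = 24                 # 8 bits each for red, green and blue
fps = 24                            # the film frame rate

bits_per_frame = width * height * bits_per_pixel
bits_per_second = bits_per_frame * fps

ber = 1e-9                          # one error per billion bits allowed
worst_case_errors_per_second = bits_per_second * ber

print(bits_per_frame)                            # 49766400
print(bits_per_second)                           # 1194393600
print(round(worst_case_errors_per_second, 2))    # 1.19
```

So even a cable running right at the permitted limit averages barely more than one errant bit per second of film.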

Q. Ah, but HDMI video doesn’t have error correction, does it? So, there could be differences in the picture.

A. Assuming that your one-bit error causes a problem in the picture and one pixel is incorrect, error detection on the TV will let it know where the problem is. The TV can then analyse the surrounding pixels and make an educated guess as to what colour the errant pixel should be. So, you may get one pixel per second that is ever so slightly the wrong colour. If you can spot that, you have the best eyesight of anyone who has ever lived. Well done you.
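To illustrate the kind of educated guess involved (this is our own sketch, not any specific TV’s algorithm), here’s a concealment step that replaces a flagged pixel with the average of its neighbours:

```python
# Illustrative error concealment, assuming the display already knows which
# pixel in a row is bad: replace it with the mean of its two neighbours.
def conceal(row, bad_index):
    """Return a copy of a row of pixel values with the bad pixel
    replaced by the average of its left and right neighbours."""
    fixed = list(row)
    left = row[bad_index - 1]
    right = row[bad_index + 1]
    fixed[bad_index] = (left + right) // 2
    return fixed

row = [120, 122, 255, 126, 128]   # pixel 2 was corrupted (should be ~124)
print(conceal(row, 2))            # [120, 122, 124, 126, 128]
```

The guessed value won’t always be exactly right, but for one pixel in one frame per second the difference is invisible.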

Q. What about HDMI standards and certification, surely different cables make a difference?

A. Although there are different HDMI standards, with HDMI 2.0 being the most recent, there are only two HDMI cable standards: Standard and High Speed. Standard is now out of date and supports lower resolutions; High Speed supports everything, including the HDMI 2.0 standard, which gives you 4K TV at 60fps (Ultra HD).

Q. What about audio, though; you said that you could get errors here, too. Surely that can make a difference?

A. Again, the one-error-in-a-billion-bits figure applies here. Realistically, it means that the occasional error will pop into the audio track. This kind of error could cause a noticeable pop or blip if it weren’t corrected; fortunately, audio has error detection and correction. This means the receiving equipment can detect an error and correct it, as though it never happened. In other words, there is no longer an error. In the event that there are too many errors, you simply won’t get any sound, as receiving kit is programmed to shut down audio rather than output potentially damaging and irritating noise.
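A minimal sketch of the detection idea, using a single parity bit (HDMI itself uses more capable codes; this only shows the detect-and-act principle):

```python
# A parity bit lets the receiver notice that a single bit flipped in
# transit. Real links use stronger codes that can also correct errors;
# this toy example only demonstrates detection.
def parity(bits):
    """Return 0 if the number of 1s is even, 1 if it is odd."""
    return sum(bits) % 2

sample = [1, 0, 1, 1, 0, 0, 1, 0]      # eight bits of an audio sample
sent = sample + [parity(sample)]       # append parity before transmission

received = list(sent)
received[3] ^= 1                       # a bit error in transit

ok = parity(received[:-1]) == received[-1]
print(ok)   # False -- the receiver knows something went wrong
```

Once an error is detected, the receiver can correct or conceal it, which is why you hear a clean signal or, past a threshold, nothing at all.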

Q. What about jitter? HDMI’s really bad for this, right?

A. Jitter is described as the deviation from true periodicity of a presumed periodic signal, which probably doesn’t mean a lot to most people. In simpler terms, it describes how a signal might not be properly in sync. All digital data uses a clock to synchronise transmission, with each clock cycle (a tick, if you will) used to send a bit of data. How the data is sent differs from system to system, but a simple view is that when there’s a one, the voltage rises to maximum; when there’s a zero, the voltage falls to 0. When plotted on a graph, you get a square wave with the values going up and down.

When the clock is accurate, each bit is sent at a perfect time interval, but the timing is often not as accurate as you may think. For example, rather than a signal going out exactly every second, you may find the gaps are 0.99s, then 1.12s, then 1.05s. We’ve exaggerated the example to make a point, as digital transmissions occur much faster and with far smaller errors, but it shows how timing affects the signal.

^In the graph at the top, the digital signal is sent perfectly, with the vertical lines matching up with the clock (the dotted lines) completely; the graph at the bottom shows what happens if the clock isn’t so regular, with the waveform distorting.

With jitter distorting the transmission, two things can happen. First, repeated timing errors can make the receiving device believe that it has received a 0 instead of a 1 (or vice versa), introducing an error. As we’ve discussed, these errors are corrected to the point where you don’t notice them, for both audio and video.
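To see how a mistimed edge becomes a bit error, here’s a toy model (ours, not HDMI’s actual signalling) of a receiver sampling the line in the middle of each clock period, with exaggerated timings as in the example above:

```python
# A receiver samples the line mid-period. With a little jitter the edges
# still land in the right slots; with a lot, an edge can slip past a
# sampling point and a bit is misread. Timings are exaggerated on purpose.
def received_bits(edge_times, bit_values, sample_times):
    """Return the bit value seen at each sample time, given when each
    bit's leading edge actually arrived on the wire."""
    out = []
    for s in sample_times:
        # the line holds the value of the most recent edge at or before s
        latest = max(i for i, e in enumerate(edge_times) if e <= s)
        out.append(bit_values[latest])
    return out

bits = [1, 0, 1, 1]
samples = [0.5, 1.5, 2.5, 3.5]          # ideal mid-period sampling points

small_jitter = [0.0, 0.99, 2.05, 3.02]  # edges slightly off: still fine
large_jitter = [0.0, 1.60, 2.05, 3.02]  # second edge arrives very late

print(received_bits(small_jitter, bits, samples))  # [1, 0, 1, 1] -- correct
print(received_bits(large_jitter, bits, samples))  # [1, 1, 1, 1] -- misread
```

Small deviations are harmless; only a gross timing error flips a bit, and, as above, such errors are then caught by error detection.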

Secondly, if you’re outputting in real-time (or close to that), the mistiming can mean problems with audio, as a sound may occur too early or too late, distorting the analogue waveform as it's converted from digital. However, receiving devices buffer some audio to help eliminate these issues and minor differences in timing can’t be heard.

More importantly, as far as this article is concerned, it’s not the HDMI cable that’s at fault for jitter, but the HDMI standard. In other words, the HDMI cable can’t and doesn’t make any difference to jitter. A bigger difference is made by the quality of the digital-to-analogue converter (DAC), which takes the digital signal and converts it back into the analogue sound we hear, but even this pales in comparison to the quality of your speakers and AV receiver.