But pre-set expectations and testing questions make us wonder about our results.


Close-up of the AudioQuest Vodka Ethernet cable. Note the arrow on the connector. "All audio cables are directional," says manufacturer AudioQuest. "For best results have the arrow pointing in the direction of the flow of music. For example, NAS to Router, Router to Network Player."


Update, 7/31/15: We've published the results of our cable-analyzer test on the AudioQuest Vodka, courtesy of Kurt Denke at Blue Jeans Cable. Those serve as a complement to our feature, which appears in full below.

"Vegas again," I thought, as the noisy A320 plonked down onto the runway at McCarran. I was in the front, thanks to a plethora of reward miles on United, and across the row through the portal I could see the Vegas Strip—hungry, pulsing. It was only a few months since I'd last been here for CES, and coming back to the city felt a lot like putting back on that dirty, comfortable sweater you just can't seem to bring yourself to throw away.

Further Reading

But this time I wasn't here to report on gadgets or meet vendors or anything else quite like I'd done before—this time, I was going up on stage myself. After calling out the audiophile cable gods, I'd come to settle the score. I'd brought a $340 "audiophile grade" Ethernet cable, and I was ready to put it to the test with the assistance of the James Randi Educational Foundation in front of a live audience of several hundred people.

My room was waiting for me at the Mandalay Bay hotel—the very first place I'd ever stayed in Vegas for my very first conference back in 2003, and the place that always springs to mind before anything else when I think of the city. After taxiing to the lobby, the place even smelled the same: fresh, in a vaguely artificial floral way. I'd sleep, and then I'd sing and dance up on stage. The goal was to find out if a $340 Ethernet cable made any difference when you're using it to connect a computer to a NAS on which music is stored. To all common sense and science, the answer was "no," but that hasn't stopped a certain subset of audiophiles from believing in them—and from other silliness like decrying the efficacy of the scientific method when it comes to audio testing.

Did it or didn't it?

Let’s get this out of the way first: the overwhelming majority of subjects could not tell the difference between a $340 AudioQuest Vodka Ethernet cable and a $2.50 "Cable Matters" cable from Amazon under our specific testing conditions. I don’t think anyone was expecting anything different from this test, including the true believer audiophile set. However, it’s possible that our test didn't account for some variables—which means, at worst, our results aren’t broadly applicable.

Let’s backtrack, though, and set the stage here before we dive into how it all went down. As we noted last week, this all happened at the James Randi Educational Foundation’s "The Amazing Meeting" conference in Las Vegas. The claim being examined was that when connected between a computer and an Ethernet switch while listening to music hosted on a NAS, the cables produce changes to the quality of music that are, in the words of reviewer Michael Lavorgna, "not subtle or slight" and "as plain as day."

Ars partnered with the James Randi Educational Foundation to test the cables on the basis of the JREF’s long history of conducting controlled tests with clearly defined methods and protocols on supposedly otherworldly phenomena. Famously, Randi and his team have been instrumental in debunking alleged psychics like Uri Geller and James Hydrick, along with faith healers like Peter Popoff; the group seemed to be a natural fit for plumbing the pseudoscientific depths of audiophilia.

The Foundation was happy to assist and offered to test the cables on-stage as this year’s "Million Dollar Challenge" at the conference. The JREF offers a million dollar bounty for anyone who can demonstrate a paranormal ability under mutually agreed upon test conditions; at least once in the past, the JREF has offered this prize to audiophile cable manufacturers if the cable manufacturers would submit to controlled testing as well.

JREF agreed to the proposed collaboration for several reasons. One is that the foundation regarded the claims being made—that the Ethernet cables can make a "plain as day" difference in audio quality—as pseudoscientific, and therefore worthy of testing. Also, one of the foundational principles of scientific skepticism is consumer protection; the JREF says that this is why it engages in debunking other similar pseudoscientific claims of homeopathy or of "power band" bracelets (a version of which the JREF has tested at past events).

But what precisely would we test for, and how? After some discussion, the folks at the JREF decided that the best thing to do would be to construct what’s called an "A/B/X" listening test. In this type of test, a listener first hears an audio sample through one cable, and then hears the same audio sample through the second cable. The listener is told which cable is being used for both "A" and "B" (even-numbered listening subjects would hear the AudioQuest cable first, while odd-numbered listeners would hear the Amazon cable first). Then, one of the two cables is randomly selected and the same audio sample is played over that cable (the "X"). The listener is then prompted to choose whether the third "X" audio sample was played over cable A or cable B.
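For the curious, the mechanics of an A/B/X session are easy to sketch in code. This is an illustrative simulation only (not the software anyone used at the event) of a listener who genuinely cannot hear a difference and therefore guesses blind:

```python
import random

def run_abx_session(n_trials: int, seed: int = 0) -> int:
    """Simulate an A/B/X session for a listener who cannot hear any
    difference: each trial, X is secretly cable A or cable B, and the
    listener guesses blind. Returns the number of correct calls."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        x = rng.choice(["A", "B"])      # cable actually used for X
        guess = rng.choice(["A", "B"])  # the listener's blind guess
        correct += guess == x
    return correct
```

Over many trials a pure guesser converges on about 50 percent correct, which is why convincingly beating chance takes more than a handful of trials.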

Audiophile-grade "Vodka" Ethernet cables, from AudioQuest. They even have directional indicators!

Lee Hutchinson

The choice to use an A/B/X format was in response to reviews like Lavorgna’s, where the difference in audio quality between a standard and expensive Ethernet cable is repeatedly described using terms like "plain as day." If the difference truly is that dramatic, listeners should be able to easily detect it.

Additionally, the test wasn’t attempting in any way to quantify which cable was producing better audio, since "better" is too subjective to quantify. The test was instead attempting only to determine if subjects could tell any difference between the two.


Lee Hutchinson
Lee is the Senior Technology Editor at Ars and oversees gadget, automotive, IT, and gaming/culture content. He also knows stuff about enterprise storage, security, and human space flight. Lee is based in Houston, TX. Email: lee.hutchinson@arstechnica.com

485 Reader Comments

Different cables can have different velocities depending on the materials used. I don't know whether it makes a difference with Ethernet cabling (doubt it). But it is measurable and makes a difference in ham radio.

What do you mean, different velocities? Do you mean, when being shot out of a potato cannon? If that, then I agree. Other than that, I have no idea what you are on about.

Signals in a wire or cable travel slower than lightspeed. Different factors like signal frequency, cable capacitance and inductance, etc., can make a difference (more to phase and amplitude than velocity per se), although because of the different twist rates of the pairs in Ethernet cables, the actual conductor length can differ -- which is taken into account by the max-cable-length part of the spec. So no, it doesn't make a difference in Ethernet cabling, which either meets spec or doesn't. Apparently it makes a difference in ham radio (probably in the antenna cabling). It makes a difference in old analog telephony. It really has nothing to do with in-spec network cabling.
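To put rough numbers on it, here's a back-of-the-envelope calculation. The velocity factors used are typical ballpark values for twisted pair, not measurements of any particular cable:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def propagation_delay_ns(length_m: float, velocity_factor: float) -> float:
    """One-way signal delay through a cable, in nanoseconds. Twisted-pair
    velocity factors of roughly 0.6-0.7 are typical ballpark figures."""
    return length_m / (velocity_factor * C) * 1e9

fast = propagation_delay_ns(100, 0.70)  # ~476 ns over a full 100 m run
slow = propagation_delay_ns(100, 0.60)  # ~556 ns over the same run
# The spread is tens of nanoseconds -- absorbed entirely by packet
# buffering, and irrelevant once the bits arrive intact.
```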

The sampling in A-to-D conversion is, of course, time-based. You would have to start the sampling at exactly the same time (within a small fraction of the sampling interval, i.e., on the order of a microsecond at 128 kHz, 16-bit sampling); otherwise the time intervals of the samples in the two streams will not line up and therefore the samples will not match.

I guess technically that's not impossible, but the hardware budget would be something that would make even an audiophile pale.

Edit to clarify: by "at exactly the same time" above I mean, of course, at exactly the same position on the analog waveform, each time you do the AD capture.

Lining up samples for analysis is technically trivial. In fact, many audio editors are able to do that automatically. This problem was solved ages ago. You do not need microsecond-accurate capture timing to do it; 192 kilosamples/second is plenty accurate.
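Lining up two captures really is a solved problem; a brute-force cross-correlation in a few lines of Python illustrates the idea (real audio tools do the same job with FFT-based correlation for speed):

```python
def best_lag(a: list[float], b: list[float], max_lag: int) -> int:
    """Return the shift (in samples) by which b lags a, found by
    brute-force cross-correlation: try every candidate lag and keep
    the one where the overlapped products sum highest."""
    def score(lag: int) -> float:
        return sum(a[i] * b[i + lag] for i in range(len(a))
                   if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=score)

sig = [0.0, 0.0, 1.0, 0.5, -0.5, 0.0, 0.0]
delayed = [0.0, 0.0] + sig[:-2]   # same capture, two samples late
lag = best_lag(sig, delayed, 3)   # recovers the 2-sample offset
```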

Lee - Are you trying to give them ammunition? I realize they will never be satisfied with scientific testing, but this article reads like everyone involved just bumbled their way through this testing and/or wanted to help the case for the audiophiles. Let's give a third option in an ABX test that our statistician waffles on. Let's use a cheap Dell instead of some high-end media player so the audiophiles can blame the "low-end gear." Let's not swap out both cables so they can fall back on blaming that. Let's stop testing at six people. Let's do such a poor test that we can't even claim the results hold outside the sample set. The entire point of tests like this is to be able to make claims about a larger population than the sample set. If I wanted to read three pages about an Ethernet cable that only holds true for the few people who listened, I'd read audiophile reviews.

Sorry for the rant, but I was expecting you all to figuratively destroy this cable. Instead, we get this?

Yes, the cables are pseudo-scientific junk. But you hurt the case for a scientific explanation of why they're junk when you do such a poorly constructed publicity stunt of an experiment.

The problems with this "experiment":

- You only use one cable in the setup even though you own two of them. You rationalize this by giving a scientific explanation of why it doesn't matter. Except the audiophiles already reject the science, so using that as a justification for cutting corners defeats the entire point you're trying to prove!
- You use a *tiny* sample size of 6 people. Your statistician friend should be fired, because anyone who tells you that you can arrive at scientifically meaningful results with a sample size of 6 has not the slightest clue about how science is done.
- You use a population of skeptics at an event specifically designed to prove the cables are bunk. Talk about biasing the outcome! The fact that you rationalize this by saying you're sure they're honest folk is completely laughable. Bias is just as often subconscious as it is conscious.
- You violate your own ABX methodology by letting participants opt out of making a choice. Again, given they're already primed to believe they shouldn't hear a difference, this further contaminates your "experiment".

This whole spectacle ends up being a big joke and proving nothing either way because it was so poorly constructed. It does nothing other than offer click-bait for a series of articles on Ars. You get your clicks while your readers get cheated out of any actual meaningful results. Extremely disappointed with the poor quality of this.
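The sample-size complaint is easy to quantify with a one-sided binomial test (plain stdlib Python; the 0.05 threshold is the conventional choice, nothing from the article itself):

```python
from math import comb

def p_at_least(k: int, n: int, p: float = 0.5) -> float:
    """One-sided binomial tail: the probability of getting k or more
    answers right out of n two-way forced choices by pure guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# With six trials, even five correct answers don't clear the usual
# 0.05 bar; only a perfect six-for-six would.
print(round(p_at_least(5, 6), 3))  # 0.109
print(round(p_at_least(6, 6), 3))  # 0.016
```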

Signals in a wire or cable travel slower than lightspeed. Different factors like signal frequency, cable capacitance and inductance, etc., can make a difference (more to phase and amplitude than velocity per se), although because of the different twist rates of the pairs in Ethernet cables, the actual conductor length can differ -- which is taken into account by the max-cable-length part of the spec. So no, it doesn't make a difference in Ethernet cabling, which either meets spec or doesn't. Apparently it makes a difference in ham radio (probably in the antenna cabling). It makes a difference in old analog telephony. It really has nothing to do with in-spec network cabling.

I did signaling courses while studying for my later-unfinished CS engineering degree, and I understand the theory very well. However, the OP's claim doesn't make any sense, precisely because specced cable by its very definition is equal to the task. Otherwise it wouldn't be specced in the first place, unless of course it comes from the mystic factory (which nobody knows) in China that also produces all the non-specced "specced" high-speed HDMI cables.

Why you would start with a subjective ABX test is beyond me. A thousand subjective results, let alone eight, are meaningless. Less than meaningless: they could lead you to believe something detrimental to understanding anything later on. I think I would have held this until the signal tests came through.

Side note: if you're going to do a complete teardown on the expensive cable, you should also do one on the cheapo one. I'd love to see what the mechanical differences are.

As a musician with an unusually good ear (it runs in the family, with a Grammy-nominated family member whom I grew up with), there ARE certain things that make a difference. There are even differences I can tell that not everyone can, with the most recent example being a radio program on how professionals can't tell the difference between a Stradivarius and a modern violin. Funnily enough, I could tell the difference every single time, with 100% accuracy. They ran a little test...

That's not bragging, it's just my talent... Like how I can identify piano players I know when they are practicing, simply by their playing. I've even taken tests and I'm able to tell a difference between two tones to a ridiculous degree.

That's just to establish I'm not a layman. So I can say with complete confidence: with some of this audiophile equipment there is really no discernible difference. It's hard to think it's anything but a scam based on the placebo effect. Who wants to think they were stupid enough to be duped into buying $360 cables?

Lee - Are you trying to give them ammunition? I realize they will never be satisfied with scientific testing, but this article reads like everyone involved just bumbled their way through this testing and/or wanted to help the case for the audiophiles. Let's give a third option in an ABX test that our statistician waffles on. Let's use a cheap Dell instead of some high-end media player so the audiophiles can blame the "low-end gear." Let's not swap out both cables so they can fall back on blaming that. Let's stop testing at six people. Let's do such a poor test that we can't even claim the results hold outside the sample set. The entire point of tests like this is to be able to make claims about a larger population than the sample set. If I wanted to read three pages about an Ethernet cable that only holds true for the few people who listened, I'd read audiophile reviews.

Sorry for the rant, but I was expecting you all to figuratively destroy this cable. Instead, we get this?

Doesn't matter - they don't need ammunition. Any debate about if the cables do anything is purely imagined. There is no debate - no one involved in the making and selling cares if there is a difference. The only people who care are those who see friends and family suckered into buying them. After people have spent the money they'll want to defend the product - this is normal - most people subconsciously want to convince themselves they have not been ripped off.

The cables exist because people will buy them. People buy them because the salesperson standing in front of them is more convincing than all the scientific studies they've never heard of.

Don't bitch about a poorly executed test. Go out and do your own testing and get the results where people will see them.

Again - not defending the network cables - which I suspect don't really do a thing - but I *have* heard the difference with changes in power cable, speaker cable (and sometimes connectors) and other minor things.

Have you passed ABX tests for each of these claims? If not then it's likely just placebo effect.

a) Couldn't you just assign a 0 for a failed guess, a 1 for a correct guess, and 0.5 for "can't tell/no difference"?

b) I agree there's an issue with a potential bias to NOT hear any difference. So, have an A/B/C/X test, where C is a slightly degraded version (e.g. down-sampled).

c) Try to induce some actual analog interference over a poorly shielded cable, to the point where you CAN hear the difference. Then test a properly shielded cheap cable and the expensive cable.

d) Try it out on both a cheap audio setup (e.g. as done, with a laptop) and with a really high-end audio system. See if the interference from (c) affects the high-end system (or is the DAC nicely shielded and rejects any such noise). See if the high-end system makes a difference.
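Suggestion (a) is simple to express. Here's a minimal sketch of that scoring rule (the session data is made up for illustration):

```python
from typing import Optional

def score_response(guess: Optional[str], truth: str) -> float:
    """Score one A/B/X response as suggested in (a): 1 for a correct
    pick, 0 for a wrong one, and 0.5 for "can't tell" (the expected
    value of a coin flip), so abstaining neither helps nor hurts."""
    if guess is None:              # subject declined to choose
        return 0.5
    return 1.0 if guess == truth else 0.0

# Hypothetical mini-session: (listener's guess, cable actually used for X)
responses = [("A", "A"), (None, "B"), ("B", "A"), ("A", "A")]
total = sum(score_response(g, t) for g, t in responses)  # 2.5 out of 4
```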

I thought I remembered from the earlier articles that different network cables made a difference when measuring the output with an oscilloscope (I'm skeptical it was grounded/shielded properly). Maybe there is a difference, but it's not perceptible to human ears? Obviously it would still be pointless to use the cables, but I'm curious as to whether there is any impact at all.

I'll be running the piece with the electrical testing of the cables—courtesy of Blue Jeans Cable—tomorrow around lunchtime, so stay tuned. Lots of graphs in that one.

That will be interesting in its own right, but to me the thing to test is whether or not this cable makes any difference in the bits as seen from the decoding software. You could go multiple ways with this, looking for network jitter, out of order packets, retransmit rate. Ultimately it is all buffered by the computer and reassembled before it even hits the software, so the only thing that matters is if the bits read into memory pass a checksum, but being that it's a data cable those things would be far more interesting than conductivity and other cable-oriented measurements. The cable measurements, if significant, would present themselves as losses like jitter and retransmits anyway.
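A sketch of that bit-level check: hash the source file and a copy fetched over each cable, then compare digests. The file paths shown are hypothetical placeholders, not anything from the actual test:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Digest a file in 1 MB chunks so large audio files needn't fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the master on the NAS vs. copies pulled over each
# cable. Matching digests mean the audio data arrived bit-identical,
# no matter which Ethernet cable carried the packets.
# assert sha256_of("/nas/track.flac") == sha256_of("/tmp/track_vodka.flac")
```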

Just out of curiosity, why can't this be tested more scientifically (if that is the right word)?

If this is a network cable, it should be possible to capture the bits going in one end and coming out the other and compare. I suppose audio noise would show up as packet losses and retransmissions, or something like that.

If the cables perform the same, the level of packet loss should be the same.

I know that packet sniffers and network testers can introduce noise, but if you are doing the same test with the same equipment, then this should balance out.

I'm really just curious here.

They are beyond that. The people who believe these cables help claim that it is not about bit loss or anything like that (they accept that the bits will arrive correctly); it is about the shielding of the cable now, i.e., that cheaper cables will somehow allow electrical interference to mess with the digital-to-analog conversion.

I'm 100% sure these cables are silver-plated snake oil, but as others have pointed out, this test was handled badly.

The right way to do the test would have been to ask for volunteers for a "hearing test". Tell them you're going to play two "very similar" recordings and see whether they can tell the difference. Heck, offer them $20 if they guess correctly, so they're incentivized to do their best. Then do the ABX test with a forced choice, so you can (statistically) pick up even subliminal differences. If the cables are useless, as they almost certainly are, the results should still be not significantly different than random chance.

However, now that the show is over, you can do one better: offer to go to Lavorgna’s house and use his own stereo system to perform the test. Bring along the AudioQuest cable and the cheapo Ethernet cable. Then blindfold him and do the ABX test over, say, 20 trials of randomly using either the AudioQuest cable or the cheapo, using his own choice of musical source material.

If he can perform significantly better than random, then maybe there is something interesting happening. Much more likely, he won't, and this should settle the matter once and for all.
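For a 20-trial forced-choice run like the one proposed, the pass mark works out as follows (stdlib Python; 0.05 is the conventional significance level, an assumption rather than anything stated here):

```python
from math import comb

def min_correct_for_significance(n: int, alpha: float = 0.05) -> int:
    """Smallest number of correct calls out of n two-way forced-choice
    trials whose one-sided binomial tail beats significance level alpha,
    assuming a pure guesser is right half the time."""
    def tail(k: int) -> float:  # P(X >= k) under pure guessing
        return sum(comb(n, i) for i in range(k, n + 1)) / 2**n
    return next(k for k in range(n + 1) if tail(k) <= alpha)

print(min_correct_for_significance(20))  # 15: the bar for a 20-trial run
```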

Just out of curiosity, why can't this be tested more scientifically (if that is the right word)?

If this is a network cable, it should be possible to capture the bits going in one end and coming out the other and compare. I suppose audio noise would show up as packet losses and retransmissions, or something like that.

If the cables perform the same, the level of packet loss should be the same.

I know that packet sniffers and network testers can introduce noise, but if you are doing the same test with the same equipment, then this should balance out.

I'm really just curious here.

They are beyond that. The people who believe these cables help claim that it is not about bit loss or anything like that (they accept that the bits will arrive correctly); it is about the shielding of the cable now, i.e., that cheaper cables will somehow allow electrical interference to mess with the digital-to-analog conversion.

If they really claim that, then they surely also swapped out all the other cables they were using for directional, shielded cables as well, right? I mean, a non-shielded cable connected to something that is not remotely related to your DAC is the same as any other. I hope they had shielded HDMI cables, etc. Or, you know, just make sure your computer is grounded properly and call it a day, like the rest of us.

Lee - Are you trying to give them ammunition? I realize they will never be satisfied with scientific testing, but this article reads like everyone involved just bumbled their way through this testing and/or wanted to help the case for the audiophiles. Let's give a third option in an ABX test that our statistician waffles on. Let's use a cheap Dell instead of some high-end media player so the audiophiles can blame the "low-end gear." Let's not swap out both cables so they can fall back on blaming that. Let's stop testing at six people. Let's do such a poor test that we can't even claim the results hold outside the sample set. The entire point of tests like this is to be able to make claims about a larger population than the sample set. If I wanted to read three pages about an Ethernet cable that only holds true for the few people who listened, I'd read audiophile reviews.

Sorry for the rant, but I was expecting you all to figuratively destroy this cable. Instead, we get this?

Totally agree. Somehow in this whole debacle it's Ars Technica and the James Randi Educational Foundation that end up looking like quacks.

I would love to see a retest with a better test methodology, a larger sample size, better audio equipment, and the second "standard" ethernet cable removed from the setup.

The "best" answer is to assume that the listeners were being honest and that if there were any audible differences, they would have heard and reported them regardless of priming or their own prejudices.

It isn't about 'honesty' of the test subjects. Preexisting belief directly impacts our ability to perceive. As does group membership. Your position of authority in the group will result in group members adopting your beliefs and that adoption of belief will directly impact their perception.

Why don't you play the same file, through the same outputs, with each set of cables, into a high quality recording device that turns it from analog back to digital and then perform statistical analysis on the data?

It's because "quantum physics". It is well known that by measuring the outcome, you change the outcome, which obviously means your measure would not be valid, simple as that.

Furthermore, for this type of ABX test, I think it would be best done with cases where some of the samples really do differ, e.g. MP3 at 64 kbps vs. lossless - but without telling the user which cases those are.

That lets you know who was influenced by the belief that there shouldn't be a difference and thus weren't listening for a difference (or people who have damaged hearing)

I think that level of rigorousness is considered unnecessary because the cables claim a plain advantage.

If a simple test can't ID a difference between the cables, then there's no plain advantage.

But if only 1 of 7 people can tell the difference, with 6 straight up saying they can't, then it's pretty plainly not possessed of a plain advantage.

If you want to debunk the claim that there's a plain advantage, I guess you have a point. However, the interesting question is if there COULD be any difference at all, or if this truly falls inside the realm of pseudoscience. In order to settle that question, a greater degree of rigorousness is absolutely required. Even allowing the subjects to pick "I don't know" clearly shows that the test was carried out by people with an agenda.

I can respect that agenda. I too want the magic properties of these cables thoroughly debunked. So please go ahead and do that, instead of the ridiculous farce of a test.

I could buy the idea that there would be some difference if analog audio was passing through the cable. But digital, I don't think so, even more when you consider that ethernet has CRC.

The only thing that could possibly affect this is a ground loop that makes it all the way through and into the analog channel after the DAC on the computer. The obvious fix is to use fiber, not copper, as the cable interconnect - it completely isolates the two components electrically. Of course, using a built-in DAC on a laptop is an audiophile's worst nightmare - there's no TUBES!

Most audiophile cables/devices are obviously phooey, and that can easily be proven through lab testing. If you insist on doing a human test to counter the ethereal "but I can tell!" factor, then do it correctly. Your on-hand statistician shouldn't realise he's wrong after the fact, especially not over something as obvious as that third variable.

Worse, it's not just your statistician's fault; it's an obvious flaw to anyone who's ever solved basic logic problems or has even a cursory understanding of the scientific method.

Science is Science. Faith is Faith. Audiophiles are fish in a barrel. How did you manage to miss so badly?!

Most audiophile cables/devices are obviously phooey, and that can easily be proven through lab testing. If you insist on doing a human test to counter the ethereal "but I can tell!" factor, then do it correctly. Your on-hand statistician shouldn't realise he's wrong after the fact, especially not over something as obvious as that third variable.

Worse, it's not just your statistician's fault; it's an obvious flaw to anyone who's ever solved basic logic problems or has even a cursory understanding of the scientific method.

Science is Science. Faith is Faith. Audiophiles are fish in a barrel. How did you manage to miss so badly?!

And for Deity's sake -- didn't these guys think about and work out both the methodology and the statistics well before staging the event? Or were they, perhaps, simply thinking more about putting on a good show, and demonstrating the "correct" answer to the audience (and the press), than about doing it right?

While I agree that audiophile Ethernet is a silly proposition, you guys are giving AudioQuest a hell of a lot of attention. They say there's no such thing as bad publicity. And when your product is a fraud anyway, I'd say any publicity given to it isn't going to hurt.

Bill Nye's debate with Ken Ham was the best thing to happen to the young earth movement. Simply engaging the believers lends credibility to their position. Same thing with the voodoo audiophile crowd.

The true believers don't care about the science, no matter how compelling it might be. You cannot convince them otherwise and trying to do so only strengthens their positions.

I used to be a creationist. It was the environment I grew up in, how I was raised.

Real scientists calling out the bunk are what eventually got me to see the light - especially cladistics.

Cladistics was claimed by creationists to be bunk but when I learned what it really was, it made a lot of sense to me. Not only did it make a lot of sense to me, but I saw how it gave us an opportunity to make predictions and test evolution in a practical way - and predictions made and tested passed.

So yes, engaging the "True Believers" can be beneficial, it can change them.

Many "true believers" are intelligent people and the bias of their former beliefs will erode when presented with the truth.

I'm a neuroscientist who studies human visual perception. From a rigorous point of view, ABX testing is a pretty bad way to do psychophysics, and allowing a participant to say they are uncertain is especially problematic. That's doubly true if your participants might be biased toward saying they are uncertain by their skepticism and your pre-trial presentation. Your statistician was correct to be nervous about that, but I think you also would have been better served by speaking with someone who does psychophysics, not a general statistician. There's almost two centuries' worth of best practices that have been built up for these kinds of studies, much of which is related to human behavior and not statistics per se.

If you guys do more things like this (which would be great!), I would suggest looking through some of the psychophysical studies published in places like Journal of Vision or Hearing Research to get a sense for the kinds of methods that are used to find out if humans can perceive the differences between two signals.

As an example, a better method would have been a 3 interval forced choice procedure where in each trial you play back a small song segment three times. Two of the playbacks use one cable and the third uses the other. The order of the playbacks is random. The participant's task is to select the segment that sounds different from the others. If the participants can't tell the difference, they will select the odd-cable-out 1/3 of the time, since they'll just be guessing. Here's a study that uses a procedure like this to examine if patients can detect different channel activation patterns in a cochlear implant.
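The trial-construction logic for that 3-interval oddity task is straightforward. A minimal sketch (the cable labels are placeholders, not the cited study's code):

```python
import random

def make_3afc_trial(rng: random.Random) -> tuple:
    """Build one 3-interval oddity trial: two playbacks use one cable,
    one uses the other, in random order. Returns (sequence, odd_index).
    A listener who hears no difference is right 1/3 of the time."""
    majority, odd = rng.sample(["cableA", "cableB"], 2)
    sequence = [majority, majority, odd]
    rng.shuffle(sequence)
    return sequence, sequence.index(odd)

seq, answer = make_3afc_trial(random.Random(1))
```

Because the chance rate is 1/3 instead of 1/2 and there is no opt-out, fewer trials are needed to separate real perception from guessing.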

I'll also point out that it's completely irrelevant whether there are detectable differences between the two cables via electronic analysis of the signals they provide to the DAC. I mean, sure, you might expect that to be a prerequisite for producing any difference in perception, but the former certainly doesn't entail the latter.

The embarrassing and amateur nature of the test setup itself has been covered pretty well already, I think, so I'll leave that, but there is another point which could be made regarding the test subjects.

If you're judging high cuisine, you don't just grab Steve and Dave off the street, because they won't have the sophisticated palate required to detect all the subtle nuances of the dish. You bring in top chefs, high-end food critics, etc., to do the judging.

I don't for one second believe that these cables do anything other than lighten the wallets of the gullible but for a true test you should use the 'audiophiles' as the subjects. At the very least use some real musicians who would be able to detect any difference if it should happen to be there.

You have to look at what you're really trying to achieve. What data are you actually trying to transfer?

A regular audio CD (16-bit, 44.1 kHz), as completely uncompressed raw sample data, has a throughput of about 176 KB/s. Even "studio quality" sampling standards like 96 kHz at 24 bits per sample will still require under 600 KB/s for an uncompressed stereo stream.

Any means that enables you to get that amount of data correctly transferred will do. In the case of a gigabit Ethernet connection, even high-quality uncompressed audio will require less than 1% of the maximum throughput of your medium. So even if 99% of your packets came through corrupted and had to be resent, you would STILL have a perfect audio stream at the end!
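The arithmetic checks out; as a quick sanity check in Python (line rate ignores Ethernet framing overhead, which only changes the percentage slightly):

```python
def pcm_bytes_per_second(sample_rate_hz: int, bits: int, channels: int = 2) -> float:
    """Raw PCM throughput for an uncompressed audio stream."""
    return sample_rate_hz * (bits / 8) * channels

cd = pcm_bytes_per_second(44_100, 16)      # 176,400 B/s for CD audio
studio = pcm_bytes_per_second(96_000, 24)  # 576,000 B/s for 96 kHz/24-bit
gigabit_bytes = 1_000_000_000 / 8          # 125,000,000 B/s raw line rate
print(f"{studio / gigabit_bytes:.2%}")     # 0.46% of gigabit Ethernet
```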

The embarrassing and amateur nature of the test setup itself has been covered pretty well already, I think, so I'll leave that, but there is another point which could be made regarding the test subjects.

If you're judging high cuisine you don't just grab Steve and Dave off the street because they won't have the sophisticated palate required to detect the all the subtle nuances of the dish. You bring in top chefs, high end food critics etc to do the judging.

I don't for one second believe that these cables do anything other than lighten the wallets of the gullible but for a true test you should use the 'audiophiles' as the subjects. At the very least use some real musicians who would be able to detect any difference if it should happen to be there.

Interestingly, in many ABX studies I have seen (tests for lossy transparency at various bitrates), self-proclaimed audiophiles frequently do worse than people off the street.

The theory I have seen proposed as to why is that audiophiles may have damaged their hearing by listening to their music too loud too often.

That's why I suggested a test like this include some trials with a known audible difference. It lets you screen out the people who can't distinguish samples that genuinely differ, so you can analyze the results of people who don't have bad hearing.

Hearing starts to decline once you are an adult. For most adults, lossy music is transparent at much lower bitrates than they think.

Foobar2000 has some built-in ABX stuff. Give it a try on your own system; you may be surprised at the results.

As an example, a better method would have been a three-interval forced-choice procedure in which each trial plays back a small song segment three times. Two of the playbacks use one cable and the third uses the other, and the order of the playbacks is random. The participant's task is to select the segment that sounds different from the others. If participants can't tell the difference, they will select the odd cable out 1/3 of the time, since they'll just be guessing. Here's a study that uses a procedure like this to examine whether patients can detect different channel activation patterns in a cochlear implant.

Good point. Basically, the ABX test here had two issues that audiophiles could (and did) take issue with:
- the "I do not know" option is a sure way for skeptics to influence the test
- at best, this test would prove that the *majority* of people could not hear a difference. I believe audiophiles will argue it is enough if *some* people can reliably hear a difference (since their position is that audiophiles hear better and are a small subset of the population)

My initial idea was to remove the "I do not know" answer and repeat the test for EACH individual enough times that guessing right every time becomes statistically improbable - for example, the chance to guess all 8 tests correctly is <0.4% (1/2^8).

This has the downside of making the test much longer, so it's probably not suitable for a live audience - testing 20 people would mean playing audio 20 people * 8 runs * 3 clips = 480 times.

The random three-interval test suggested above is an even better idea, since it skips playing the 'AB' part - the test subject just needs to find the difference and does not need to hear both versions up front. Also, each triplet of played audio cuts the chance of guessing by 3x, instead of just 2x in the standard ABX above. So to achieve a similar ~0.4% chance of guessing everything right (1/3^5 ≈ 0.41%), it would need just 5 runs per person, or 20*5*3 = 'just' 300 playbacks - 37.5% fewer than the 480 above - and it is more scientifically sound (no introduction to 'AB' means less chance for prejudice).
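The session arithmetic above can be restated as a quick sketch (same assumptions as the comment: 20 subjects, 8 ABX runs versus 5 three-interval runs; variable names are my own):

```python
# Guess probabilities and total playback counts for the two protocols.
abx_guess_prob = (1 / 2) ** 8    # all 8 ABX runs guessed right: ~0.39%
afc3_guess_prob = (1 / 3) ** 5   # all 5 oddball runs guessed right: ~0.41%

abx_playbacks = 20 * 8 * 3       # each ABX run plays A, B, then X
afc3_playbacks = 20 * 5 * 3      # each three-interval trial plays 3 segments

print(f"ABX:  {abx_playbacks} playbacks, guess chance {abx_guess_prob:.2%}")
print(f"3AFC: {afc3_playbacks} playbacks, guess chance {afc3_guess_prob:.2%}")
```

The three-interval version reaches roughly the same guessing probability with 300 playbacks instead of 480.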

Quote:

I'll also point out that it's completely irrelevant if there are detectable differences in the two cables via electronic analysis of the signals they provide to the DAC. I mean sure, you might expect that to be a prerequisite to producing any difference in perception, but the former certainly doesn't entail the latter.

I would expect that if there are no detectable differences between the two cables at the electrical level, there should be no difference in perception.

The other way around does not hold: if there is *some* difference at the electrical level, that does not mean it can be noticed, and then human tests would be required.

For a resolutely practical & balanced consideration of "Objective VS. Subjective" Ross Miller provides a brief, but very worthwhile perspective at Hydrogenaudio vs. Stereophile. It amounts to 2 or 3 minutes well spent for anyone attracted to this topic.

From that article -

Quote:

These charts provide incontrovertible objective evidence that CD players by their very digital nature butcher music

which is a bunch of bologna. He doesn't understand the math behind sine waves and what happens when waves combine.

What the Nyquist-Shannon sampling theorem proved, long before digital audio, is that if you sample at twice the frequency of the highest-frequency sine wave among the basic sine waves that add up to the more complex waveform, those discrete samples are sufficient to perfectly reproduce the waveform being sampled.

If there are frequencies that are higher than half your sample rate you will get aliasing, but if the highest frequency is <= half the sample rate, you can perfectly reproduce the original wave form from those discrete samples.
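Aliasing is easy to demonstrate numerically. A minimal sketch (the 30 kHz and 14.1 kHz frequencies are my own illustrative picks, not from the comment):

```python
# Aliasing sketch: a 30 kHz tone sampled at 44.1 kHz produces the exact
# same sample values as a phase-inverted 14.1 kHz tone, because 30 kHz
# lies above the Nyquist limit of FS / 2 = 22.05 kHz.
import math

FS = 44_100            # CD sample rate
F_HIGH = 30_000        # above Nyquist
F_ALIAS = FS - F_HIGH  # 14 100 Hz alias

for n in range(1000):
    t = n / FS
    high = math.sin(2 * math.pi * F_HIGH * t)
    alias = -math.sin(2 * math.pi * F_ALIAS * t)
    assert abs(high - alias) < 1e-9  # sample-for-sample identical

print("30 kHz and 14.1 kHz tones are indistinguishable at fs = 44.1 kHz")
```

This is exactly why the anti-aliasing filter discussed below has to remove everything above half the sample rate before digitization.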

Now with digital music, the sample values themselves are also discrete. They are not the precise amplitude of the wave at that point. This does result in a slightly different waveform.

The higher the number of bits per sample, the closer to the original waveform you will get. That difference is noise.

With 16-bit audio at levels that are safe to listen to without causing hearing damage, that noise is below the threshold of human hearing.

You can try it yourself: resample a Red Book audio track to 8 bits instead of 16 and you will hear the noise; it sounds like tape hiss.
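The 8-bit-versus-16-bit difference can also be measured rather than listened to. A rough sketch (the function and test signal are my own; textbook rule of thumb is about 6 dB of signal-to-noise per bit):

```python
# Quantization-noise sketch: quantize a full-scale sine to n bits and
# measure the signal-to-noise ratio. Expect roughly 6.02*n + 1.76 dB,
# so 16-bit audio sits near 98 dB while 8-bit sits near 50 dB (audible hiss).
import math

def quantization_snr_db(bits, num_samples=100_000):
    levels = 2 ** (bits - 1)  # signed quantizer half-range
    signal = noise = 0.0
    for n in range(num_samples):
        x = math.sin(2 * math.pi * n / 1000.003)    # non-integer period
        q = round(x * (levels - 1)) / (levels - 1)  # quantize and rescale
        signal += x * x
        noise += (x - q) ** 2
    return 10 * math.log10(signal / noise)

print(f" 8-bit SNR: {quantization_snr_db(8):.1f} dB")   # ~50 dB
print(f"16-bit SNR: {quantization_snr_db(16):.1f} dB")  # ~98 dB
```

Each extra bit buys roughly 6 dB, which is why dropping from 16 to 8 bits makes the noise floor jump from inaudible to tape-hiss territory.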

As the human ear is not capable of hearing audio above 20 kHz, a 44.1 kHz sample rate with 16 bits per sample is enough information to produce an analog wave that we cannot distinguish from the original.

To avoid aliasing we do need a cut-off filter to keep frequencies above 20 kHz out of the audio being digitized. Those cut-off filters operate during the analog stage of the ADC and can introduce distortion into the high end, which is why recording studios record at even higher sample rates.

Recording at a higher sample rate pushes the analog distortion caused by the cut-off filters well outside our hearing range. The audio can then be digitized at that higher rate and the higher frequencies removed digitally, leaving us with a perfect representation of the frequencies we can hear.

The bits per sample are also higher during recording and mastering to avoid buildup of noise (hiss) that we can hear as various filters are applied during mastering.

But the final waveform at 44.1 kHz and 16 bits per sample is as good as we are capable of hearing, so suggesting that CDs butcher music is a load of crap.

Any butchering that happens is down to poor mastering and/or a poor DAC when decoding back to analog; it is not the digital CD format.

I made this comment on the cable teardown test article, but I thought it worth repeating. Vodka in the USA is legally neutrally flavoured with a minimum percentage of ethanol. That means that vodka brands have to differentiate themselves through perception, marketing, distribution, etc., and they can't, legally, market vodka based on difference in flavour. These 'Vodka' ethernet cables are the same, they either meet spec or they don't. Exceeding spec will make no difference in data transmission.

If, from a statistically significant number of trials, Michael Lavorgna can correctly determine the X 85% of the time, then offer him the million bucks. If he can't, then he publicly admits he is full of shit.

Who thinks he accepts that challenge?
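For what it's worth, an 85% hit rate would be hard to fake by luck. A quick binomial-tail sketch (the 20-trial count is my own illustrative choice, not from the comment):

```python
# Probability of scoring 85% or better by pure guessing on a two-choice test.
import math

def p_at_least(correct, trials, p_guess=0.5):
    """Probability of `correct` or more successes in `trials` fair guesses."""
    return sum(math.comb(trials, k) * p_guess**k * (1 - p_guess)**(trials - k)
               for k in range(correct, trials + 1))

trials = 20
needed = math.ceil(0.85 * trials)  # 17 of 20
print(f"P(>= {needed}/{trials} by guessing) = {p_at_least(needed, trials):.5f}")
# ~0.0013, i.e. about 1 in 776
```

So even a modest run of trials would make an 85% score essentially impossible to achieve by chance alone.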

Michael Lavorgna is full of shit.

Don't expect to see him ever submit to a challenge that would demonstrate that he's full of shit. If ever he did submit to a challenge proving him to be full of shit, he would still never admit to being full of shit.

I'm disappointed. I know there should be no difference between these two cables if the devices are properly grounded, but all this test proved is that there is no clear difference between the two. I would have loved to see this same test done with golden-eared testers. I know it wasn't necessary for the question you were asking, but I think you were asking the wrong question.