Originally posted by Netlist
Mike,
It would be nice to hear what you found.

/Hugo

Hugo, I did that test, but there was something odd...
The RMAA measurement shows that the signal did not get degraded. I ran RMAA on its own supplied .wav and on the same file burnt to CD, captured through S/PDIF.

But subtracting the two waves (the original and the captured) did not result in exactly zero. Some strange 2-bit noise was left, repeating every 2.7 seconds with a gap of silence of ~0.7 secs. This noise had no relation to the original signal.
I am paranoid enough to suspect that some software/hardware added a watermark...

Found the source of the odd noise... I did not compare exactly the same files: the wave file RMAA puts out and the burned one were not identical. RMAA adds white noise to each test pattern, presumably to "improve" the FFT? This gives a different wave file each time one gets generated.
Luckily, I found the original wave file that was burnt to CD.
The result: perfect digital zero.

So: wave file generated -> burnt to CD -> played from the player -> recorded via S/PDIF. The result is a perfect digital copy; subtracting the recording from the original wave file gives complete zero, not a single bit lost!
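The "subtract and check for digital zero" step is just a sample-by-sample comparison of the two captures. A minimal Python sketch, assuming 16-bit PCM WAV files (the filenames and function names are placeholders, not anything RMAA provides):

```python
# Minimal "null test" sketch: compare two WAV files sample by sample.
# Assumes 16-bit PCM; a perfect digital copy yields an all-zero difference.
import wave
import array

def read_samples(path):
    """Read all samples of a 16-bit PCM WAV file as signed integers."""
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "expected 16-bit PCM"
        frames = w.readframes(w.getnframes())
    return array.array("h", frames)

def null_test(path_a, path_b):
    """True if both files contain bit-identical sample data."""
    sa, sb = read_samples(path_a), read_samples(path_b)
    if len(sa) != len(sb):
        return False
    # The difference signal is zero everywhere iff every sample matches.
    return all(x == y for x, y in zip(sa, sb))
```

In practice the capture must first be aligned to the original (the recording starts at an arbitrary offset); once aligned, a bit-perfect chain gives `null_test(...) == True`.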

So much for the voodoo that error correction replaces samples, that burning CDs changes the data, and so on. (Burnt onto the cheapest CD-R available, one year ago.)

BTW, my S/PDIF coax cable was a 2-meter plain RCA cable, nothing special or high quality.
The only real thing behind the digital voodoo is jitter; no data is lost or changed anywhere.

Yes, the soundcard is a nice one. I bought it specifically for measurements and absolutely wanted S/PDIF coax in/out. Measured THD of this soundcard is ~0.001%.

I hope this test can end many of the debates...
The results are plain, measured effects: no data gets lost or changed in the digital chain. I'm not sure what's left to argue about. Identical data is identical data, even without high-performance hardware.

You've proved the strength of error correction, but do some still maintain that audible differences in digital playback are due to sample errors? (BTW, I can't hear a difference between my Toslink or coax connection to a Benchmark DAC.)

That's exactly what I wanted to demonstrate: there is no sample error. If there is an audible difference, it is not related to data errors; jitter is the only thing that seems to be responsible for it.
I intentionally used mainstream hardware; if this is able to deliver all the data without a single error, high-end hardware really should have no problem with that.

Jitter has no effect in this test, as only the transferred data are compared. (Just like downloading from the internet.)

Error correction typically takes place when bits are decoded from the disc, but obviously no problem occurred during my test. (With a cheap CD-R.)

Originally posted by soongsc Very interesting experience. I think if you are comparing on the computer visually, then you compare the data samples, and if the timing is aligned, there is no difference. But when you copy, I am not sure whether the data is in the same sequence or whether the burn quality is the same. If the burn quality is not the same, might there be some resampling until the data gets read without error? This would affect the power requirements and all the interrelated effects, wouldn't it?

There will surely be burning or reading errors, often thousands on a CD. The point is that the error correction corrects all of this 100% (unless uncorrectable errors cause error concealment, but that's really rare these days). The fact that Hugo saw exact copies is a testimony to how good the CD system is, even with many, many errors occurring.
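The principle behind "thousands of read errors, yet bit-perfect output" can be illustrated with a toy error-correcting code. CDs actually use the far stronger CIRC (cross-interleaved Reed-Solomon) scheme, but a Hamming(7,4) code shows the same idea on a small scale: any single flipped bit is located and repaired exactly, restoring the original data.

```python
# Toy illustration of error correction: a Hamming(7,4) code fixes any
# single flipped bit exactly. This is NOT the CD's CIRC code, just the
# same principle in miniature.

def hamming_encode(d):
    """Encode 4 data bits into a 7-bit codeword with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Codeword layout (1-based positions): p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Decode a 7-bit codeword, correcting up to one flipped bit."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3   # syndrome: 0 = no error, else 1-based bit position
    c = c[:]
    if pos:
        c[pos - 1] ^= 1          # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]
```

Flip any one of the seven bits of a codeword and `hamming_decode` still returns the original four data bits, which is exactly why raw read errors never have to reach the audio output.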

Originally posted by I_Forgot You didn't compare an analog signal to a digital one as the title of the thread suggests. You compared two digital signals, one generation apart via the CDR. Your source could just as well have been a commercial CD as an analog disc.

It would be more interesting to compare an analog disc playback to a commercial CD of the same music. Then you can see/hear the difference between the analog and digital playback mechanisms.

I_F

I don't think that exists. You may have the same piece of music on LP and CD, but I would bet that the mastering and mixing and recording would be done differently. In fact, it is my belief (which I cannot prove, but Hugo's and MikeB's findings support it) that the difference between LP and CD is SOLELY due to mastering/mixing etc. differences and NOT due to any problems with either medium.

Originally posted by MikeB That's exactly what I wanted to demonstrate: there is no sample error. If there is an audible difference, it is not related to data errors; jitter is the only thing that seems to be responsible for it.
I intentionally used mainstream hardware; if this is able to deliver all the data without a single error, high-end hardware really should have no problem with that.

Jitter has no effect in this test, as only the transferred data are compared. (Just like downloading from the internet.)

Error correction typically takes place when bits are decoded from the disc, but obviously no problem occurred during my test. (With a cheap CD-R.)

Mike

Mike,

Didn't you just prove that S/PDIF jitter also has no effect? S/PDIF is, in essence, an analog signal.

Hugo, Mike, this is one of the most relevant threads I have ever seen here (no offense to others, including myself). This really furthers our understanding. Thank you for taking the trouble to figure this out.