"But there is something more. Itís known by signal processing experts, and absolutely not popularized amongst music lovers, that converting an analog signal into a discrete-time one (as it happens when converting from analog to digital) destroys the phase information in the two top octaves of the resulting spectrum. In a CD-standard digital recording, all phase information are lost from 5.5kHz up to 22kHz,"

Can any signal processing experts comment on whether this is true or not, and if so, why? As a mere layman I am not able to understand why it might even seem to be true.
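
One way to sanity-check the claim numerically (just a minimal NumPy sketch; the 15 kHz tone, CD sample rate and phase value here are arbitrary choices of mine): take a tone well inside the supposedly "lost" 5.5-22 kHz band, sample it, and see whether its phase can be recovered from nothing but the samples.

import numpy as np

fs = 44100.0           # CD sample rate
f0 = 15000.0           # a tone inside the supposedly "lost" 5.5-22 kHz band
true_phase = 1.234     # radians; the value we will try to recover
n = np.arange(2048)

x = np.cos(2 * np.pi * f0 * n / fs + true_phase)   # the sampled signal

# Least-squares fit of a*cos + b*sin at f0 recovers both amplitude and phase
c = np.cos(2 * np.pi * f0 * n / fs)
s = np.sin(2 * np.pi * f0 * n / fs)
a, b = np.linalg.lstsq(np.column_stack([c, s]), x, rcond=None)[0]

print(f"true phase:      {true_phase:.6f} rad")
print(f"recovered phase: {np.arctan2(-b, a):.6f} rad")

The recovered phase matches the true phase essentially to machine precision, which is what the sampling theorem would lead you to expect: below Nyquist, sampling by itself discards neither magnitude nor phase.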

Thanks everyone, I thought it seemed like nonsense but I wanted to check in case it referred to some little-known effect. I can understand that analog brickwall filters could mess with phase, but I am relieved that there does not appear to be an insurmountable problem.

I may take this up with m2tech.

Above some frequency in what we call midrange, the ears lose their ability to discern phase. This is because our ears are built like spectrum analyzers, but only what we call the real portion or amplitude is conveyed to the brain. If you wish to obtain phase information from a spectrum analyzer you need two independent kinds of (quadrature) data for each frequency band. Our ears only pass one kind of information to the brain for each frequency band above medium frequencies.
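
A loose numerical illustration of that quadrature point (a toy analyser only, not a model of the ear; the band centre and analysis method are my own choices): the magnitude of a band comes out the same whatever the phase is, and the phase can only be recovered if you keep both the in-phase and the quadrature products.

import numpy as np

fs = 44100.0
f0 = 8000.0                      # analysis band centre, above the midrange
t = np.arange(4096) / fs

def analyse(phase):
    """Correlate a tone of the given phase with an in-phase and a
    quadrature reference -- the two 'kinds of data' mentioned above."""
    x = np.cos(2 * np.pi * f0 * t + phase)
    i = 2 * np.mean(x * np.cos(2 * np.pi * f0 * t))   # in-phase product
    q = 2 * np.mean(x * np.sin(2 * np.pi * f0 * t))   # quadrature product
    magnitude = np.hypot(i, q)      # comes out the same for every input phase
    recovered = np.arctan2(-q, i)   # recovering phase needs BOTH i and q
    return magnitude, recovered

for ph in (0.0, 1.0, 2.0):
    mag, rec = analyse(ph)
    print(f"input phase {ph:.1f} rad -> magnitude {mag:.3f}, recovered phase {rec:.3f} rad")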

One consequence of this loss of phase information in our ear/brain interface is that massive amounts of phase shift (e.g. 1,000 degrees or more) can be applied to critical high-resolution audio signals with no discernible change in perception. The only caveat is that the phase shift applied to both channels must be essentially the same, or else the phase shift will turn into response changes that will be audible. Above 5 kHz this effect dominates completely.
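
To put numbers on that, here is a sketch (the allpass cascade and the 0.6 coefficient are arbitrary choices of mine, not anything from a particular player): a cascade of first-order allpass sections produces well over 1,000 degrees of shift with a flat magnitude response, but summing a shifted channel with an unshifted one turns that same shift into deep response notches.

import numpy as np

fs = 44100.0
f = np.linspace(20.0, 20000.0, 2048)
z = np.exp(-1j * 2 * np.pi * f / fs)       # z**-1 evaluated over the audio band

# Ten cascaded first-order allpass sections: large phase shift, flat magnitude
a = 0.6
H = ((a + z) / (1 + a * z)) ** 10

phase_deg = np.degrees(np.unwrap(np.angle(H)))
print("magnitude deviation from unity:", np.max(np.abs(np.abs(H) - 1)))
print(f"phase shift at 20 kHz: {phase_deg[-1]:.0f} degrees")

# Apply the same shift to both channels and a mono sum is just H*(L+R), so the
# magnitude response is untouched.  Shift only ONE channel and sum, and the
# inter-channel difference turns into deep comb-filter response changes:
mix = 0.5 * (1 + H)
print(f"worst-case dip in the unmatched sum: {20 * np.log10(np.min(np.abs(mix))):.1f} dB")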

Even the old CD players with analog filters had fairly well matched channels, so while what those filters did to phase was not numerically pretty, there wasn't any serious effect.

Another ugly thing that old CD players sometimes did was to share the same DAC between the two channels, so that their outputs were half a sample period apart. If you electrically summed the two channels, this led to a minor frequency response roll-off that on a really good day might be mildly audible.
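
For a rough sense of scale (assuming a 44.1 kHz rate and an ideal half-sample offset), summing a channel with a copy delayed by half a sample period gives a magnitude of cos(pi*f/(2*fs)), which works out to a couple of dB of droop by 20 kHz:

import numpy as np

fs = 44100.0
f = np.array([1000.0, 10000.0, 16000.0, 20000.0])   # spot frequencies in Hz

# Sum of a channel and a copy delayed by half a sample period, 1/(2*fs)
H = 0.5 * (1 + np.exp(-2j * np.pi * f * 0.5 / fs))
for freq, db in zip(f, 20 * np.log10(np.abs(H))):
    print(f"{freq / 1000.0:5.1f} kHz: {db:6.2f} dB")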

"massive amounts of phase shift (e.g. 1,000 degrees or more) can be applied to critical high-resolution audio signals with no discernible change in perception. The only caveat is that the phase shift applied to both channels must be essentially the same, or else the phase shift will turn into response changes that will be audible."

That's what I would have said if you hadn't beaten me to it. The only other time phase shift is audible, even when the channels are matched, is when the amount of shift is actively changing. My AES Audio Myths Workshop video demonstrates this; you can jump ahead to that part of the video at 47:45.