I've been reading all over the place that higher sampling rates make zero difference to playback 'quality', yet I can hear a very clear difference between CD audio at 16/44.1 and DVD audio at 24/48... I've also noticed a very clear difference between 24/48 and 24/96 using the same albums. A strategy I use to gauge the difference is to pick out an instrument in the higher ranges (mostly cymbals, or auxiliary percussion like shakers) that is typically dull, or possibly inaudible to most people, in a low-quality digital audio file. I then set my Windows audio output to each file's native format (for example, PCM WAV CD rips at 16-bit/44.1 kHz and digitally downloaded PCM WAV "HD" tracks at 24-bit/96 kHz) and listen through foobar2000.
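As a sanity check on the files themselves, one thing I can do is measure how much energy each file actually carries up near the top of its range, since my CD rips and "HD" downloads are separately sourced. A minimal sketch, assuming Python with scipy and soundfile installed (the filenames are placeholders for my actual files):

```python
import soundfile as sf
from scipy.signal import welch

for path in ["cd_rip_16_44.wav", "hd_download_24_96.wav"]:
    data, sr = sf.read(path)
    if data.ndim > 1:
        data = data.mean(axis=1)  # fold stereo to mono for a quick look
    freqs, psd = welch(data, fs=sr, nperseg=8192)
    # fraction of total power sitting above 20 kHz, where the formats differ most
    hf = psd[freqs > 20000].sum() / psd.sum()
    print(f"{path}: {sr} Hz, fraction of power above 20 kHz = {hf:.2e}")
```

If the 96 kHz file shows essentially nothing above 20 kHz, then whatever I'm hearing isn't coming from extra ultrasonic content.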

During testing I often find that in audio with higher sample rates (48 and 96 kHz specifically), these instruments don't cut out as early as they do in their lower-quality counterparts, and the overall sound comes through clearly with much less fatigue, even without an amplifier. Ride cymbals, for example, have more of a "boom" to them (as if they have more presence across different frequencies), and reverb doesn't decay as quickly at 24/96 output as it does at 24/48 when playing the same 24/96 file. The number one difference I can hear between 48 and 96 kHz is that the highs sound... well, higher. I don't want to focus so much on the difference between 16-bit and 24-bit, though, because I imagine the average person can hear the difference between those.
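One way I'm thinking of ruling out the possibility that my CD rip and the "HD" download are simply different masters: downsample the 24/96 file myself and compare that copy against the original, so both versions come from the exact same source. A rough sketch under the same assumptions as above (the filename is again a placeholder, and note this truncates to 16-bit without dither, which a proper conversion tool would handle more carefully):

```python
import soundfile as sf
from scipy.signal import resample_poly

data, sr = sf.read("hd_download_24_96.wav")
assert sr == 96000, "expects a 96 kHz source"
# 96000 -> 44100 reduces to 147/320 after dividing out gcd(96000, 44100) = 300
down = resample_poly(data, up=147, down=320, axis=0)
# write a 16/44.1 copy derived from the same master as the 24/96 original
sf.write("same_master_16_44.wav", down, 44100, subtype="PCM_16")
```

If the downsampled copy still sounds duller than the 24/96 original in a level-matched comparison, the conversion itself is implicated; if it sounds the same, the difference I've been hearing probably lives in the source files.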

So I've come to the conclusion that I'm definitely hearing a difference between these sampling rates and not simply imagining it (as most 'experts' claim). However, what I'm wondering is:

1. Are these higher sample rates actually bringing out otherwise hidden 'quality' in digital audio sources?
2. Or am I just hearing a type of loss/distortion (or some other playback effect) between audio of different sampling rates?