Is Loud Music Better?

Is music really more likely to sell if it's louder? We come up with some hard evidence — and some surprising answers.

SoundOut: the ideal platform for loudness tests?

The last 15 years or so have seen mastering engineers put into an increasingly uncomfortable position by their clients. On the one hand, it's their job to make music sound as good as it can when it reaches the listener on the final delivery medium. On the other, it's considered commercial suicide to release a good-sounding CD, if that means one that preserves a wide dynamic range. Many in the music business see brute loudness as overriding all other sonic considerations in creating a commercially viable product. Mastering engineers are thus forced to abuse peak limiting, clipping and the other tools of their trade, not to make music sound better but to make it louder.

But is there actually any evidence that louder CDs are more likely to sell? If there is, it hasn't found its way to SOS Towers. Of course, some of the albums that raised the threshold of acceptable RMS level have been big hits — one thinks of Oasis's What's The Story Morning Glory?, the Red Hot Chili Peppers' Californication, and so on — but it's impossible to say how many copies those albums would have shipped had they been mastered less loud. (I've seen it argued that Californication's extreme mastering aided its rise to the top, but also that its mastering has been responsible for surprisingly poor ongoing sales as a catalogue album. Who can say?)

It's not easy to find a way to test objectively the idea that louder tracks have more commercial appeal, but late last year I hit upon something that seemed to offer a way of doing so. Two months ago, in SOS January 2011, I reviewed a service called SoundOut (read the review online at /sos/jan11/articles/soundout.htm), which places your music anonymously before disinterested peer reviewers on the Internet. Each track you submit generates at least 80 reviews, which are then statistically analysed in various interesting and informative ways.

SoundOut is the nearest thing there is to a completely 'blind' service for having your music evaluated by other human beings. And from the reviewer's point of view, tracks by different artists appear in a random order. It's like listening to an MP3 player in Shuffle mode. Your track is presented within a sequence of other tracks — and, unlike on the radio, there's no broadcast compression to even out relative levels, and no DJs or jingles between tracks to interrupt. In other words, if there is any sort of link between the loudness of a track and how much listeners will like it, you'd expect SoundOut ratings to illustrate this link particularly clearly.

With the kind co-operation of SoundOut, I set out to test this. We chose three tracks in different musical styles, and created three different versions of each: the unmodified original mix, a second version that was simply reduced in gain by 3dB, and a third version that was 6dB down. This gain reduction was the only difference between the three versions — we were not taking any advantage of the potential sonic improvements that could have been made by 'properly' mastering to a lower level. We then commissioned Standard SoundOut reports on each of the nine tracks, meaning that each would be auditioned and rated, in a random context, by 80 reviewers.
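As a reminder of the arithmetic involved: a level change in decibels corresponds to a linear amplitude multiplier of 10^(dB/20), so the -3dB and -6dB versions carry roughly 0.71 and 0.50 times the sample amplitude of the original. The sketch below is purely illustrative Python (not SoundOut's or our own tooling), assuming floating-point audio samples:

```python
import math

def db_to_gain(db: float) -> float:
    """Convert a decibel change to a linear amplitude multiplier."""
    return 10 ** (db / 20)

def attenuate(samples, db):
    """Return a copy of the samples with the given gain change applied."""
    g = db_to_gain(db)
    return [s * g for s in samples]

# A -6dB change is, for practical purposes, a halving of amplitude:
print(round(db_to_gain(-6.0), 3))  # 0.501
print(round(db_to_gain(-3.0), 3))  # 0.708
```

This is why the quietest versions in the test can fairly be described as being at half the level of the originals.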

The results — summarised in the table — were as fascinating as they were, to me at least, unexpected. It should be remembered, first of all, that there is a natural degree of variation in the results of SoundOut tests. In other words, if you fed the exact same track into the system several times, the results would not be absolutely identical each time: SoundOut quote a confidence level of 95 percent, meaning that you could expect multiple reports on the same track to differ by up to five percent. It should also be pointed out that the Overall Track Rating is not the only important result that a SoundOut report delivers. The Market Potential Rating gives a measure of a track's standing within the stated genre, while the Passion Rating tells you to what extent your track was loved rather than merely liked. These two measures can be more significant than the Overall rating, especially in niche genres.
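To make that margin concrete: two ratings for the same track should only be treated as genuinely different if the gap between them exceeds the normal run-to-run variation. A purely illustrative check (the function name is mine, and for simplicity I am treating the five percent figure as five rating points):

```python
MARGIN = 5.0  # SoundOut's quoted repeat-test margin, in rating points

def significantly_different(rating_a: float, rating_b: float,
                            margin: float = MARGIN) -> bool:
    """True only if the gap exceeds normal run-to-run variation."""
    return abs(rating_a - rating_b) > margin

# Ratings of 72 and 75 are indistinguishable; 72 versus 85 is a real gap.
print(significantly_different(72, 75))  # False
print(significantly_different(72, 85))  # True
```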

With that in mind, it's interesting to note that the differences in Overall Track Rating in every case fall within that five percent margin of normal variation. In other words, the overall ratings for the 0dB, -3dB and -6dB versions of each track were so close that, when you factor in normal statistical variation, it is not possible to separate them. Or to put it another way: in every case, halving the level of a track (a 6dB cut) made no significant difference to how highly it was rated overall.

When we turn to the Market Potential Rating, things get a bit more complicated. With our pop/rock test track, the difference again fell within the bounds of normal statistical uncertainty. With the jazz track, by contrast, the loudest version scored significantly higher than the -3dB version, though the rating then held steady between the -3dB and -6dB versions. Our chosen electronica track, however, turned things completely on their head, with the -6dB version rated a massive 13 percent higher than the loudest version!

The Passion Rating category, meanwhile, produced different results again. Our electronica track remained consistent across the three versions, while both of the others showed a significant boost for the loudest version, but no difference between the -3 and -6dB tracks.

What are we to make of these results? Of course, we can't draw too many general conclusions from a test involving only three tracks, especially when the results seem contradictory in some respects. And, as I've already pointed out, this test takes no account of several factors that, in the real world, might actually work in favour of quieter mixes, such as their not being compromised by clipping and hard limiting, being less fatiguing in the context of an entire album, and coming through broadcast processing on radio better.

Much remains murky, but what does seem clear is that if there is any sort of relationship between how loud music is and how much people like it, that relationship is much more complex than might be expected. That it wouldn't be consistent across all styles of music could perhaps be predicted — but who would have expected a positive link between loudness and likeability in jazz, and a negative one in electronica? More research is needed, but there is plenty of food for thought here. Something to remember next time your finger is on the Threshold control...

Three tracks were used for these tests, and were deliberately chosen to represent different genres. For a commercial pop/rock song, we used my mix of Superfox's 'Hear It In My Head' from December's Mix Rescue column, while SoundOut chose Sabrina Rabello's 'Drury Lane' and Jax Walker's 'Escapa!' to represent jazz and electronica respectively. The results were as follows: