I have been a fan of Quentin Tarantino ever since I saw Pulp Fiction—on premiere night—in a movie theater with the woman who wound up marrying me. The director's particular combination of irreverent sensibility, cinematic flair and captivating dialog has frequently resulted in films that are among my personal favorites. I put Django Unchained squarely in that category. His rethinking of the spaghetti Western, set in the pre-Civil War South, is quite a brutal movie. I suppose that is something else you can always expect from Mr. Tarantino.

Christoph Waltz and Jamie Foxx in Django Unchained

Compression is also brutal, turning perfectly great cinematic footage into a scrambled mess of flickering pixels. At least that's the conventional wisdom about online delivery of high-definition movies. It is certainly what I believed, until just a few months ago when I purchased a copy of Skyfall from iTunes and watched the 1080p version on my home-theater PC. What I saw that night was a whole lot better than what I had seen previously, as far as streaming or downloadable movie formats were concerned. My expectations were low because I had sworn off online delivery formats after a number of negative experiences a year earlier.

Fast forward to April 2013. I am walking through my local neighborhood Walmart, and I spy a Blu-ray case for Django Unchained. It is a digital pre-release, and inside the case is nothing more than a piece of paper with a number on it. The number is a Vudu code, and since I already have an account with that service, entering the code granted me access to the digital early release of the film. It also automatically set up a shipment of the physical Blu-ray, which happened to arrive several days before the official release date. For a total of $20, I was able to watch the movie early and then receive the physical Blu-ray early. The only question is, was it worth watching the film on Vudu at all? Some would argue that only Blu-ray is good enough quality, especially for the first viewing of a great movie. Others would argue that Vudu HDX is more than adequate—not easily differentiated from Blu-ray in most cases.

A few weeks ago, I ran a poll about the suitability of online formats for home-theater usage. The results indicated that Vudu HDX has some fans, but not nearly as many as Blu-ray does. It also revealed that iTunes HD movie files are not at all popular for home-theater use.

From a practical perspective, this makes sense because Vudu does allocate more bandwidth to its HDX format than Apple does for its HD format. Blu-ray, Vudu HDX and Apple HD all use the same compression technology, so it comes as no surprise that the ranking in terms of quality has frequently been in line with the allocated bit rate. However, there are exceptions, based on the platform. Some movie releases have had restrictions, whereby the Vudu HDX version was not playable on a PC or Mac. Check out two examples of that approach in my recent comparisons of Wreck-It Ralph and Lincoln.

In the past, I have struggled to find a way to quantify differences in sound quality between formats. I have no way of performing double-blind A/B comparisons, and audio memory is notoriously unreliable. However, audio memory is all I have to work with. I needed a way to maximize what I was listening to, an audio equivalent to "pixel peeping." My solution: to listen to only the surround channels as if they were the front channels. That means turning my head around, turning off the mains, and turning up the volume. I chose the intense action scenes toward the end of the movie for a comparison—bullets and ricochet noises always make for good surround sound.

Blu-ray enjoys a significant theoretical advantage in terms of sound quality because it does not employ lossy compression. It is a remarkable fact—the highest-quality audio formats that have ever been available to consumers are on discs sold for only a few bucks in Walmart, Best Buy and Target across the country. The quality is so high, it would be impossible to improve upon in the context of human perception of dynamic range and frequency response.

The listening tests confirmed that the online versions suffered a significant loss in overall fidelity compared with Blu-ray audio. In addition to isolating the surround effects, turning off the mains also allowed me to hear the subwoofer channel. The lossless Blu-ray audio outperformed the online formats in a number of ways—each sound was more discrete, each sound had more impact at the same volume, and the accompanying bass impact for each sound effect was deeper. Listening to nothing but the surround effects really opened my ears to what's going on with the compressed formats. The same effects are there, but the soundfield is flatter. The sense of space, as well as the shock of a dynamic transient accompanied by a strong, deep percussive bass thump—those qualities were fully present only in the Blu-ray version of Django Unchained.

Apple's iTunes HD offering does not fare well when compared to Blu-ray or even Vudu HDX—at least not in terms of specifications. Apple uses the exact same audio compression that was standard on DVD—Dolby Digital 5.1, with a bit rate almost 100 times lower than Blu-ray. Vudu claims that Dolby Digital Plus allows it to use a bit rate 40% higher than standard Dolby Digital, but that is still just a tiny fraction of the bandwidth used by uncompressed 5.1 or 7.1 audio. In some circumstances, Vudu also provides soundtracks in 7.1 audio, which is not available from iTunes.
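
To put those numbers in perspective, here is a rough sketch of how the bit rates translate into audio data over a film of this length. The rates below are assumptions based on typical published figures, not measurements from these specific releases:

```python
# Rough audio-size arithmetic for a ~165-minute film.
# All bit rates below are assumptions based on typical published
# figures, not measurements of these specific releases.
RUNTIME_MIN = 165

rates_kbps = {
    "Dolby Digital 5.1 (iTunes/DVD)": 448,          # common DVD-era rate
    "Dolby Digital Plus (Vudu HDX)": 448 * 1.4,     # ~40% higher, per Vudu's claim
    "DTS-HD MA 5.1 (Blu-ray, typical)": 3500,       # lossless, variable; rough average
}

for name, kbps in rates_kbps.items():
    megabytes = kbps * 1000 / 8 * RUNTIME_MIN * 60 / 1e6
    print(f"{name}: {kbps:.0f} kbps = {megabytes:.0f} MB")
```

Even taking the lossless figure as a conservative average, the disc carries several times the audio data of either online format.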

Nevertheless, a funny thing happened as I performed my comparison. Listening to the iTunes version, I started to feel like it came close to the Blu-ray. In fact, I am not sure I would be able to pass a double-blind test. I do not know if this has anything to do with investments Apple has made in audio compression, but it seems to me like the company used its bandwidth better. I am a strong believer that good mastering trumps almost anything else when it comes to sound quality. I can say that I did not find iTunes to be lacking in the sound quality department.

Simply put, the online formats can never be as good as Blu-ray. They use too much compression, and as a result, fidelity is lost. However, an argument could be made that a highly capable 5.1 or 7.1 system is needed to hear the extra quality inherent in Blu-ray. After all, the sound available through iTunes is still the DVD standard—and I have thoroughly enjoyed many DVDs in the past. The fact is, the soundtrack is more than adequate in all the versions, but I did feel that Vudu was somehow not as good as iTunes for this movie. In some past comparisons, I have felt that Vudu's soundtrack was better sounding than iTunes'. In retrospect, each time that was the case, the soundtrack was actually 7.1 channel. This movie is 5.1, even though Vudu HDX uses Dolby Digital Plus, which supports 7.1 channels. HDX did not have extra surround channels to give it an advantage over iTunes. That made a real difference, and not in Vudu's favor.

This is the most intensive comparison of soundtracks that I have performed thus far, and I feel confident in recommending iTunes HD over Vudu HDX, at least for the soundtrack portion of Django Unchained. However, I also felt that all three soundtracks were acceptable for home theater use. Ultimately, discrete surround effects were present, and the bass was deep regardless of the version. Comparing sound in this manner is a completely subjective exercise, so please take my observations with a grain of salt.

Visual comparisons between formats are quite fascinating, and there is considerably less ambiguity than with any attempt to compare sound. Blu-ray, iTunes, and Vudu share the same underlying video compression, though iTunes and Vudu clearly take different approaches to processing video for compression. Sometimes, the processing can result in a change to the underlying character of the film. Specifically, Vudu HDX has a tendency to apply noise reduction. To some observers, the result is actually an improvement versus the original film grain—that happened with my Argo comparison.

In my last comparison, which was Lincoln by Steven Spielberg, a variety of factors combined to make online delivery versions of the movie unsuitable for home theater use. One big hint—Vudu HDX was not available on PC or Mac. As a result, I was not able to use screen captures, which resulted in less-than-ideal still-image comparisons. Django Unchained has no such restriction, which made it easy to use screen grabs to perform the comparison.

Django Unchained fared very well in my image-quality comparisons. Despite the many dimly lit scenes, and the often-monochromatic palette, all three versions were eminently watchable. While Blu-ray remains the reference for image quality, iTunes and Vudu both exhibited ample image quality. The differences were in the details; Vudu was more heavy-handed with noise reduction yet often appeared slightly sharper. iTunes looked a little bit dirtier, but also potentially more authentic. With dark scenes and lots of action, both online formats struggled compared to Blu-ray.

With a bright scene and very little action, detail levels were close to Blu-ray in both online formats. In some dimly lit scenes, I witnessed Vudu HDX struggling with artifacts on occasion, but it required significant scrutiny. As far as I am concerned, the online-delivery versions of Django Unchained are very suitable for home-theater use. I hope that the following comparison images will help you come to your own conclusion.

All of these images are resized for web display. Please click within each image to view the original size. When viewed at the original size, you see the exact pixels taken from the screen grab. When viewed that way, the differences between the formats are considerably easier to see.

Dimly lit scenes tend to pose a challenge for compression algorithms.

To my eyes, iTunes 720p beats out the other two online delivery formats. I have seen this effect once before, in the movie Argo. The 1080p formats utilized too much noise reduction.

Artificially brightening the screen grab in Photoshop makes it easy to scrutinize the shadows. There are significant issues with Vudu HDX and both iTunes versions, but Vudu looks the worst by far.

This scene features uneven lighting. Because the camera is stationary, the lower bit rates of the online formats are not a significant issue.

Even though Blu-ray looks the best, the online delivery formats also look very good.

Artificially brightening the images in Photoshop once again reveals the inner workings of the shadows. In this example, Vudu HDX is not having any problems rendering shadow detail and looks remarkably similar to Blu-ray.

Diffuse outdoor lighting and a stationary camera make this a great scene for judging textures and detail rendition.

Blu-ray has superior detail in all of the rock textures, but iTunes 1080p does the second best job at rendering the scene. The noise reduction applied by Vudu HDX makes the scene look a bit fake, plus there is an issue with its color balance.

High-key outdoor lighting makes this a relatively easy scene for the compression algorithms.

Among the online delivery formats, iTunes 1080p strikes the best balance between noise reduction and preservation of detail, coming the closest to the Blu-ray reference. Whatever slight gain in sharpness Vudu HDX provides is offset by the loss of texture detail that is a result of noise reduction. iTunes 720p continues to be the softest-looking, but it still looks very good.

The vibrant blue in Jamie Foxx's costume adds a colorful twist to a movie that is mostly painted in earthy hues.

With ample daylight and a stationary camera helping to counteract limited bandwidth, iTunes 1080p once again performs the best among the online formats. Vudu's use of noise reduction is just too heavy-handed to retain a natural look.

This shot involved motion, and because the sun is behind the subject, there is a lot of shadow. It is the sort of scene that causes problems for online delivery formats.

Once again, the 1080p version of iTunes' HD format comes the closest to Blu-ray.

This artificially brightened version shows Vudu HDX struggling to render the shadows. iTunes HD 720p also seems to have a hard time with the shadows, compared to the 1080p version.

A scene like this—shot in broad daylight with a stationary camera and a wide-angle lens—possesses a tremendous amount of detail.

Blu-ray does quite a bit better than the online delivery formats, rendering detail right down to the individual pixel. iTunes HD 720p looks surprisingly good in this shot.

Leonardo DiCaprio lights a cigarette in this dim, amber-tinted scene. The lit match provides a spot of high contrast.

Both iTunes versions look remarkably similar to the Blu-ray, while Vudu's noise reduction continues to alter the image.

This indoor scene uses spotlighting to achieve a very high contrast, painterly effect.

Vudu HDX manages to preserve a bit more detail than iTunes. I was surprised by how similar these frame grabs look.

There is something about this scene that the online delivery formats liked—or rather, found easy to render.

Tarantino has a reputation for making movies that feature brutal, stylized violence. Of course, we all know it's just special effects.

Blu-ray renders more individual drops of fake blood than the online delivery formats, by a significant margin. Loss of detail during fast motion is one of the pitfalls of the relatively higher levels of compression used by iTunes HD and Vudu HDX.

Interesting. While I haven't read through the whole analysis just yet, the common thread I am seeing is that Vudu HDX appears slightly washed out, iTunes 720p appears slightly pixelated, and iTunes 1080p appears slightly blurry. None look to be unbearable, but they also don't quite match Blu-ray. Would you agree? Reading through now...

Although I have only read about six of your reviews, this was the first one where the best video wasn't Blu-ray or Vudu. Of the movies I've seen online, you're definitely right on the money as far as audio goes. That was a really good and in-depth review. Thanks, Mark.

That movie was really good. I watched a DVDSCR before it was "officially" out, and it looked fine, of course not as sharp as an HD source. These pixel peeping things are useless. They make VUDU HDX look horrible, when in reality, it looks the same as Blu-ray, and they make iTunes look way better than it really looks, with very harsh compression artifacts. They also don't look at Amazon, which has a very pleasing picture quality, even though it's not nearly as sharp as Blu-ray/HDX.

To my eyes, iTunes 720p beats out the other two online delivery formats. I have seen this effect once before, in the movie Argo. The 1080p formats utilized too much noise reduction.

The 1080p looking worse than the 720p doesn't have anything to do with noise reduction. Argo has a lot of film grain; Django has a lot of dark scenes. The 1080p iTunes can't preserve as much detail as the 720p because it is only using 20% more bitrate. It literally can't display the detail and grain properly because it doesn't have enough bits to do so. Even a transparent 1080p x264 encode needs about 15 Mbps for this movie, while the iTunes 1080p is only 5 Mbps and the 720p is 4 Mbps.
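
For what it's worth, the arithmetic in that comment checks out in terms of bits per pixel. Using the quoted average rates (5 Mbps and 4 Mbps come from the comment; 24 fps is an assumption), the 1080p stream actually has fewer compressed bits available per pixel than the 720p stream:

```python
# Bits available per pixel per frame, using the bit rates quoted
# in the comment above (assumed average rates, not exact measurements).
def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    return bitrate_mbps * 1e6 / (width * height * fps)

bpp_1080 = bits_per_pixel(5, 1920, 1080)
bpp_720 = bits_per_pixel(4, 1280, 720)

print(f"1080p: {bpp_1080:.3f} bits/pixel")  # fewer bits per pixel...
print(f" 720p: {bpp_720:.3f} bits/pixel")   # ...despite the higher total rate
```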

Basically I agree with you, but what other name would you give to the reduction of film grain? For almost 20 years, every professional I know has referred to the process as "noise reduction" and it's considered a crucial part of achieving high levels of compression. In essence, 1080p looking "worse" than 720p has everything to do with noise reduction. Besides, Vudu HDX has the higher bitrate and also has the most heavy-handed noise reduction, which does not quite jibe with your theory.

Noise reduction is an actual filter/process that is used to process out noise from the source. You want to preserve film grain as much as possible to be true to the source. Film grain itself is not "noise". It is intentionally there because that is how the director intended the picture to look. If you encode a grainy movie with x264 at, say, CRF 18, it is going to have a high bitrate because film grain demands it. I don't have any proof one way or another (nor does anyone besides the company that encodes for Apple), but the reason why the 1080p looks "smooth" is because the first thing to go when bitrate is not high enough is film grain; the second thing is actual detail. If the iTunes 1080p encode was double the bitrate (as their movie trailers are encoded to be), the video detail would be much higher. I don't understand why Apple limits their 1080p encodes to ~5 Mbps, but it is seriously hurting their quality. The majority of 1080p content needs at least 8-10 Mbps to look decent.

You don't know what software Vudu or Apple use to encode, and that matters a lot. x264 is the best H.264 encoder out there, but the companies don't disclose what software or settings they use. Bitrate is just one factor.

Film grain and digital noise behave in the same manner—when you end up with less of it as a result of some kind of processing, that is properly referred to as noise reduction. It's a semantic argument that's not worth having.

If Apple paid no attention to noise reduction whatsoever in the compression process, the artifacts would look worse than they do. I've compressed enough video to know how it works; I guarantee there is a noise reduction algorithm in use somewhere in the encoding process. Call it something else if you want to.

I'm sorry, but noise and film grain are not the same thing. The ultimate goal in encoding is to preserve film grain exactly as it was originally shown, not to reduce it. If you take an incredibly grainy movie like Safe House, it is almost impossible to encode it to a smaller file size. If you do CRF 18, the bitrate will be even higher than the source bitrate. As well-versed as you try to sound, I don't think you understand how this works.

I'm sorry you don't understand. No reason to be insulting; I'm trying to educate you.

Film grain is noise. Its presence is actually called "film grain noise". Reducing it is noise reduction, by definition.

You are probably thinking about digital noise, when you say noise is "not the same thing" as film grain. Anyhow, Blu-ray is a compressed format, so there is already some loss of film grain right there. There is even more loss of film grain when the bitrate drops further. That loss of noise, when achieved in a controlled manner so as to make compression into a digital format more efficient, is called noise reduction. Vudu HDX clearly used more of it than Apple did, and Apple uses more of it on 1080p files than they do on 720p files.
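
Setting the terminology dispute aside, the mechanism both sides are describing can be illustrated with a toy example. The smoothing filter below is hypothetical and far cruder than anything a real encoder pipeline uses, but it shows why averaging away grain leaves less high-frequency variation for a lossy codec to spend bits on:

```python
# Toy illustration of why noise reduction helps a lossy encoder:
# a simple 3-tap moving average flattens pixel-to-pixel grain,
# leaving less high-frequency variation for the codec to encode.
# (Real encoders use far more sophisticated spatial/temporal filters.)
import random

random.seed(0)
base = 100                                   # "true" luma value of a flat area
grainy = [base + random.randint(-8, 8) for _ in range(12)]

def smooth(samples):
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

denoised = smooth(grainy)
spread = max(grainy) - min(grainy)
spread_dn = max(denoised) - min(denoised)
print(f"grain spread before: {spread}, after: {spread_dn:.1f}")
```

The averaged values cluster much closer to the flat "true" value, which is exactly the trade-off: a cheaper-to-encode picture at the cost of the grain the director put there.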

And by that time people are already into 4K, so streaming is once again playing catch-up... Except for BiggAW, of course; he thinks HDX is already transparent to Blu-ray quality when clearly it's not (and he'll respond that we're pixel peeping etc. etc., the same song and dance trolling in every thread).

I honestly have a collection of over 800 Blu-rays. All this makes me realize is that if you need to pause a movie, zoom in on little unimportant things and nitpick, there really isn't much difference to the eye in real-world watching conditions. As for sound format differences, I'd be hard pressed to say that most could tell the difference between compressed Dolby Digital Plus and DTS-HD MA if both were played at a loud level and the listener wasn't told which was which. All HD formats seem fine in regards to actually watching the movie and enjoying it in HD. None are different enough to warrant saying that Blu-ray is by far better, like it was 6 years ago. I'd say in 2 years most streams will match in quality, or come so close that while in motion it would be impossible to see a difference.

It is pointless to boost the brightness in order to 'look into the shadows'.

The important thing is - on a calibrated display, from a typical viewing distance, does it look okay?

If it looks fine despite being clearly pixellated to hell when 'brightness boosted' then the encoding software has done its job perfectly. It has hidden artifacts in places where the human eye won't perceive them, in order to allocate those bits somewhere else.

If it doesn't look fine in a normal viewing scenario, then absolutely you should call them on it and take away some points.

It is the same with motion. It is completely normal and expected for there to be less detail in that blood spurt sequence.

I haven't seen this particular movie. However, movies such as The Tall Man, Seeking Justice, and Don't Be Afraid of the Dark look horrendous in the dark gradations and fast movements on any of my calibrated displays. There is absolutely no need for brightness boosting at all.

So this would be a very valid criticism in these cases.

On the flip side, I know many people who will complain about dark banding/blocking on even high quality sources. And of course when I visit them, it turns out that their brightness and gamma are completely out of whack.

These comparisons are not going to be viewed on calibrated home theater televisions, unlike the movies being compared. I am making the existence of artifacts obvious, even if this post is being viewed on a small or poorly calibrated monitor. The only other time when they would likely be obvious is either on a properly calibrated, high resolution system... or on an uncalibrated television that is too bright. In both circumstances, if the compression algorithm had not screwed up the shadows, the image quality would be better.
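
For anyone who wants to replicate the Photoshop step, the brightness boost amounts to multiplying pixel values and clipping to the displayable range. A minimal sketch on raw luma values (a real screen grab would go through an image library; the sample numbers here are made up to illustrate shadow quantization):

```python
# Minimal sketch of the "brighten to inspect shadows" trick:
# multiply each luma value and clip to the displayable range.
# Heavy quantization in the shadows (few distinct levels) shows
# up as banding once the values are spread apart.
def boost(pixels, gain=4.0):
    return [min(255, round(p * gain)) for p in pixels]

# A hypothetical quantized shadow gradient: only three distinct levels
shadow = [16, 16, 18, 18, 18, 20, 20]
print(boost(shadow))  # [64, 64, 72, 72, 72, 80, 80] - the coarse steps become obvious
```

Steps of 2 in the shadows become steps of 8 after boosting, which is exactly why the banding jumps out.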

The way that Vudu HDX mangled the shadows in my first example, that is not the algorithm doing its job. That's the algorithm screwing up. There is nothing inherently wrong with demonstrating this clearly.

Like I said, if it is visible in normal viewing conditions then it is a relevant criticism.

But if artifacts are only visible when you artificially boost the brightness, then yes, that is the algorithm doing its job.

"Normal viewing conditions" vary widely, as do the black rendition capabilities of various monitors and televisions—even when they are properly calibrated. That's why clear examples are needed.

What you are presenting are not clear examples. The brightness has been artificially boosted by a very significant amount.

Nobody should be watching movies with the brightness set like this, and if they are it is their own problem.

A clear example should not need any changes to brightness. It suffices to post unaltered screenshots. We all know that we are looking at lossy material, and therefore artifacts are inevitable. The only thing that matters is whether we can perceive these when watching the movie.

This has nothing to do with how the brightness is set on the TV. I'm sorry you don't understand the purpose of that part of my comparison. It should be easy enough for you to ignore those examples, since you don't consider them valid, and use the normal brightness versions to make any judgments.

Ultimately, you are correct that it only matters whether the artifacts do or do not manifest on one's own TV. However, they can't manifest if they don't exist. All I'm doing is showing that they do exist, in some instances only in one version. The meaning of their existence can be debated, but the validity of showing it is not debatable. Go find someone else to nitpick.

In fact I think the rest of your comparison is very well done.

Quote:

Ultimately, you are correct that it only matters whether the artifacts do or do not manifest on one's own TV. However, they can't manifest if they don't exist. All I'm doing is showing that they do exist, in some instances only in one version. The meaning of their existence can be debated, but the validity of showing it is not debatable. Go find someone else to nitpick.

Okay but in reality you are doing more than simply showing that they exist. You are saying that one is 'worst' or worse than others, "struggling to render the shadows" etc.

We can take it for granted that artifacts exist. These are lossy codecs at relatively starved bitrates. All modern encoding software makes use of psychovisual optimisations to attempt to prioritise detail that a human eye can see over detail that it can't. If there is blocking in a shadow area but it is only visible when artificially boosted, then it is irrelevant.

We are really slicing and dicing the words to get to a very fine point. The artificially brightened artifacts are visible, but not necessarily by all viewers, all the time. On a good piece of equipment, properly calibrated, they are perceptible. You can take my words at face value.

I am making the artifacts obvious, and I am saying that their existence can affect the movie-watching experience at normal brightness levels. But not necessarily on a computer monitor, tablet or phone, where this comparison might be read. Nor on an inexpensive HDTV that renders the last few shades of deep gray as pure black, or I should say, pure deep gray. I'm counting on the fact that people are not necessarily going to read the comparison on the same device that they're going to watch the movie on. That is why an artificial example is necessary.

And it is true, there were very few instances of visible artifacts. I can usually forgive one or two instances of blocking that only last a few seconds. Blu-ray's track record is not much better than that, depending on the movie and whether there is an accidental fingerprint on the disc surface. As a rule, I don't see banding in the premium online distribution formats. Any home theater geek who obsesses over black levels is going to notice these artifacts at one point or another.

And by that time people are already into 4K, so streaming is once again playing catch-up... Except for BiggAW, of course; he thinks HDX is already transparent to Blu-ray quality when clearly it's not (and he'll respond that we're pixel peeping etc. etc., the same song and dance trolling in every thread).

I just call it like it is.

And 4K streaming is coming first, as there is no widely accepted 4K disc format currently. You can stream 4K at about 38 Mbps in MPEG-4 AVC, or about half that with HEVC, so it's coming, and it will be pretty soon. That doesn't mean it has a good use case, but it will come anyway. People like high numbers, even if they're useless. All the numbers I've seen online point to 38-40 Mbps, but if you take Vudu's 9.5 Mbps and multiply by 4, you get about 38 Mbps, and they're using some highly optimized MPEG-4 AVC to deliver Blu-ray quality in 1/4 of the bandwidth.
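
The scaling argument in that last paragraph is simple arithmetic. A quick check using the quoted rates (the HEVC halving factor is a commonly cited rule of thumb, not a measured result):

```python
# Back-of-envelope 4K bitrate scaling from the comment above.
# 4K UHD has 4x the pixels of 1080p, so a naive estimate scales
# the 1080p rate by 4; HEVC is commonly credited with roughly
# halving the bitrate needed for similar quality versus AVC.
vudu_hdx_1080p_mbps = 9.5           # rate quoted in the comment
naive_4k_avc = vudu_hdx_1080p_mbps * 4
rough_4k_hevc = naive_4k_avc / 2

print(f"4K AVC estimate:  {naive_4k_avc} Mbps")   # 38.0
print(f"4K HEVC estimate: {rough_4k_hevc} Mbps")  # 19.0
```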