In my experience, one of the most frustrating things consumers face in evaluating A/V gear is the guesswork: did it look better with that first DVD player, or was it the second one? Wait, go back to that other chapter, I didn’t see it that time. And so on. We should always try to eliminate as much guesswork from the evaluation process as possible. But how?

I like to use tools that have given me an ‘X’ result in the past. If I suddenly get a ‘Y’ result, then I know this is an area that requires further investigation. Often, the reason consumers have such a hard time quantifying appreciable differences between two pieces of gear is the very method used to evaluate them.

For example, suppose we’re looking at two rear-projection HDTVs at a major retailer. It’s very possible that the signal distribution method they’re using differs significantly from what we might use at home, so much so that making a comparative judgment about the first display, much less the one sitting right next to it, is nearly impossible. In other words, unless you happen to own the same equipment the store is using to drive its demonstration, how can you expect to get similar results at home?

This is why I’m such a big proponent of evaluating home theater gear in real-world conditions, with equipment that’s either the same as or very similar to what you’ll actually use with the display in your system. Let’s face it, if you’re generally unfamiliar with how Blu-ray and HD DVD look, it’s reasonable to assume that just about any display you see running the hi-def discs will look phenomenal.

I used the word tools earlier, but what do I mean by this? A tool can be just about anything, really: a notepad and pencil, a photographic memory, a tried and trusted DVD player, or even a DVD itself. Many of us have go-to DVDs that we’ve used time and time again because we know how they look. If, during the process of evaluating a piece of gear, we see something that looks a bit odd, it’s reasonably safe to assume it’s not the DVD, as we’ve eliminated that possibility in previous viewings.

But what if there were a tool that took this concept to the next level? Say, a DVD of short video clips with on-screen tests and a scoring system that could help us accurately determine how well that new display, DVD player, external video scaler, etc. was performing? You don’t think I brought you this far to pop your balloon now, do you?

Silicon Optix, the maker of the well-received Realta HQV video processing chipset (used in the Denon DVD-5910), offers a benchmarking DVD that fits the above description quite nicely. The HQV Benchmark DVD offers a series of tests that will help you evaluate things such as de-interlacing, motion correction, noise reduction, film cadence detection, and detail enhancement.

The full suite of HQV Benchmark tests breaks down as follows:

Color Bars: The color bar pattern will show whether the color hue and saturation controls are set correctly, and it provides a reference point for evaluating the vertical resolution of the display device.

Jaggies 1: Interlaced video creates images with scan line artifacts. When video is de-interlaced for display, some of these artifacts may not be completely eliminated. This test will identify how well your system de-interlaces video.
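
To illustrate what the Jaggies tests are probing, here’s a minimal, hypothetical Python sketch (my own, not taken from the disc) of why naive de-interlacing struggles: a simple “weave” of two fields reconstructs still images perfectly, but combs as soon as the content moves between fields.

```python
# Hypothetical illustration of interlacing artifacts (not the HQV test itself).

def make_field(frame, parity):
    """Keep only the even (parity 0) or odd (parity 1) scan lines of a frame."""
    return {y: row for y, row in enumerate(frame) if y % 2 == parity}

def weave(even_field, odd_field, height):
    """Naively interleave two fields back into a single frame."""
    return [even_field[y] if y % 2 == 0 else odd_field[y] for y in range(height)]

# A tiny 4-line "frame" with a vertical edge that shifts between fields.
still = ["..XX", "..XX", "..XX", "..XX"]
moved = ["XX..", "XX..", "XX..", "XX.."]

# Still content: both fields sample the same moment, so we get a perfect frame.
print(weave(make_field(still, 0), make_field(still, 1), 4))
# -> ['..XX', '..XX', '..XX', '..XX']

# Moving content: the odd field samples a later moment, so the edge "combs";
# this jagged stair-step is exactly what a good de-interlacer must smooth away.
print(weave(make_field(still, 0), make_field(moved, 1), 4))
# -> ['..XX', 'XX..', '..XX', 'XX..']
```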

Jaggies 2: This test is similar to the one above, but introduces high-speed motion and angle changes into the mix. Proper de-interlacing of still and/or slow-moving video is one thing, but fast-moving images may overwhelm some signal processors.

Flag: This test offers a combined approach to the two previous tests, with an American flag in the foreground and a brick building in the background. The continuously shifting angles of the flag, combined with the texture of the bricks in the background, are quite demanding.

The comprehensive user’s guide that comes with the HQV Benchmark DVD has you check for proper de-interlacing in the flag, as well as whether you’re able to discern good image detail in the bricks behind the moving flag.

Picture Detail: In this test we’re given a wide-angle shot taken near a bridge; in the scene we have a road, a bridge, and some concrete steps to the left. The purpose of the test is to gauge detail enhancement processing and the associated “ringing” artifacts and artificially enhanced edges that sometimes materialize with poor image processing.
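
The “ringing” this test looks for is easy to reproduce in miniature. Here’s a hypothetical one-dimensional sketch (mine, not from the disc): a simple sharpener adds back the difference between each sample and its local average, and when pushed too hard it overshoots at edges, producing the halos you see on screen.

```python
# Hypothetical 1-D detail-enhancement sketch showing overshoot ("ringing").

def sharpen(signal, amount):
    """Unsharp-mask style sharpening: push each sample away from its local mean."""
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        blurred = (left + signal[i] + right) / 3
        out.append(signal[i] + amount * (signal[i] - blurred))
    return out

edge = [0, 0, 0, 100, 100, 100]   # a clean high-contrast edge
print(sharpen(edge, 2.0))
# The samples just beside the edge overshoot past 100 and undershoot below 0;
# that over/undershoot is the halo ("ringing") around artificially enhanced edges.
```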

Noise Reduction: Ah, that blasted noise reduction; this is one area where I just couldn’t get a good score. If you’ve ever witnessed posterization or macro-blocking on your display, you might have mistakenly assumed it was video noise. True video noise is a result of frame duplication in the editing process, or of how inferior video processors render highly saturated shades of a particular color over a large area. Another source of this annoying malady is digital compression.

The noise reduction tests on the disc are composed of outdoor shots with large areas of a single predominant color; this is quite a workout for most systems. As I mentioned above, this was one area where my system could definitely use improvement, although I have to admit I see it crop up more on non-HD source material. Luckily for me, that’s something I look forward to seeing less and less of.

Motion Adaptive Noise Reduction Tests: Video processors with inferior noise reduction circuitry often confuse motion with random noise. As a result, in scenes with fast motion, too much noise reduction can be applied, resulting in images that smear or ghost behind moving objects.
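
The trade-off this test exposes can be sketched in a few lines. The following is a hypothetical toy example (mine, not Silicon Optix’s algorithm): temporal averaging with the previous frame removes random noise nicely, but a motion-adaptive filter must back off wherever the frame-to-frame difference is large, or moving objects smear.

```python
# Hypothetical motion-adaptive noise reduction on one row of pixel values.

def motion_adaptive_nr(prev_frame, cur_frame, threshold=10):
    """Blend each pixel with the previous frame only where the change is small."""
    out = []
    for p, c in zip(prev_frame, cur_frame):
        if abs(c - p) < threshold:   # small change: likely noise, average it away
            out.append((p + c) // 2)
        else:                        # large change: real motion, leave it alone
            out.append(c)
    return out

prev = [100, 100, 100, 200]
cur  = [104,  98, 100,  50]   # noise jitter on the first three pixels, real motion on the last
print(motion_adaptive_nr(prev, cur))
# -> [102, 99, 100, 50]
```

A processor that misjudges this in the other direction, treating real motion as noise, averages moving pixels with stale ones, and that is the smearing and ghosting described above.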

Again, this test was quite a workout, but luckily I fared slightly better here than in the previous noise reduction test with its still or slow-moving scenes.

3:2 Pull-Down Tests: I’m sure many of you have heard the terms 3:2 pull-down and/or pull-down removal. At the risk of butchering Dithermaster’s already excellent explanation of 3:2 pull-down detection and removal, I’ll just say that film and video are shot at different frame rates, and how well film source material is rendered on our displays is determined by your video processor’s 3:2 circuitry.

You may have noticed that large areas of symmetrical stairs, roadways, or buildings seem to ‘ring’ or show a moiré effect in some film-to-video material; improper (or no) 3:2 pull-down handling is often to blame for this. The quality of your outboard video processor, or of the video processing circuitry in your display, can have a significant effect on how well film-based source material is rendered on our HD displays.
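
For readers who want the arithmetic behind 3:2 pull-down, here’s a short hypothetical sketch of the expansion step: film runs at 24 frames per second and NTSC video at 60 fields per second, so each film frame is alternately held for 3 fields and then 2, with 4 film frames filling exactly 10 fields.

```python
# Hypothetical sketch of 3:2 pull-down (telecine); not code from the disc.

def three_two_pulldown(film_frames):
    """Expand 24 fps film frames into a 60 Hz field sequence via the 3:2 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # hold for 3 fields, then 2, then 3, ...
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]               # 4 film frames = 1/6 second at 24 fps
print(three_two_pulldown(film))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']  (10 fields = 1/6 s at 60 Hz)
```

A processor with good 3:2 circuitry recognizes this repeating pattern and reverses it (inverse telecine), reassembling the original full-resolution film frames instead of de-interlacing fields that never belonged together.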

Film Cadence: 3:2 isn’t the only frame-rate process to consider; there are in fact a host of different video cadences. If the video processor can’t detect and lock onto all of these different sequences, half the resolution of each frame is discarded. This section of the HQV Benchmark DVD examines the 2:2, 2:2:2:4, 2:3:3:2, 3:2:3:2:2, 5:5, 6:4, and 8:7 cadences individually.
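
Conceptually, cadence detection is pattern matching on how long each frame is held. The sketch below is a hypothetical simplification (real processors work on noisy pixel data, not clean labels): collapse the field sequence into run lengths, then compare them against known cadence patterns like the ones the disc exercises.

```python
# Hypothetical cadence-detection sketch over an idealized field sequence.

KNOWN_CADENCES = {
    "3:2":     [3, 2],
    "2:2":     [2, 2],
    "2:2:2:4": [2, 2, 2, 4],
    "2:3:3:2": [2, 3, 3, 2],
    "5:5":     [5, 5],
    "6:4":     [6, 4],
    "8:7":     [8, 7],
}

def run_lengths(fields):
    """Collapse a field sequence into the run lengths of identical fields."""
    runs, prev = [], object()
    for f in fields:
        if f == prev:
            runs[-1] += 1
        else:
            runs.append(1)
            prev = f
    return runs

def detect_cadence(fields):
    """Return the first known cadence whose repeating pattern fits the runs."""
    runs = run_lengths(fields)
    for name, pattern in KNOWN_CADENCES.items():
        n = len(pattern)
        # Require at least two full cycles; ignore the possibly truncated last run.
        if len(runs) >= 2 * n and all(runs[i] == pattern[i % n] for i in range(len(runs) - 1)):
            return name
    return None

print(detect_cadence(["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]))  # -> 3:2
print(detect_cadence(["A", "A", "B", "B", "C", "C", "D", "D"]))            # -> 2:2
```

When no pattern locks (here, a return value of None), a processor has to fall back to treating the material as plain video, discarding half the resolution, which is exactly the failure these cadence tests reveal.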

Scrolling Text: Film-based content edited electronically for video can introduce additional artifacts and distortions that challenge video processors. This section of the disc examines how well horizontal and vertical scrolling text is rendered, and how well the background is rendered during the process.

My Test Results:

The results for the combination of my InFocus 7205 being fed by my Mac mini as a DVD player were pretty much on par with my untested assumptions. I knew the mini wasn’t exactly an overachiever in the video performance department, but now I have specific test results instead of generalizations. In short, the G4 Mac mini de-interlaces video quite poorly (especially using Apple’s DVD Player software), but hey, it makes one hell of a convenient DVD server. The HQV Benchmark DVD score for my Mac mini feeding the InFocus 7205 was 59 out of a possible 130.

On the other side of the coin, the combination of my 7205 paired with the Denon DVD-1910 actually fared even better than I expected. This combination scored perfectly in several of the de-interlacing tests. To be quite honest, the only areas that preclude me from feeling I have near reference quality with this combo are color and noise reduction performance.

As the InFocus is capable of much better color performance with the Toshiba HD-A1 than with the DVD-1910, I’m going to place the majority of the fault in this combination with the DVD-1910. The combination of my Denon DVD-1910 feeding the InFocus 7205 scored 105 out of a possible 130 on the HQV Benchmark DVD. Not too shabby.

More about the testing process:

You’ll notice I mentioned “the combination of” a few times above; it’s important to note that the HQV Benchmark DVD evaluates the overall system rather than any one piece of gear. For example, an outboard video processor is still subject to the quality of the video it’s being fed by the DVD player.

Alternatively, if you’re sending a 1080i image to a 720p native projector without the use of an external scaler, it’s the projector that will handle the down-conversion, taking the de-interlacing duties away from the DVD player; yet the projector is still subject to the DVD player’s overall video performance.

Summary:

The HQV Benchmark DVD differs from calibration discs like Digital Video Essentials and the AVIA Guide to Home Theater. Those discs are primarily for tuning and calibrating existing systems, and/or for re-tuning the audio/video performance of a system after the addition or substitution of a piece of equipment.

This disc (the HQV Benchmark DVD), on the other hand, can be used to evaluate a piece of equipment beforehand, prior to the sale. This is an important distinction: we now have a tool that’s easy to use and repeatable, and that offers us a way to make more informed video equipment purchases.

Conclusion:

I have to admit that at first I was a little skeptical about the HQV disc. Maybe I was letting the memories of frustrating hours spent digging through the sub-chapters of the DVE disc for one particular frame cloud my judgment, or maybe it was just that I didn’t think I needed another calibration disc. Once I got familiar with the HQV disc, however, I quickly realized this was no ordinary calibration disc. To be exact, it’s not a calibration disc at all; rather, it’s an evaluation disc.

I now have a tool that offers something I’ve never had up to this point, at least not in such a self-contained, easy-to-use format. The HQV Benchmark DVD provides me with an accurate, repeatable method of benchmarking displays, DVD players, and video processors. To this end, all display, DVD player, and video scaler reviews at Home Theater Blog will, from this point on, include an HQV Benchmark DVD score.

I wholeheartedly recommend this DVD to anyone in the market for a new display, DVD player, or video processor. Or, for that matter, to anyone who owns the DVE and AVIA calibration discs, as the HQV Benchmark DVD would make a perfect companion to either of those two titles.

At $30, the HQV DVD seems like a bargain to me, especially considering how expensive and impractical it would be to piece together the equipment necessary to replicate these tests. The HQV Benchmark DVD is available directly from Silicon Optix.