
63 Comments

"As TV shows transition to HD, we will likely see 1080i as the choice format due to the fact that this is the format in which most HDTV channels are broadcast (over-the-air and otherwise), 720p being the other option."

I would like to point out that 1080i has become a popular broadcast standard because of its lower broadcast bandwidth requirements. TV shows are generally mastered in 1080p; 1080i dubs are then pulled from those masters and delivered to broadcasters (although some networks still don't work with HD at all; MTV, for instance, takes all deliveries on Digital Betacam). Pretty much the only people shooting and mastering in 1080i are live sports, some talk shows, reality TV, and the evening news.
Probably 90% of TV- and film-related Blu-rays will be 1080p.
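The bandwidth argument is easy to see with a little arithmetic on raw luma sample rates. This is just a back-of-the-envelope sketch (uncompressed pixel rates, not actual broadcast bitrates, and the function is my own illustration):

```python
def raw_pixel_rate(width, height, rate, interlaced=False):
    """Raw pixels per second for a video format.

    For interlaced formats, each scan delivers one half-height field,
    so the effective pixel rate is half that of progressive video at
    the same field/frame rate.
    """
    pixels_per_image = width * height
    if interlaced:
        pixels_per_image //= 2  # only half the lines per field
    return pixels_per_image * rate

p1080p60 = raw_pixel_rate(1920, 1080, 60)        # 124,416,000 px/s
p1080i60 = raw_pixel_rate(1920, 1080, 60, True)  #  62,208,000 px/s
p720p60  = raw_pixel_rate(1280, 720, 60)         #  55,296,000 px/s
```

1080i60 carries half the raw pixel rate of 1080p60 and slightly more than 720p60, which is roughly why bandwidth-constrained broadcasters settled on it.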

Hint: They didn't. What AnandTech isn't telling you is that NO NVIDIA card supports HDCP over dual-link DVI, so yeah, you know that hot and fancy 30" LCD with its gorgeous 2560x1600 resolution? You need to drop it down to 1280x800 to get it to work with an NVIDIA solution.

This is a very significant problem, and I for one applaud ATI for including HDCP over dual-link DVI.

Does the HD video acceleration work with other programs, and with non-Blu-ray/HD DVD sources? For example, if I wanted to watch an H.264-encoded .mkv file, would I still see the performance and image enhancements?

Well, what annoys me is that there used to be All-in-Wonder video cards for this kind of stuff. I do not mind a product line that has TV tuners and HD playback codecs, but not at the expense of 3D performance.

It is a mistake for ATI and NVIDIA to try to include this stuff on all video cards. The current 2xxx and 8xxx generations of video cards might not have been as pathetic had the two GPU giants focused on actually making a good GPU instead of adding features that not everyone wants.

I am sure lots of people watch movies on their computer. I do not. I don't want a GPU with those features. I want a GPU that is good at playing games.

All-in-Wonder cards are a totally different beast. The All-in-Wonder was simply a combination of a TV tuner (and a rather poor one) and a normal graphics chip. The TV tuner simply records TV and has nothing to do with playback. ATI no longer sells All-in-Wonder cards because the tuner did not go obsolete quickly, while the graphics core in the AIW card did, requiring the buyer to purchase another expensive AIW card when only the graphics part was obsolete. A separate tuner card made much more sense.

Playback of video is a totally different thing, and the AIW cards performed exactly the same as regular video cards based on the same chip. At the time, playing video on the PC was rarer, and playback quality was essentially the same across cards because none of them offered hardware deinterlacing. Now, video on the PC is abundant and is the new killer app (besides graphics) driving PC performance, storage, and internet speed.

NVIDIA was first to the party with PureVideo support, which did hardware deinterlacing for DVDs and SD TV on the video card instead of in software. It was far superior to any software solution at the time (save a few diehard fans of DScaler with IVTC) and came out at exactly the right time, with the introduction of Media Center, cheap TV tuner cards, and HD video. Now, PureVideo 2 and Avivo HD bring the same high quality deinterlacing to HD video for MPEG-2 (the 7600 GT and up could already do HD MPEG-2 deinterlacing) as well as VC-1 and H.264 content.

If you don't think this is important, remember that all the new satellite HD broadcasts coming online are 1080i H.264, which requires deinterlacing to look its best, and products already exist (if you are willing to work for it) that let you record this content on your computer. Also, new TV series are likely to be released in 1080i on HD discs because that is their most common broadcast format. If you don't need this, fine, but they sell a lot of cards to people who do.

Oh, I forgot to mention that only the video decode acceleration requires extra transistors; the deinterlacing calculations are done on the programmable shaders of the cards, requiring no additional hardware, just extra code in the drivers. The faster the video card, the better your deinterlacing, which explains why the 2400 and the 8500 cannot get perfect scores on the HQV tests. You can verify this on HD 2x00 cards by watching the GPU % in RivaTuner while forcing different adaptive deinterlacing modes in CCC. This only works in XP, btw.
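For anyone wondering what those shader-based deinterlacing calculations conceptually do, here is a minimal sketch of the two classic approaches, weave and bob, with fields represented as lists of scanlines. This is my own simplification; the adaptive deinterlacers being discussed choose between such strategies per pixel based on detected motion:

```python
def weave(top_field, bottom_field):
    """Weave: interleave the two fields into one full frame.

    Perfect for static scenes, but moving objects show comb
    artifacts because the fields were captured 1/60 s apart.
    """
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    """Bob: line-double a single field into a full frame.

    No combing, but vertical resolution is halved and fine detail
    can appear to bob up and down between successive fields.
    """
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)
    return frame
```

An adaptive deinterlacer effectively weaves where the picture is still and bobs (or interpolates) where it detects motion, which is exactly the per-pixel decision work that lands on the shaders.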

You confirmed my suspicions all along. I always wondered if the true motive for SLI and Crossfire was to double the benefits of GPU processing rather than to separate the graphics performance of 3D and video acceleration. In my eyes, SLI and Crossfire are a "bridge" for 3D graphics and video acceleration cards. What I am referring to is the PCIe x16 (16-lane) slot being for high-powered 3D GPUs and the PCIe x16 (8-lane) slot being for video acceleration GPUs.

It is obvious, between the HD 2900 XT and the HD 2600 XT, that one is great at rendering 3D game graphics while the other is great at accelerating motion picture playback.

Personally, I think this is an okay tactic by the card manufacturers. It segments the performance a little better. I do not game the least bit, so the high-end cards are something I don't want. But my tastes are different from others who do want them. Those that desire both can have their cake and eat it too, by using a dual PCIe x16 motherboard and installing each type of card.

Overall, good article. It has informed my purchasing decision. With all the talk about futureproofing that was going around for a while, buying a dual PCIe x16 motherboard makes a lot of sense now.

If you buy the 2900 and a high-end processor, you will not have any problems with HD playback; that's the whole point. You don't need a 2600 to go with it. The number of people who would buy something as expensive as the 2900 XT and pair it with a low-end processor incapable of playing back HD is very, very low, to the point where ATI decided it was not worth designing for.

So, no, you wouldn't get a 2600 to go with it; you'd get a good processor and the 2900, and that's all you'd need to have the best of both worlds.

Yes, if by "best" you mean:
- Higher CPU utilization when viewing any HD video content, compared to 8800
- Generally lower price/performance in games compared to 8800
- More flaky (in my experience) drivers than 8800 (though I believe AMD might actually be better on Vista - not that I really care at this point)

Don't pat AMD on the back for skipping UVD on R600. NVIDIA didn't bother to work on VP2 for G80, and yet no one is congratulating them on the decision. I agree that the omission is not the end of the world, mostly because I don't think people running 8800/X2900 cards are really all that concerned with H.264 video. If I were looking to go Blu-ray or HD-DVD, I'd be looking at a set-top box to hook up to my HDTV.

My PC is connected to a 24" LCD that I use for work, not watching movies, and trying to put it next to the TV is more effort than it's worth. Unless H.264 suddenly makes a difference for YouTube and the like (hey - I'd love to see higher quality videos online), I can't say I'm all that concerned. Seems to me there's just a vocal minority whining about the latest features that are used by less than 10% of people.

UVD, PureVideo HD, and a partridge in a pear tree: it's all moot to me!

I'm not going into the merits of NVIDIA and ATI. I have used both, I consider NVIDIA junk, and I do not buy them. If you have had better luck, then go with them. That's not the point, but anyone with any reading comprehension should have figured that out.

He was talking about putting a 2600 and a 2900 on the same motherboard to get the best of both worlds, meaning having all the performance of the 2900 yet getting the HD capabilities of the 2600. Do you understand that?

My point is you don't need the 2600 to get "the best of both worlds"; you just need a good processor, and you will not miss that feature. I think NVIDIA made the right choice too. Most people are morons: they want things just because they want them, and they fail to realize nothing is free. Including useless features at a cost is a bad idea, and ATI did the right thing by leaving them out, even though you'll have idiots who think they are missing out on something. Yes, you are: you're missing out on additional cost, additional electricity use, and additional heat dissipation. You don't need it if you buy a reasonable processor for the item. That's the point. Try to understand context better, and realize what he meant by the best of both worlds.

Assuming your "good processor" falls somewhere between the two tested Core 2 Duo processors, dropping UVD boosts average processor usage by around 42% in Transporter 2, 44% in Yozakura, and 24% in Serenity. So which uses more electricity and generates more heat: the additional transistors needed for UVD on the 2900, or moving your CPU off idle to do the work?

For someone trying to act superior, you need to take a look in the mirror (and the dictionary) for a moment. I agree it's silly to use something like a 2600 and 2900 in the same system. However, if you want the "best of both worlds", let's consider for a minute what that means:

Best (courtesy of Merriam-Webster):
1 : excelling all others
2 : most productive of good: offering or producing the greatest advantage, utility, or satisfaction

So, if you truly want the best of both worlds, what you really want is:

UVD from ATI RV630
3D from NVIDIA G80

Anything less than that is not the "best" anymore (though I'm sure some ATI fans would put R600 3D above G80 for various reasons).

Try ditching the superlatives instead of copping an attitude and constantly defending every post you make. If you want to say that R600 with a fast CPU is more than sufficient for H.264 playback while also providing good 3D performance, you're right. The same goes for G80. If you want to argue that it may have been difficult and not entirely necessary to cram UVD into R600, you can do that, but others will disagree.

Since they were already at something like 700 million transistors, they may have been out of room. That seems very bloated (especially considering the final performance), but how many transistors are required for UVD? I'd say it was certainly possible to get UVD in there, but would the benefit be worth the cost? Given the delays, it probably was best to scrap UVD. However, the resulting product certainly isn't able to offer the best possible feature set in every area. In fact, I'd say it's second in practically every area to other GPUs (G80/G86 and RV630, depending on the feature). As others have pointed out in the past, that's a lot like the NV30 launch.

quote:While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...

Don't you mean ...

quote:While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", HD 2400 and HD 2600 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...

No. The 2400 and 2600 support the Avivo HD feature set even for VC-1 decoding, while the G84 and G86 don't, so the quote is correct, if a little confusing, since Avivo is ATI terminology. Nevertheless, it is basically equivalent to the NVIDIA hardware.

quote:We have to stress here that, in spite of the fact that NVIDIA and AMD expect the inclusion of video decode hardware on their low end hardware to provide significant value to end users, we absolutely cannot recommend current low end graphics card for use in systems where video decode is important. In our eyes, with the inability to provide a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are all only suitable for use in business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

Maybe I am the only one who doesn't understand why they would not recommend a GeForce 8500 for a low-end machine?

This man hit the nail on the head. A couple of months ago I was on the verge of buying a new video card with H.264 acceleration for my HTPC, but upon learning that those features were only enabled for Vista (bleh), I decided not to upgrade at all.

Indeed, which makes it strange that he gave the NVIDIA cards 100% scores! Sure, manual control over the noise filter is nice, but 100% is 100%, Derek. It working badly when set above 75% makes for a less-than-perfect HQV score, IMHO. Personally, I would have knocked 5 points off the NVIDIA cards' noise scores for this.

I would have cut points too, but not because the image quality goes down at 100%. There's no sense in providing a slider if every position on the slider gives the same perfect image, is there?

Giving users a slider, however, isn't very friendly from an average Joe's perspective. I want to drop my movie in the player and watch it, and I want it to look great. I do not want to move a slider around for every movie to get good picture quality. It reminds me of the tracking adjustment on old VHS decks. Quite annoying.

From a technological point of view, yes, NVIDIA's implementation enables players to be great. From a consumer's point of view, it doesn't. I want to watch a movie, not fine-tune my player.
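To make the slider debate concrete, here is a sketch of what a noise-reduction strength control generally does. It is a generic linear blend between the original and a denoised frame, my own illustration rather than NVIDIA's actual algorithm:

```python
def noise_reduction(original, denoised, strength):
    """Blend between original and denoised pixel values.

    strength = 0.0 leaves the frame untouched (film grain preserved),
    strength = 1.0 uses the fully denoised frame (grain scrubbed away).
    Every intermediate position produces a genuinely different image,
    which is why a slider with distinct positions makes sense.
    """
    return [o * (1.0 - strength) + d * strength
            for o, d in zip(original, denoised)]
```

Because every slider position yields a different output, a control like this trades grain preservation against noise removal, which is exactly why reviewers disagree about how to score it.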

It's all about the drivers, people! TechReport did their review with older drivers (at least on the NVIDIA side). So in the past two weeks, NVIDIA apparently addressed some problems and AT took a look at the current results. Probably delayed the article a couple times to rerun tests as well, I bet!

As for the above comment about the slider, what you're failing to realize is that noise reduction impacts the final output. I believe Sin City used a lot of noise intentionally, so if you watch it on ATI hardware the result will NOT be what the director intended. A slider is a bit of a pain, but then being a videophile is also a pain at times. With an imperfect format and imperfect content, we will always have to deal with imperfect solutions. I'd take NVIDIA here as well, unless/until ATI offers the ability to turn off NR.

Hi Derek,
Nice article, although I've just noticed a major omission: you didn't bench any AGP cards! There are AGP versions of the 2600 and 2400, and I think these are very attractive upgrades for AGP HTPC owners, who probably lack the CPU power for full HD. The big question is whether the unidirectional AGP bus is up to the HD decode task. The previous-generation ATI X1900 AGP cards reportedly had problems with HD playback.

Hopefully you'll be able to look into this, as AFAIK no one else has yet.

15% CPU utilization looks great until you find that an E4300 takes so little power that using 50% of it to decode costs only about 25 watts. It is nice seeing things offloaded from the CPU, IF the video card isn't cranking out a lot of heat and power itself.
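The trade-off in the comment above can be put in numbers. The idle and full-load wattages below are assumed for illustration (not measured E4300 figures), and linear scaling of power with utilization is itself a rough approximation:

```python
def decode_power_cost(idle_watts, load_watts, cpu_utilization):
    """Extra CPU power drawn by software decode, assuming power
    scales roughly linearly with utilization between idle and
    full load."""
    return (load_watts - idle_watts) * cpu_utilization

# If an E4300 system draws ~25 W extra between idle and full CPU load
# of ~75 W (assumed), running the CPU at 50% for decode costs ~25 W.
extra = decode_power_cost(25.0, 75.0, 0.50)  # 25.0 W
```

Against that figure you would weigh the power drawn by a GPU's dedicated decode block, which is the comparison the comment is asking for.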

Just my opinion, but I would save the money on PowerDVD if you are buying ATI and just use theirs. PowerDVD is not cheap, and I personally do not like it as much, though I am sure others do. He has to use it, of course, because how else would he be able to test NVIDIA and ATI on the same software? But it's not a trivial expense, and the ATI software works well enough that it seems, to me, an unnecessary one. You might be happier spending that money on hardware instead of PowerDVD. Again, all this assumes an ATI card purchase.

The Pentium 4 560 is a really strange choice; do you think there are a lot of them out there on PCIe platforms waiting to upgrade to one of these cards? It's a minor point, but I think a Pentium D 805 would have been an excellent choice: a lot of people bought them, many on PCIe-based motherboards, and it would be a much more interesting data point.

My next point is the expectation of UVD on the 2900 XT. I totally disagree that this is something they needed to add, because what they are saying is absolutely true: someone who buys this card will almost certainly pair it with a very capable CPU. Since high-end processors are dual core, it is not as if you cannot do something else while the CPU assists with decoding. It's not free: you pay for it in cost, you pay for it in power use, you pay for it in heat, and it's going to be wasted the vast majority of the time. Considering the power use of the 2900 is appalling already, adding to it is highly undesirable given the feature's very questionable usefulness.

I think they should be congratulated for intelligently targeting features for their products, rather than bloating a product with useless features and making people pay for them.

Clearly, the point was to get a single-core point of reference. While admittedly that exact CPU would be a slightly rare case, it's a simple matter to benchmark it since it fits the same Socket 775 motherboard as the two Core 2 chips. A Pentium D 805 wouldn't be much use to compare, as it would simply be a bit slower than the E4300... so what? The P4 560 makes a reasonable proxy for the variety of well-performing single-core Pentium 4s and Athlon 64s out there, while the E4300 stands in for all the X2s.

The Pentium D 805 is a very popular and widely used chip, and it represents an entirely different architecture. It would be an extremely valid data point because it's a popular item. It's not "a little slower"; it has completely different performance characteristics.

A Pentium 4 560 owner will probably never buy this card, and many of these owners are not even on a PCIe platform. I wouldn't even have had a problem if they had used a single-core Sempron, but a Pentium 4 560 makes no sense at all. People are still buying the 805, in fact. You don't think popping one of these cards in with an 805, or a similar Pentium D, while waiting for Penryn to come out is something people think about? Except they won't know how it performs. Luckily, though, they'll know how the Pentium 4 560 performs, because, I'm sure, that is their next choice.

Seeing as this is an article concerning media decoding with an emphasis on HD media playback, shouldn't AnandTech be applying some pressure on NVIDIA to support open drivers for Linux? MythTV and XBMC are promising HTPC options, perfectly suited to this test scenario.

Why should H.264 offloading be exclusive to users of Microsoft operating systems?

Actually, it does. The problem is that it is open source, while the MS equivalent is closed. ATI and NVIDIA don't want to share their specs in an open manner and never came up with a suitable API to make public.

Well, GStreamer allows closed-source plug-ins since it's licensed under the LGPL. Fluendo has already implemented a lot of proprietary (patented) codecs in GStreamer. With the required features exposed through the driver, it shouldn't be too hard for the IHVs to do the same with hardware-accelerated H.264/VC-1.

Most drivers only support it with MPEG-2, but that doesn't mean it isn't capable of more. Looking again, I'm a little unclear about how much work would be required to get it working. I'm not sure if it is completely done and just requires support from the hardware vendors or if it also needs some additional work before that happens.

Hi, it would be really interesting to see similar tests done on Linux as well.

For example, how cheap an HTPC rig can you build, with free software too, that still provides better features than any of the commercial solutions?

I think many of us have some old hardware lying around, and seeing this article brings up ideas. Pairing an old computer with an (AGP?) ATI 2600 card could provide an ideal solution in a nice HTPC chassis under the TV, perhaps?

However, an HTPC can still be built as a player for satellite broadcasts, for example, granted that configuring all that with a subscription card will not be for the faint of heart. But then again, the Dreambox 8000 is not available yet; the only new option is the Kathrein UFS910 decoder, which has no decent software (yet).

Good review. However, based on a review in the German magazine c't, I have some suggestions and additions (tested with PowerDVD patch 2911, Catalyst 7.6, NVIDIA 158.24):
- The GeForce G84/G86 miss not only VC-1 but also MPEG-2 bitstream processing.
- The HD 2400 does not have MPEG-2 bitstream processing, frequency transform, or pixel prediction, or they are not activated.
- A single-core Athlon fares significantly worse than a single-core Pentium 4. The reason is AACS: decryption puts a huge load on the CPU and is optimized for Intel CPUs (9% -> 39% for H.264, Pentium 4, Casino Royale). Perhaps later patches improved the situation (as your Yozakura result suggests?).
- VC-1 on the Radeons and GeForces showed picture distortions, but based on your review this seems to be fixed now.

Why run with older drivers? If these features are important to you, you will need to stay on top of the driver game. It would have been interesting to see AMD chips in there, but that would require a different motherboard as well. I think the use of a P4 560 was perfectly acceptable: it's a low-end CPU, and if it can handle playback with the 2600/8600, then Athlons will be fine as well.