Your charts are all buggered up. Just looking over them: in Crysis: Warhead you test the NVIDIA 9600GT for performance. OK, fine. Then we move along to the power consumption charts, and you swap the 9600GT out for the 9500GT? Better still, we move on to both heat tests, and both of these cards are omitted.

WTH?! Come on guys, is a bit of consistency too much to ask? Reply

Some of those cards are out of Anand's personal collection, and I don't have a matching card. We have near-identical hardware that produces the same performance numbers; however we can't replicate the power/noise/temperature data due to differences in cases and environment.

So I can put his cards in our performance tests, but I can't use his cards for power/temp/noise testing. It's not perfect, but it allows us to bring you the most data we can. Reply

Well, the only real gripe I have here is that I actually own a 9600GT. Since we moved last year and are completely off-grid (solar/wind), I would have liked to compare power consumption between the two without having to actually buy something to find out.

Oh well, nothing can be done about it now I suppose.

I can say, however, that a 9600GT in a P35 system with a Core 2 E6550, 4GB of RAM, and 4 Seagate Barracudas uses ~167-168W idle. While gaming, the most CPU/GPU-intensive games for me were World in Conflict and Hellgate: London; the two games "sucked down" 220-227W at the wall. This system was also moderately overclocked to get the memory and "FSB" running 1:1. These numbers are pretty close but not super accurate, just as close as I can come eyeballing a Kill A Watt while trying to jot down a few figures. The power supply was an 80 Plus 500W variant, manufactured by Seasonic if anyone must know (Antec EarthWatts 500). Reply
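For anyone else off-grid trying to budget around wall readings like these, the arithmetic is trivial to sketch. This just plugs the quoted idle/load figures into a daily watt-hour calculation; the 4 hours of gaming per day is an assumed figure for illustration, not from the post:

```python
# Daily energy use from wall-socket readings (a rough sketch).
# The idle/load watt figures are the ones quoted above; the hours
# of gaming per day is an assumption for illustration.
def daily_energy_wh(idle_w, load_w, load_hours):
    """Watt-hours per day, assuming the box idles whenever it isn't gaming."""
    idle_hours = 24 - load_hours
    return idle_w * idle_hours + load_w * load_hours

print(daily_energy_wh(167, 225, 4))  # -> 4240 (Wh/day), i.e. ~4.24 kWh/day
```

On a solar/wind setup, that ~4.2 kWh/day is the number you'd compare against panel output and battery capacity.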

I don't see the improvement over the last generation's 4550. DX11 is useless in cards this slow (it's almost useless even on a 5670!). I expect this level of performance from a next-gen IGP, not a discrete chip. AMD should have raised the performance bar much higher for this generation, releasing something like a 5450 with 120 SPs, a 5670 with 640 SPs (like the 4770), a 5770 with 960 SPs, and 80 SPs for the IGP. Reply

The improvement is purely in power consumption. They can't improve performance of these lowest-end discrete cards or IGPs too much or they will eat into the value of the next step up (4670, 5670). You may "expect this level of performance" but you aren't gonna get it. Reply

I'd be rather interested in seeing a differential analysis of the power consumption of the two ATI cards (5450/5670) while rendering an H.264-encoded 1080p movie with DTS out via the card, as well as Blu-ray with "HD" audio.
And a video benchmark, to show how much bitrate/fps the respective cards manage before desync, frame drops, or freezes set in.
Power during FurMark (or gaming) is of course higher on the 5670, because it has five times as many shaders to feed; depending on how smart ATI's power management is, the two cards might not differ a lot when used in an HTPC.
And frankly, the 50 dollars extra would be probably worth the extra rendering/decoding horsepower, especially in an HTPC, where you want buttery smooth performance, and not worry about bitrate.
Oh, any news on passive 5670s? If they can do passive 5750s, I'm sure there'll be a few 5670s someday? Reply

Well, considering the extra shader load added by filters, apparently it isn't that light; otherwise the proper deinterlacing algorithm would have been available.

And that still leaves the question of the power draw of a 5670 at 5450 levels of performance. I'm pretty sure that in an HTPC, unless you use it as a console replacement for gaming, there will rarely be a situation where the GPU is fully loaded, hence power draw should be lower than the full-tilt number you published. Reply

Most HTPCs are meant to be small, and there's no way you're going to be able to fit a 5700 series card into a low-profile space. I know they had 4650s last generation, but there aren't any 5650s yet. =/ Reply

I just can't see how reviewers are claiming this card is the perfect HTPC video card. Not everyone uses Microsoft's Media Center. The lack of VDPAU support for Linux is a glaring hole in all ATI cards. If I were building a gaming PC today I would probably buy an ATI 5870, but if you're building a low-power HTPC you can't go past NVIDIA and VDPAU support. Take their ION platform: it will do 1080p in hardware on Linux for less than 30 watts, which beats this ATI offering hands down once you add in the motherboard and CPU.
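For what it's worth, getting VDPAU going under Linux at the time was a couple of config lines. A minimal sketch for MPlayer, assuming a build compiled with VDPAU support (these are the standard MPlayer option names):

```
# ~/.mplayer/config -- offload decoding to the GPU via VDPAU.
# ffh264vdpau / ffmpeg12vdpau are MPlayer's VDPAU codec entries;
# the trailing comma lets it fall back to software decoders.
vo=vdpau
vc=ffh264vdpau,ffmpeg12vdpau,
```

Running `vdpauinfo` first confirms which decoder profiles the driver actually exposes.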

XvBA is also said to be working, and it also acts as a backend to VA-API, just like VDPAU. They are even examining the legal situation and whether the UVD specs can be opened up to be supported within the OSS (xf86-video-ati) driver. Of course, none of that will be done by tomorrow.

What I think would be worth mentioning when comparing NVIDIA and ATI for HTPC use is that UVD won't play H.264 higher than level 4.1. NVIDIA's PureVideo is capable of decoding up to level 5.1. Reply

The Evergreen series does apply angle independent anisotropic filtering. Also the fixed function interpolators have been removed and moved to the shaders.
Considering the limited power of the HD 5450, this causes a bigger performance drop compared to the other Evergreen products. Reply

It's an artifact of stepping through the video one frame at a time with MPC-HC with the MS MPEG-2 decoder. Doing so captures the angled lines correctly, but it doesn't quite capture some of the other artifacts exactly the same because it ends up a field (basically half a frame) ahead.

Congrats Ryan, your Cheese Slices testing outside of a HTPC forum was really a big surprise for me! Keep on pushing AMD/Nvidia to realize that there's more than gamers in this world!

> Horizontal 1p lines missing:
I'm also wondering; normally the 1p lines are no problem in screenshots (MPC-HC and others). Maybe you shot them during the "odd movement" (1h+3v, see the description in the Cheese Slices thread). I have now edited Post #1: screenshots only during the "even movement".

...then this might call for an article to look at it in that very light. However, is there a higher performing part for both series that can be directly compared? I'm guessing not, as they all differ in some way, be it shader numbers, ROPs or texture units. The only way I can think to do it would be comparing two 512MB 4850s to a downclocked 5870, but even if that were possible, you'll still get a performance drop due to Crossfire. Hmm. Reply

Testing a Clarkie requires switching out our test rig, so the results wouldn't be directly comparable, since it means swapping everything, including the processor. Plus, Anand and I only have a couple of Clarkies, which are currently in use for other projects.

At this point a Clarkie (and any other IGP) is still less than half as fast as a 5450. Reply

That brings up the point though that with a card this low on the totem pole it might be nice to include a benchmark or two of it paired with similarly low-priced hardware. I understand the reason for generally using the same testbed, but when it is already borderline playable it would be nice to know that it won't get any slower when actually paired with a cheap processor and motherboard. Reply

Hey Ryan, I'm glad you are the first reviewer to utilize Blaubart's very helpful deinterlacing benchmark. I would just like to note that with ATI, memory bandwidth seems to play a big part in the deinterlacing method as well. For example, the HD 4650 DDR2 can only perform MA deinterlacing, even though it has the same shaders as the (VA-capable) 4670; the only bottleneck there seems to be the DDR2 memory bandwidth. On the other hand, the HD 4550, though it has DDR3, is limited to a 64-bit memory interface, so that seems to be the limiting factor.

I have an old HD 2600 Pro DDR2 AGP. When I OC the memory from the stock 800MHz to 1000MHz, VA gets activated by CCC and is confirmed in Cheese Slices.

Nvidia's deinterlacing algorithms seem to be less memory-intensive, as even the GT220 with DDR2 is able to perform VA-like deinterlacing. Reply
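The bandwidth argument is easy to put rough numbers on. A quick sketch; the bus widths and data rates below are illustrative round numbers, not measured card specs:

```python
# Peak memory bandwidth in GB/s from bus width and effective data rate.
def mem_bandwidth_gbs(bus_bits, data_rate_mtps):
    """bus_bits/8 bytes per transfer times mega-transfers per second."""
    return bus_bits / 8 * data_rate_mtps / 1000

print(mem_bandwidth_gbs(128, 800))   # 128-bit DDR2-800:  12.8 GB/s
print(mem_bandwidth_gbs(64, 1600))   # 64-bit  DDR3-1600: 12.8 GB/s
```

Which is why a narrow DDR3 card can end up no better off than a wider DDR2 one.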

That is one of the things that changed. However it's not very resource intensive for the shaders (which is one of the reasons why it was moved in the first place) and I don't seriously suspect that's the cause. Reply

So far, all the 5000 series cards have an issue with PowerPlay. It messes with audio, especially Dolby Digital and DTS tracks, when DXVA is disabled. When DXVA is enabled the clocks are stabilized (at 400MHz GPU and 900MHz memory for the 5770), so PowerPlay doesn't screw with the audio. Without DXVA, the clocks are all over the place (PowerPlay working as intended, normally a good thing), and this causes audio dropouts with DD and DTS tracks.

Could you test this, with HD videos and DD/DTS tracks? Maybe you'll have a better chance of getting this fixed than a bunch of us just dealing with their horrible support.

Powerplay also triggers funky sound when HDMI is used and 5.1 or 7.1 24-bit 96 kHz or 192 kHz output is set on Windows. Just set it like that and go about your business and you'll hear either crackling or crazy channel switching.

See here for reference, and the following posts of other users who confirm it, and even come up with their own ways to disable Powerplay. This thread at Doom9 was where it was discovered, and later confirmed by nearly everyone who tried (except one strange case or two). Reply

[quote]
So far, all the 5000 series cards have an issue with PowerPlay. It messes with audio, especially Dolby Digital and DTS tracks, when DXVA is disabled. When DXVA is enabled the clocks are stabilized (at 400MHz GPU and 900MHz memory for the 5770), so PowerPlay doesn't screw with the audio. Without DXVA, the clocks are all over the place (PowerPlay working as intended, normally a good thing), and this causes audio dropouts with DD and DTS tracks.
[/quote]

1. ATI and the review sites, including Anandtech, have been ignoring this horrible problem in these cards, which makes them, for all practical purposes, useless.

But it does not stop there.

2. The problem you have mentioned with the 57xx series creates SERIOUS DPC latencies, especially in XP, that bring even the fastest systems to a crawl, and all audio is full of glitches and clicks.

3. To add insult to injury, 2D performance is the worst EVER seen on a PC, beaten even by IGPs.

Bottom line: Anandtech has yet again failed to detect and report these issues, so I would not expect any replies to your questions here.

If you do any sort of regular graphics work (and ignore the "if", it applies to everyone), it's all 2D.

After looking hopelessly for ANY solution to, or even recognition of, this problem from professionals, so I could share it, I found there is only one site with staff professional and unbiased enough to note the problem.

Not only did they notice it, they published two giant articles about it, and it is painfully, obviously, certainly NOT Anandtech.

It is a big issue, but I'm not sure if it's a hardware problem. It's probably a driver thing.

In any case, you can disable PowerPlay for the time being. I've done what I linked above and it's working acceptably for me; when I play a game, I switch profiles to enable the higher clocks. I'm not sure how it would work on XP, or whether the procedure is the same, but you can also use the GPU Clock Tool to stabilize the clocks. Reply

I have a hard time seeing AMD giving this much attention; the number of users concerned with this issue is infinitesimal.

Why overclock your GPU when you are focused on the audiophile features? I'll be shocked if the official response is anything other than "Graphics card audio output not supported when Powerplay enabled". Reply

Oh, and BTW, as the poster above said, it's not an "audiophile" issue. I actually try to distance myself from that term as much as possible. It happens whenever DXVA is not enabled, with DD and DTS audio, as in when playing DVDs with (say) PowerDVD and its post-processing features. Pretty normal scenarios. And it's not a subtle thing: it's dropouts (pops or cut-outs in the audio). Also, choppy Flash video (though that shouldn't happen with DXVA-accelerated video in Flash 10.1).

Powerplay also triggers horrible crackling and channel switching when output is set to multichannel (5.1 or 7.1) 96 kHz or 192 kHz audio for the Windows mixer. Hardly audiophile issues at all, any of these. Reply

It's not overclocking at all. Powerplay is, as the poster above said, for power efficiency only. It actually doesn't overclock at all, but underclocks when the GPU is not being stressed.

If you're referring to one of the posts that requires you to enable Overdrive, notice that it's only being enabled so you can stabilize the clocks (and thus effectively disable PowerPlay); the GPU/memory are actually being underclocked by messing with an XML file and lowering the clocks manually via Overdrive. Reply
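For reference, the Doom9 workaround amounts to pinning all the PowerPlay performance states to the same clocks in Catalyst's Profiles.xml. The fragment below is a sketch reconstructed from those forum posts; the exact element names and the file's location can vary between Catalyst versions, so treat it as illustrative only:

```xml
<!-- Fragment of ATI\ACE\Profiles.xml (illustrative; names vary by driver).
     Clock values are in units of 10 kHz, so 40000 = 400 MHz core and
     90000 = 900 MHz memory, matching the stabilized 5770 clocks quoted
     above. Setting Want_0/1/2 identical keeps the clocks from ramping. -->
<Feature name="CoreClockTarget_0">
  <Property name="Want_0" value="40000" />
  <Property name="Want_1" value="40000" />
  <Property name="Want_2" value="40000" />
</Feature>
<Feature name="MemoryClockTarget_0">
  <Property name="Want_0" value="90000" />
  <Property name="Want_1" value="90000" />
  <Property name="Want_2" value="90000" />
</Feature>
```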

First of all, it's not really an "audiophile feature" to get audio without dropouts and other problems over HDMI; it's devastating for the audio whether you are an audiophile or not. Secondly, PowerPlay is there for power efficiency. The result is that HDMI audio doesn't work with default settings for many people, which is a pretty major issue.

So far the most reasonable explanation I've seen by googling is someone in a forum suggesting that its function is just to disable certain features so as to prioritize smooth playback over those features. I don't see any difference with the 5770 otherwise (with that card it doesn't disable anything). Reply

I know this isn't exactly supposed to be a fast card, but it's clocked ~10% faster yet it's slower than the last-generation card... I can't say I'm surprised though, after seeing the 5770 clocked faster than the 4870 yet coming in at around the same speed. Reply

I think people are missing a lot of the big picture here, and that's CrossFire with the Radeon 54xx series. Specifically, with the new 8-series chipsets (hence the number of shaders present), I expect a return of Hybrid CrossFire.

Pairing an IGP with a low-end card is a very cost effective solution to getting more performance out of a system and also gives AMD an edge in getting more people to buy an AMD Processor+Chipset+Graphics card. Reply

"Me too." I'd dig a 785G Hybrid CrossFire review with this and the other sub-$100 Radeons (if they support CrossFire). The 785G was a great motherboard to match with the Athlon II and Phenom II X2/X3; some of us were waiting on video card purchases and would like to see CrossFire 54xx/56xx compared to a discrete 5750, for example. Reply

The 80-shader discrete Radeons are just too limited by 64-bit DDR3. The 785G has more bandwidth and the same number of ROPs (mine even runs fine at 1GHz). If they had cut the power usage of the 5450 down a lot more, it might have made some sense. Reply

The article's title states: "The Next Step In HTPC Video Cards" (Home Theater Personal Computer video cards?). Am I right in this assumption? If so, then why are the benchmarks oriented more towards gaming? Obviously this is no gaming card. Please give a true outlook as to how this card performs connected to a TV. Is it OK? Is it junk? My intent for this card would be a connection to a plasma or perhaps LCD display. Streaming video over the internet, such as Netflix or Hulu, would be one concern; playback through a Blu-ray player on the PC another. Perhaps my perception of this card and the subsequent review are irrelevant, but please don't game-test a card that's meant for something entirely different. Perhaps I have misconstrued the intent of this article; if so, I apologize. Reply