$70 for a card that can't play any modern games, or most older games...no thanks.

But it has DX11, that is a good...

Kind of makes me wonder why anyone would buy this card over one of the cheaper HD 5450s in the $40-50 range, which are also low-profile cards. Granted, those would only have 512MB of RAM, but what does that matter on a card of this class?

I realize that these cards are aimed more at HTPC use than gaming. Personally, though, if I were buying a card for an HTPC, I'd get a $20 8400GS. It will do everything I need from an HTPC card; there's no point in spending more money on something better. Power consumption might be higher with the 8400GS (I don't know exactly how much power one really uses), but it wouldn't be a whole lot more.

Or maybe even better, ATI could release an HD5350 with 40 stream processors in the $20 range that consumes even less power. That would be a killer HTPC card!

Absolute silence without compromising performance is my only benchmark score.

It's all marketing. More is better. Stick 2GB of memory on a rotten GPU and people will buy it like crazy. In my personal experience from a retail store where casual people buy computers, the first thing they always ask is how much memory is on the graphics card. Or they rush in asking if we have a computer with X GB of memory on the graphics card. Then, after I explain to them that the type of core is the most important thing, they finally get it. Mostly. At least the good thing now is that all the cards have 1GB... and they're all GeForce 9500GTs. They only ask about memory, as if nothing else matters. Cruel reality.

Believe it or not, I am 100% sure that this will sell more than the 512MB version. Everybody in the graphics card industry confirms that.

Oh, I'm sure of it. I've had people buy cards, even against my advice, simply because they had more memory, even going so far as getting an HD4350 w/ 1GB DDR2 for $60 over an HD4650 w/ 512MB GDDR3 for the same price...:shadedshu

Nice review, and good point about the memory. Honestly, why does a bottom-range card need as much memory as my 5870? The only reason a card needs that much is to handle game resolutions higher than 1024x768, which is something this card can't do anyway. The only reasons I see to buy this card are older games (which run fine on 512MB or less) or an HTPC, which is probably what I'd use it for when I finally get around to building one.

One thing though: why list 'No support for CUDA/PhysX' as a con? It's like listing 'No support for CCC' or 'No support for ATI Stream' on an NVIDIA card. CUDA isn't something I've missed since moving to ATI, and I suspect that once they finally get Bullet and the like working on OpenCL, CUDA will be old news...

I guess it's a card targeted for those who want just something better than onboard graphics.

Precisely. The thing is, though, modern integrated graphics are pretty decent. I believe NVIDIA are now doing 9400-class GPUs onboard, and this isn't much better; it just does DX11 (which can't really be utilised because the card is so weak). I'd only use it over integrated graphics if the integrated chip was something really, really old, like an Intel 945.

13W at full load is impressive indeed, but its overall performance is just too low.

I'd agree with that comment. It seems weird to see hardware PhysX/CUDA listed as a negative when hardware PhysX is so underused, and ATI has Stream (even if it's less widely used). Seriously, PhysX (around 15 game titles use it in hardware) and CoreAVC aside, who even uses CUDA?

My only other question, and it's mere curiosity: is 3DMark03 still relevant for power consumption testing? Wouldn't a more strenuous test (DX10? DX11?) use more of the GPU, and therefore more power? Kind of like how FurMark throws my wattage another 100W up compared to any game.

I agree FurMark is more stressful and that 3DMark03 probably isn't accurate anymore, but the thing with FurMark is that it's designed to find the absolute maximum under worst-case conditions. That's good to know, but more often than not you'll be way under it. I reckon using 3DMark06 or Vantage to measure power consumption would be smarter.