I am afraid the market is too slow to react to nVidia having worse products; AMD has nowhere near the market share it deserves.

We can't expect one player to dominate all the time. So when the underdog creates superior products, it should benefit from it. But this is not the case in the GPU market, unfortunately, as nVidia still keeps a much bigger market share than AMD.

I've tried quite a few ATI/AMD cards over the years, including the latest 5000 series, and to date not a single one of them worked right, i.e. without constantly crashing Windows. That could be one reason.

I agree, and I have also used ATI and AMD graphics over the years. AMD writes the worst software and drivers of any reputable company. I go with nVidia because I care about reliability and stability. I don't mind spending money on nVidia graphics because the money goes towards software development. The price of AMD graphics is too low to fund enough software development.

I have personally found nVidia cards to have inferior hardware quality. This was very evident back when quality DACs for VGA output mattered, and nVidia cards absolutely sucked at that. Further suboptimal decisions made their cards meh.

Software-wise, I thought nVidia's software quality peaked around the time of the Detonator drivers.

DACs depended on the maker of the card. Quadro NVS cards that were made by NVIDIA were regarded as having excellent 2D image quality over analog displays. Sadly, a lot of NVIDIA partners used cheap DACs on some of their cards.

Perhaps you need to read a bit more and see how many thousands have recently been affected by this awesome nVidia reliability and stability, when they all had to throw away their graphics cards and laptops.

Beats me. If it weren't for lucky people like you, I wouldn't have tried ATI yet again. I thought it was just a case of bad luck for me. But now I'm seriously reluctant. They simply do not work for me. If some ATI executive believes that it's unusual and wants to get to the bottom of it, then I'm here waiting.

Work in IT and you will understand. Ever try to put a Radeon in the same system as a FireGL? You can't, as they both use the same name for their .sys file and neither driver is compatible with both cards. BSOD on boot. Going back a bit, I had an environment with over 100 ATI Rage 128s. Because of slight differences in the models I required 18 different drivers. The wrong driver would cause BSODs any time an app with a 3D button opened. In my current environment we use no ATI video cards, as we have driver issues in our multiple-monitor setups. We have pulled all ATI cards and replaced them with Nvidia cards, and our driver troubles have gone away.

Not to say Nvidia is perfect, but at least I can get more than one of them to work together reliably.

It wasn't that long ago that ATI drivers were renowned for being terrible, while one of the many benefits of nVidia was its unified driver architecture. That being said, I've recently worked with the AMD/ATI FirePro series. Not for gaming, but for high-end multi-display setups. The cards and drivers have been flawless, not at all what I expected. The FirePros were selected because they were half-height, passively cooled, and quad-display. There was no equivalent nVidia card. Kudos to the AMD folks for winning this round!

I've used Nvidia when I found them to have the better price / performance and AMD other times.

I've had more issues with Nvidia than with AMD (ATI).

I have seen a lot of people have issues with both companies' cards.

I do, however, like that AMD releases drivers each month and makes an effort to improve. Nvidia seems to release drivers sporadically.

There is also one thing that annoys me very much about Nvidia. I cannot buy an Nvidia graphics card to use for CUDA while using an AMD graphics card as primary; Nvidia will shut down that functionality. So in my view, Nvidia is a bit more evil as a company.

Ahh, the good ol' days with ATI... Waiting every month for a driver update, uninstalling all drivers & software for my AIW one by one, reinstalling one by one, with reboots, checking out all the new bugs, writing ATI's support, uninstalling, reinstalling, manually uninstalling everything, reinstalling, reinstalling all other PC drivers, uninstalling, reinstalling, trying suggested combinations of new and old drivers, uninstalling, reinstalling, and by the time they're out of suggestions, there's a new driver release! I'm sooo over that.

You were on an All-In-Wonder! What did you expect? Talk about niche! Look, I totally agree ATI has done many horrible things with their drivers in the past. But for most of the AIW's life it was the only real TV card out there in the consumer market. The feature set was WAY beyond everyone else anywhere near that price point. Also, I used an NVIDIA card that competed with the AIW and it was just as annoying driver-wise.

Among others. I don't agree it's such a niche product; they've sold quite a few, and it's not like it's an exotic combination of hardware. I agree that they had the best feature set; they just kept crashing my systems. I can't recall a single time that I left it to record a show unattended and it actually did.

Good question. I find it odd how all the ATI cards somehow failed every time, yet the Nvidia cards worked flawlessly. Yet, for my first ATI purchase in 8-9 years, the card works flawlessly. No driver issues whatsoever. Let's say that I'm a little bit skeptical.

Then you'll lose your money. I'm in no way a novice and have installed tons of hardware with no issues. There's really no way for me to go wrong installing a graphics card in a brand-new install of Windows. Again, I call on ATI to investigate. Be my guest.

That would be a lot to recall, searching for old gear and receipts and details just to look more reliable here. AMD also seems to have deleted my old ATI service account with nearly all the correspondence. I will do it if I know it goes someplace useful. The errors varied with card, driver version, MMC version and Windows version. Each card came with its own long story. There's just too much to list from probably over a decade. The PCs, Windows versions, and cards were from all of that time range and from different manufacturers (except Windows...). Cards by ATI, Sapphire, Hercules? oem by mic? (that's what the sticker here says). I was mostly bugged by BSODs and other forms of crashes and freezes with all the cards. One ended up dying; another deteriorated rapidly and, if installed, would probably die soon too. But there are many lesser bugs too. Change driver/Windows/MMC version and you get more/fewer crashes and some strange new bugs.

I didn't buy them all one after the other. Time passed and I was thinking: it's working for other people, it got nice reviews, it's a different card than what I previously had, a different card maker, a different PC, a different Windows version, so why shouldn't it work? A couple I got preinstalled, without much choice. And with the first ATI card or two... well, I wasn't aware yet. Now I'm accused of giving them too much credit.

Interesting... Seems you just have bad luck with ATI for some reason. I'd chalk it up to crappy board partners. I think it's telling when companies like Hercules are no longer around. Also, at the time, ATI and 3dfx (now owned by Nvidia) were lending out their tech to every fool with the potential to manufacture cards. There were a number of crap products during those times, so I wouldn't be surprised if you experienced some issues. What was your problem with the 5xxx series, though?

With a fresh new Windows 7 install it would, for example, never wake up right from sleep: it may permanently freeze, or refuse to log in, or log in but keep showing the log-in screen and stop responding; artifacts may also show up. In XP none of that happens, but frequently and sporadically the VPU would crash and get restarted, with no need to stress the GPU or CPU at all. It would also break the anti-keylogging software. Take the card out, and everything is back to normal.

Well, obviously they're working for most people. The question was why so many people buy nVidia, and I gave one reason, my personal one: not because I love nVidia, but because the comparable AMD does not work for me. Give me any third competitor, and I'll give them a try. :)

It's not ATI/AMD. You may have happened to get a bunch of bad AMD/ATI cards yourself (in which case you would be extremely unlucky, and I actually highly doubt that is the case), but in general they are NOT bad. I have run both, and even 3dfx back in the day. Currently my box at work uses a 4350, 100% stable (4GHz i7/X58), never blue-screens, and my PC at home has had MANY configs (single 4850, dual 4850, single 4870, single 4350, etc.). NONE of them caused stability problems, and I have run many, many driver versions over time. I am not insulting you or your intelligence, but I am saying that if you are getting blue screens in Windows these days, you have bad hardware or a bad driver. The ATI driver is not bad/broken, period.

I'm not even a fanboy; I have owned products from both vendors, and my next card may be an ATI/AMD one or an nVidia one. I haven't decided yet, and I don't really consider stability to be part of the decision, as neither one of these companies makes straight-up unreliable stuff. In fact, my last card was an 8800GT, and before that two ATI X1950 Pros in CrossFire, then a single X1950 Pro, a GeForce4 Ti 4200, a GeForce3 Ti 200, a Voodoo5 5500, a Voodoo3 3500, and many cards before that too. That's just my main machine...

There IS something bad in your machine. Don't take it as an insult; be mature about it and just figure it out, dude.

I remember years ago I was working on a machine that used RAMBUS, and I don't know if you are familiar with those, but they required terminators in unused RAM slots. It had stability issues; I tried everything and eventually pretty much swapped out all the hardware. It ended up being bad terminators in the empty RAM slots. Weird stuff can and does happen with computers.

All of my machines plus a few others being defective in a way that would not allow them to accept ATI, but only competing brands? That's even less likely.

If your car keeps breaking while driving road X, and nowhere else, then at first you say it's a coincidence, then you look for a reason and decide it's bad luck, but eventually you just avoid that road. Even if many other drivers are happy with it. It's not worth it.

Makes you wonder how good your IT skills are... I've had tons of ATI and Nvidia cards and never had issues with either. Sure, both of them have had a few glitches, but that is more due to MS and the drivers.

By the way, it's often due to a crappy OEM implementation of the vendor driver. Ever thought of that?

I bought a passive AMD 5750 which I use in my HTPC, mainly to gain Blu-ray audio bitstreaming. I have to say, AMD still has a lot of work to do on their drivers. As much as I love the capabilities of this card, I spent 5 days getting it to work properly and output the bitstreamed audio. I think that is probably a large reason why their market share hasn't grown. It should have just worked after I installed the latest drivers. But no: I had to download a specific version (not the latest), also download audio drivers from ANOTHER company entirely (not AMD), get a specific version of those drivers (not the latest), and install them in the proper order (i.e. install the third-party drivers after I installed AMD's drivers)... It was hoop after hoop after hoop. A normal consumer would have simply taken the card back as "broken".

If image quality is so important, you wouldn't watch anything but 1080p24 Blu-Ray in the first place! :)

But seriously, in the days of DVD, HQV tests were of higher importance because DVD was 480i, while movies and television shows were typically sourced on film at 24 fps. With Blu-ray, you get to watch TV and film in the native cadence with no deinterlacing, and you certainly don't want resolution scaling or conversion of any kind. Any attempt to artificially smooth or sharpen the image would adversely affect image quality.

The most important feature to home theater enthusiasts is the purest representation of the original source, without enhancements or filters.

I am sure there are plenty of HTPC users who shoot using camcorders like the Flip or the Playsport. Not all of those videos are in 60 fps. 30 fps videos need 2:2 pulldown handling, and cadence detection helps a lot here.

In addition, Blu-Ray also allows 1080i videos. Of course, if you have a video processor, source direct is best. However, a good HTPC is supposed to make a video processor redundant.

We have mentioned in the review that power users can always work around the unimplemented features by doing a 'source direct' playback.
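For readers unfamiliar with the terminology in this sub-thread, here is a minimal Python sketch of what 2:2 pulldown removal amounts to. It is illustrative only: real drivers must first detect the cadence from pixel correlation between fields, and they operate on actual video fields rather than the toy row lists used here.

```python
def split_frame(frame):
    """Split a progressive frame (list of rows) into its two fields."""
    top = frame[0::2]      # even-numbered rows = top field
    bottom = frame[1::2]   # odd-numbered rows = bottom field
    return top, bottom

def weave(top, bottom):
    """Recombine two fields of the same frame into a progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame

def remove_22_pulldown(fields):
    """Given a field sequence with 2:2 cadence [T0, B0, T1, B1, ...],
    weave adjacent field pairs back into the original progressive frames
    instead of deinterlacing them (which would lose vertical detail)."""
    return [weave(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]

# Tiny 4-row "frames" stand in for real video frames.
frames = [[f"f{n}r{r}" for r in range(4)] for n in range(3)]
fields = []
for f in frames:
    t, b = split_frame(f)
    fields.extend([t, b])

# With the cadence recognized, reconstruction is lossless.
assert remove_22_pulldown(fields) == frames
```

The point the comment makes is exactly this: once the 2:2 cadence is recognized, weaving is a perfect inverse of the field split, so failing to detect it (and deinterlacing instead) is a pure quality loss.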

I have a GeForce G210 for my media center and it runs Blu-rays well. I am looking for a little more oomph to allow some post-processing of other videos. But I paid $35 for my card, and the only way I would replace it is if the new one were under $50.

But... a quick Froogle search found they want around $75 for it. Gigabyte offers a passive cooler; looks pretty badass.

Hmm, on page 1 it says roughly the same die size as the "Juniper GPU in the 5500/5600 families"; that should be Redwood. Also, the "GT430 goes up against... GT430..." bit, I guess that should be GT240? I'm really wondering how they attach 4 ROPs to two memory partitions, btw. I believe that one quad-ROP per partition wouldn't really have required a lot of changes over one octo-ROP per partition, but either the quad-ROP block was split into 2 or it's actually attached to both MCs. Of course, for actual color fillrate, it doesn't really matter if there are 4 or 8 ROPs; pixel output is limited to 2 per SM anyway.
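The commenter's fillrate point can be checked with back-of-the-envelope arithmetic. The figures below (2 SMs, 4 ROPs, roughly 700 MHz core clock for GF108) are assumptions taken from published GT 430 specs, not from the article itself:

```python
# Rough color-fillrate arithmetic behind the "4 vs 8 ROPs doesn't
# matter" point.  All figures are assumed GF108/GT 430 specs.

CORE_CLOCK_HZ = 700e6
NUM_SMS = 2
NUM_ROPS = 4
PIXELS_PER_SM_PER_CLOCK = 2   # SM pixel-export limit cited above
PIXELS_PER_ROP_PER_CLOCK = 1  # one color pixel per ROP per clock

sm_export_rate = NUM_SMS * PIXELS_PER_SM_PER_CLOCK * CORE_CLOCK_HZ
rop_rate = NUM_ROPS * PIXELS_PER_ROP_PER_CLOCK * CORE_CLOCK_HZ

print(f"SM export limit : {sm_export_rate / 1e9:.1f} Gpixel/s")
print(f"ROP throughput  : {rop_rate / 1e9:.1f} Gpixel/s")
# Both work out to 2.8 Gpixel/s, so doubling the ROPs alone would not
# raise peak color fillrate; the SM export rate would still be the cap.
```

Under these assumed numbers the two limits coincide, which is consistent with the claim that 4 vs. 8 ROPs is irrelevant for color fillrate (though more ROPs could still help blending or Z throughput).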

1. Silent
2. Consumes very little power
3. Can decode all current and pending video formats/containers
4. Fits half-height as well
5. Has rock-solid driver support
6. Cheap
7. Bitstreams audio and includes a full version of Blu-ray software to support it

This card just doesn't measure up. Ideally the best HTPC card isn't a card, but is actually part of the motherboard.

I cannot agree with your conclusions, and I'm an ATI/AMD fan. I will not touch Fermi with a 10-foot pole, but to say from your benchmarks that the 430 is not competitive with the 5570 is just plain wrong. Let's do the numbers.

On power consumption: I personally feel that lower idle power consumption is more important than load, as my system would sit idle way more frequently than it would be under load, but that depends on the user: Draw or +1 Nvidia

Noise: 0 | +1

Totals: AMD: 5 | Nvidia: 5

Looks dead even to me; both cards would make great HTPC cards. I find a lot of image quality benchmarks are highly subjective and would fail under a double-blind test. It's like asking an audiophile for advice on whether you should use MP3s or FLACs: very few people can tell the difference between the two, and unless you're highly tuned to it, it's not noticeable. Not to mention I haven't even gotten to the fact that it's an AMD-made benchmark; I wouldn't put any weight in an Nvidia benchmark for an ATI card. I've made 0 purchases based on 3DMark Vantage scores that handicap Nvidia a ridiculous amount due to the GPU compute portion of the benchmark; otherwise I'd own no AMD cards.

PLEASE DEFEND YOUR POSITION.

Full disclosure:
10% of my portfolio is AMD
0.0% of my portfolio is Nvidia

Please read the article correctly. It clearly states that it is on par with the 5570 but fails every time against the 5670.

Given the pricing of this card it is not out of the question to compare it against the 5670 in gaming tests.

This card at its current price point is a huge failure for Nvidia.

The companies banking on 3D tech are going to have a hard time in the near future because consumers aren't buying into it. Heck, I've already read about GIGANTIC price cuts coming in the LCD market next month because supply is through the roof and no one is buying.

Well, that would be true if those points were all equal. However, they are not, especially in regards to image quality, where the nVidia card lost horribly in a test for the very market this card is specifically aimed at.

As for gaming, you can pick up 5670s for the same price as the 430, and the 5570 for less. So the 5670 is really the card that nVidia is up against, and it easily beat the 430 in every single performance test.

The lower idle power/temps of the 430 are nice to have, but not if it means significantly worse performance in other areas.

in which it states that Nvidia is in an unenviable position HTPC-wise because the 5570 is superior to the 430. Which is not true based on their own benchmarks. Of course the 5670 is better, but that's not the card Nvidia is positioning against it; that will be the GT 440's or maybe the 435's job.

As for AMD's image quality test, that is not the best test/benchmark they had at their disposal. A double-blind comparison, where you play two clips one after the other to random people and ask a similar/better/worse question, is the best test for something as subjective as image quality. Using an AMD test to judge any card is inherently biased in AMD's favor. Their conclusion is wrong.
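A minimal sketch of the double-blind protocol proposed in this comment, assuming each viewer returns a verdict of 'first', 'second', or 'same' per paired showing. The clip objects and viewer callbacks are hypothetical stand-ins, not any real testing harness:

```python
import random

def run_double_blind(clips, viewers, rng=None):
    """clips: dict like {"A": clip_from_card_A, "B": clip_from_card_B}.
    viewers: callables taking (clip1, clip2) and returning 'first',
    'second', or 'same'.  Presentation order is shuffled per trial so
    neither the viewer nor the presenter knows which card made which clip."""
    rng = rng or random.Random(0)
    score = {"A": 0, "B": 0, "same": 0}
    for view in viewers:
        order = ["A", "B"]
        rng.shuffle(order)                      # blind the ordering
        verdict = view(clips[order[0]], clips[order[1]])
        if verdict == "same":
            score["same"] += 1
        else:
            # Map the positional verdict back to the hidden label.
            winner = order[0] if verdict == "first" else order[1]
            score[winner] += 1
    return score

# Example: ten viewers who genuinely prefer the "good" rendering will
# credit card A regardless of which position it was shown in.
viewers = [lambda c1, c2: "first" if c1 == "good" else "second"
           for _ in range(10)]
result = run_double_blind({"A": "good", "B": "bad"}, viewers)
print(result)  # card A collects all ten verdicts
```

The key design point is that the verdict is positional ('first'/'second') and only unblinded afterwards, so a real preference survives the shuffling while label bias cannot.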

That just leaves your basic reasoning, and you've failed even more miserably here. Weighing the performance of something like "Battlefield 2" equally with video "image quality" to reach your conclusion (i.e. "looks dead even to me") is just plain idiotic, considering we're talking about the HTPC market, where image quality is more important than video game frame rates. Which is pretty frigging obvious... Note that I'm not calling you idiotic, just your reasoning.

You see a lot of crap on the internet that you ignore, but the thing is, your memo was written in such a serious way, even including a disclaimer as well as a DEMAND in caps, that you've just got to laugh, because the end result comes across as someone trying too hard to look smart, and failing.

The 5570 and the 430 are priced exactly the same, and their performance, for all intents and purposes, is exactly the same. ATI and Nvidia are both taking the same position in this category, so why is ATI's position superior? Because an Nvidia card doesn't perform as well as an ATI card on an ATI-created benchmark. That is a poor reason to state that one card is superior to another.

As I stated previously, a metric like image quality is purely subjective, unlike an FPS score, which is objective. What looks like garbage to one person looks great to another.

Regarding my intelligence: my argument is correct regardless of how smart I am. Furthermore, I could care less how smart anyone here thinks I am; for all I care, you can hold a mental image of me squatting in mud, smiling as I shove berries up my nose.

Well, you answered, which means you do care, which is why I fully agree when you say that you could care less how smart anyone thinks you are.

I'm not going to bother reasoning out your obsession with video game FPS as the main 'objective' measure of an HTPC card while dismissing all else, as it's lost on you. Instead, let's try it the manno way: your argument is not correct because I say so.

Don't take this the wrong way, but just because someone responds to a forum post does not mean they care what anybody thinks of them.

From the article:

"Whether it’s by NVIDIA’s design or matters out of their hands, GT 430 simply isn’t competitive with AMD’s 5570 and 5670 in gaming performance, with the latter cleaning the GT 430’s clock every single time. NVIDIA isn’t pushing the GT 430 as a gaming performance card so we aren’t going to recommend it as one. If you need budget gaming, then the only choice to make is to go AMD."

Again, for all intents and purposes, the 430 and the 5570 are the same card. Same performance, same price point; one is not superior to the other. Of course the 5670 is better; it's aimed at a different segment. My issue with the article isn't with you, it's with Ryan Smith & Ganesh T S, who draw the wrong conclusion. From the article:

"We always hate to rely so much on a single benchmark, but at this point HQV 2 provides the best tests we can get our hands on."

This is wrong. To test image quality, the best test they could use is a subjective one, i.e. a double-blind test where they play the same clip on the two different platforms to random test subjects. Using AMD's IQ benchmarks to judge any card is inherently biased; I don't care what AMD says. Just like using Nvidia's CUDA benchmarks.

The article's conclusion regarding the 430 vs. the 5570 is wrong for the time being; do a double-blind test and revisit it. As for the 430 vs. 5570 gaming performance conclusion, well, that's just plain wrong.

Well, manno, that is the trolliest behavior I have ever witnessed on the subject of $79 video cards.

Assuming you are NOT a troll, then Tom's Hardware and HardOCP are also wrong in the comparison with the 5570. Hardly.

Since AMD's 785G there has been no need for a discrete video card in order to play Blu-rays, except in 2 situations:

The very few that own a high-end sound system ($1500+) and swear that they can hear the "gap" in quality when going from 7.1 bitstreaming to lossless HD audio. These buyers do NOT pick a $79 video card; they build a custom PC-based digital audio system to work with their bit-perfect setup. If you don't know what bit-perfect is, then don't bother trolling that the 430 is better than the 5570.

The not-so-very-few that own a 3D TV: in 3D playback, since half the resolution and/or half the brightness is lost, image quality becomes even more important. After spending $3k+ on the TV, I fail to see the reasoning in choosing the 430 over the 450 or the 460, which at least can game.

I say that HTPC cards have been a thing of the past since the 785G; HD audio is a much-hyped and rarely used feature, and 3D Blu-ray is a niche.

And the fair competitor for the 430 in 3D Blu-ray is the PS3, hence the need to compare this card's image quality with the PS3's, with camera pictures to silence the trolls.

The POINT is to get the image to the screen 100% accurately, not to enable noise removal (scoring noise removal, "edge enhancement", or any other "improvements" is extremely dubious).

Scoring video resolution after these "improvements" is silly. Some of those deinterlacing tests are also stupid, because the only way to score a perfect 5/5 is to make the drivers detect THAT EXACT scene. An algorithm which is more universal cannot get 5/5. That alone completely negates a lot of the HQV tests.

The point of HQV is to make a suite of tests that ONLY products which have been HQV-certified will pass. A lot of those tests are synthetically fabricated, especially now with Blu-rays.

Interesting point, and I won't argue against your views (because they may be correct).

However, something like cadence detection is independent of the stream. You are either capable of 2:2 pulldown or not; it has nothing to do with a particular stream. It is aspects like this which are problematic with the GT430.

It's a shame NV gave up trying even for mere performance parity. With 8 ROPs (16 would have been great) and/or GDDR5, this might have been a suitable low-end/older-game budget card and possibly a worthy successor to the late, great 9600GT. Adding GDDR5 alone won't fix the lack of ROPs, though, and probably wouldn't make a noticeable difference, unlike the GT 240, where the GDDR5 variants were notably better. Too bad, because a silent, low-power-draw budget card for lower-end/older games is exactly what I'd be interested in, and AMD drivers are too quirky for me.

Yes, that *was* stated, and then followed by 9 pages of gaming benchmarks. If you're going to state that the card isn't meant for gaming, then don't run game tests and fill the article with gaming benchmarks. There are only 2 pages discussing HTPC performance. We all know it's targeted at HTPCs, so expand your reporting on that aspect of the card's abilities.

I guess you skipped over the part where it said that the chip is a step back in terms of graphics-oriented functional units in favor of HTPC features?

It's clear this isn't a gaming card, but the potential for making it a passable gaming card that is actually competitive with the going-on-10-month-old lower-end Radeons is what I'm lamenting. I generally prefer NV cards, but doing worse than their previous product in this segment (saying it's meant to replace the horrid GT 220 is a joke) is just sad. It's all part of the 'more features, no better price/performance' trend everywhere but the high end that's been going on for a year, though, so whatever...

I don't know what's wrong with Mr. Huang, but doesn't he even feel ashamed to bring this c-r-a-p to the market?! The AMD Fusion APU will easily overrun it. There will be no future for nVidia if Huang keeps making such pathetic products. The release of the HD 6800 series is in a few days; the first tests show that the HD 6850 can beat a GTX 460 1GB and the HD 6870 easily overruns the GTX 470 (and probably the incoming 384-SP GF104 GTX 475). Once the good days for the GTX 460 are gone, nVidia will be at a total disadvantage.

I think something very important is being missed in reviews of cards evaluated for HTPC use: the effects of the form factor of a case.

Many HTPCs are in low-profile desktop-style chassis in an effort to visually integrate with the rest of home theater gear. Examples: Antec MicroFusion, nMediaPC HTPC 1080P, Lian Li PC36. These cases cannot accept standard-height cards, and must use low-profile-compatible (half height) cards. Additionally, the low profile of the cases severely inhibits airflow, which renders "open bench" and "inside a cavernous full tower case" thermal and acoustic testing largely irrelevant.

There are also HTPC cases which accept standard-height cards but are only barely tall enough to accommodate those cards, also impairing airflow, and are thus good test targets. Examples: nMediaPC HTPC 1000, Antec Fusion.

Therefore I suggest the following be added for consideration in any card which is ostensibly intended for an HTPC build:

1. Is the card available in low-profile designs?
2. For a given manufacturer's entry, is the low-profile bracket included?
3. Thermal testing and acoustic testing on low-profile cards performed in a low-profile case with all panels in place
4. Thermal testing and acoustic testing on standard-height cards performed in a case that is exactly as tall as a standard PC card, i.e. Antec Fusion, also with all panels in place
5. Please report card length. HTPC and other compact cases often have an issue with this.

Have you seen the PowerColor AX5750? It's a Radeon 5750-based card, 1GB GDDR5, which is low-profile. As far as I know, that's the most oomph you can get in a low-profile card at present. 1080p gaming on a single 5750 would be marginal, near 30fps at moderate settings in most modern games, but if you can deal with that, or don't mind pushing the sliders a bit to the left, it may be doable. It all depends upon your requirements. Sliders slammed to the right and 8x antialiasing? Not gonna happen :)

If you need more, you can always try a CrossFire rig. In a low-profile case that would be thermally brutal, however, and given that the PowerColor AX5750 uses a dual-fan cooler that does not exhaust out the back, two of them in a low-profile case might just be too much.

The good news is that the 5750 doesn't draw much power. Guru3D had a test that showed a 5750 CrossFire rig peaking at 401 watts, and that's with a much beefier CPU/motherboard setup (Core i7-965 OC'd to 3.75GHz) than you would see in a typical HTPC rig. With something gentler, like a standard-clocked Core i5 or Core i3, or a Propus 620e, you probably will never exceed 350 watts in a CrossFire setup, so you can stick to 450-500 watt PSUs in complete confidence for a pair of 5750s. For a single 5750, a 350-400 watt unit (example: Antec's 380-watt PSUs that come with some of their cases) would be just fine.
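A quick sanity check of that arithmetic. Only the 401 W CrossFire peak is from the Guru3D figure quoted above; the 50 W saving for a gentler HTPC CPU and the 25% PSU headroom factor are my own illustrative assumptions:

```python
# Back-of-the-envelope PSU sizing for a 5750 CrossFire HTPC.
# GURU3D_CROSSFIRE_PEAK_W is the figure quoted in the comment; the
# CPU saving and headroom percentage are assumed, not measured.

GURU3D_CROSSFIRE_PEAK_W = 401   # i7/X58 OC'd to 3.75 GHz, two 5750s
HTPC_CPU_SAVINGS_W = 50         # assumed saving from a stock i3/i5/620e

def min_psu(load_w, headroom=0.25):
    """Suggest a PSU rating with ~25% headroom over peak system draw."""
    return load_w * (1 + headroom)

htpc_peak = GURU3D_CROSSFIRE_PEAK_W - HTPC_CPU_SAVINGS_W    # ~351 W
print(f"Estimated CrossFire HTPC peak : {htpc_peak} W")
print(f"Suggested PSU rating          : {min_psu(htpc_peak):.0f} W")
# ~439 W, which lands right in the 450-500 W range the comment suggests.
```

Under these assumptions the estimate agrees with the comment's 450-500 W recommendation for a pair, with the 25% margin covering PSU aging and capacitor derating.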

I'm using the PowerColor 5750 Low Profile in an NSK1480, and that size of case is probably about as small as you'd want to go for gaming before stuff gets too hot and/or noisy. These cases often don't come with the 6- and 8-pin power connectors you often need, and dangling too many off Molex adapters is a bit risky.

However, it does run at good temperatures and quietly, even overclocked from 700/1150 to 865/1265 and after slowing down the fan. It probably helps that I have two Noctuas, two Scythe S-Flexes, and a heavily undervolted Q8200.

As far as case volume is concerned, I think the SFF cases that companies like Shuttle produce are actually smaller. But because those cases are taller and blockier, air has room to circulate over the top of cards; also, big noisy fans are more acceptable in an SFF gaming porta-rig than in an HTPC, where silence is golden.

Also, with LP cases, putting in a card effectively creates two new isolated compartments within the case (the card reaches from the motherboard almost to contact with the top cover, and from the back of the case almost to the front edge of the motherboard along the card's length); air on one side of the card doesn't circulate with air on the other side, and you're in trouble if either of those two new compartments doesn't have a robust source of cool air.

Hi, I guess this card would be the same as the announced nVidia Quadro 600? Guess maybe even my current uber-crappy G84GL-based Quadro 570 would be almost as fast... That's one issue I see with GF108: sure, here they are touting it as an HTPC thing, not for gamers, but it's OK to sell it as a Quadro and charge €200 for it instead? Meh.

Please beware of these cards from Asus. I have a GT 220 which sounds like a helicopter, and according to Asus the fan speed on these cards never speeds up or slows down. This seems to be confirmed in this article by the idle vs. load noise data.

Overall, this card isn't impressive at all... the PROs are there, and AMD does need 3D and physics capabilities.

But at $80, it goes up against the 5670 cards and easily loses.

About HDMI 1.4b... it doesn't really matter. HDMI is dead, faster than it should be, but there is no future in it. CAT6-A/V will start replacing HDMI in 2011. All the big TV players are on board: they don't have to pay HDMI's licensing fees or use its special expensive connectors and cabling.

And HTPCs will not get very popular until the cable companies loosen up about people accessing channels like HBO, SHO, etc. Windows 7 Media Center is nice, but the interface is still rather weak for power users compared to some of the others out there. For example, the program grid is HORRIBLE: others allow 2~4-hour blocks and around 20 channels at a time, none of this 1.5hr / 6-channel junk. Oh, and the DRM in Media Center makes archiving your shows near impossible, like if you have to reinstall the OS or do a system upgrade.

Er... PhysX and 3D have never been about improving performance. They were about adding to the visual experience. Like, Avatar looks great in 2D and 3D... but 3D sucks you in a bit more.

Games like Mirror's Edge become more realistic with PhysX, even though it doesn't improve gameplay one bit.

Those technologies are new, and until PhysX becomes shared/standard on all video cards, it will be more gimmick than standard. But who knows...

Hmmm... back around 1988, when computers were 8~16MHz, pretty much only Macs and Amigas had a native GUI OS; MS had horrible MS-DOS with 8.3 file names, no multitasking, horrible graphics, and forget about sound. Someone from the DOS camp said, "Who needs graphics and sound? Those are for toys. PCs are REAL computers."

Uh huh. And now we have 1000 MHz cell phones with 16GB of RAM.

The 1986-vintage Amiga had graphics, sound, and multitasking... was it a gimmick?

"Performance" was a typo on my part, since I clearly indicated that it was a system hogs. Physx, in most cases-as displayed titles such as Mafia II- contribute little to nothing (in some games) towards graphics. Most players won't even notice such things as enhanced physics or improved decals. In fact, the most noticeable thing displayed in Mafia II was the presence of debris. Players will, however, notice the impressive amount of lag brought on by such features.

3D Vision, as shown in one review, reduced the GPU (a GTX 460 1GB) to unplayable frame rates. It essentially required the player to go SLI. Which brings me to another point... why are you bringing up PhysX or 3D Vision in regard to this product? You seriously think this cheap HTPC card could handle any of the above features, particularly when a 1GB 460 struggles to?

And are we seriously comparing the Amiga to something as insignificant as cheesy video game effects? You can't be serious, particularly when there are other physics engines (Havok being one of the most prominent) doing some of the same things.

However, please tell me how PhysX made Mirror's Edge a more realistic experience, particularly since that game, like Mafia II, only added physics to debris.

I agree with you on the first paragraph. We want ever-better visuals, but without the cost to overall performance.

This was one of the arguments of 3dfx's Voodoo3 vs. the TNT cards: performance with 16-bit graphics vs. NVIDIA's 24-bit color.

When I played the JSF game around 1999-2000, the 16-bit limitation was noticed BIG time on my Voodoo1, but the frame rate murdered the ATI card I had. It was a trade-off. This is a constant battle with our GPUs... remember when AA was added? Even today, AA affects the performance of every single video card - but unlike 8 years ago, it no longer renders most cards useless.

Yeah, 3D Vision & PhysX are useless on the GT 430... pretty much like ATI's Eyefinity tech doesn't belong on every ATI card (cut the cost by $10, improve airflow) - especially at the low end - though it's very handy for business users.

You said: "And are we seriously comparing the Amiga to such an insignificant thing as cheesy video game effects?"

Yes, in that PhysX and 3D tech are still baby tech. In a few years we'll start seeing 3D TVs that don't require glasses. PhysX or Havok or something else becomes more standard - or perhaps MS adds it to DX12. It's going to be years before we see the results of the latest technology. Just like the PC folks of the '80s who said the Amiga was a toy and computers didn't need graphics and sound. And yes, my Amigas still work.

"please tell me how Physx made Mirror's Edge a more realistic experience." Look up the various side-by side videos. It adds cloth effects, broken glass and yes, debris. A side by same example: http://www.youtube.com/watch?v=w0xRJt8rcmY and check out batman too.

Of course, that didn't help to actually POPULATE the city of Mirror's Edge with people... funny, a huge modern city with only a few people and police, and with all that construction - where are the workers? Another example: a burger that is just meat and bread is bland... but add some tomatoes, lettuce, and cheese and it becomes a better meal.

Why no comparison to integrated Intel Clarkdale? Many of us with HTPCs went with that since we're not gamers. I've been really happy with it. Maybe once per Blu-ray viewing I'll get a stutter - not sure if it's because I'm underpowered or what. It would be cool to see what more I'd get for $100.

The other consumer-grade, low-cost Blu-ray player is the PS3. It would be nice to have baseline HQV 2.0 numbers for the latest PS3 firmware, so that readers could get a measure of how much picture quality, if any, an HTPC has over a PS3.

As for the 3D HTPC card, I simply do not see the owner of a 3D TV using such a low-end card. The minimum, budget-wise, is a 768MB 460.

Now NVIDIA should work on a passive card that can beat the 5750 on picture quality; until then, it's a PS3 for 3D Blu-ray and a 5750 for HTPC.

Some of us in here like to see how new and old NVIDIA cards do as dedicated PhysX cards. There are some tests out there like FluidMark. And I know this is no problem for experts like you @AnandTech :)

"unlike all of NVIDIA’s other desktop launches which had GPUs with disabled functional units, the GT 430 uses a fully enabled GF108 GPU. For once with Fermi, we’ll be able to look at the complete capabilities of the GPU."

I believe the GTS 450 incorporated a fully activated GF106 with 192 SPs, did it not?

How can any card that can't do decent gaming at 1920x1080 be called an HTPC card? What a douchebag title. HTPC video cards have to be:

A) Able to do video decoding.

B) Able to game at the de facto standard of 1920x1080.

C) Able to do it quietly.

NO HTPC card should even be considered unless it can do these. Yes, you can get by without gaming, but since an HTPC can do gaming, and console-type gaming is regularly seen on the main TV at home, I think it should be a required qualifier for a decent HTPC card.