
GT 220 is competitive enough. Even on Windows, ATI sucks badly in the driver department; Nvidia's DXVA decoding support and compatibility are far superior to ATI's. There are plenty of H.264 encodes that ATI's shitty UVD/UVD2 cannot decode but that Nvidia's first-generation VP2 can handle with the latest drivers.

Proof? I haven't had any problems... The best thing the 220 has going for it on Windows is the brand name - lots of people will buy it just because it says NVIDIA on the box, without looking at what they're actually getting.

Like I said, I think NVIDIA can make a good case for these cards on Linux. For the same price you've got 3 distinct options:

1. ATI with horrible/buggy 3D performance + OSS drivers
2. ATI with (relatively) good performance from binary drivers that are still incomplete/buggy
3. NV with mediocre performance but with features like VDPAU in solid binary drivers.
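As a sketch of how you might check for that VDPAU support on a given Linux box (this just probes for the library and the vdpauinfo utility; finding them doesn't prove a particular codec profile is actually accelerated - that still depends on the GPU and driver):

```python
# Hedged sketch: look for the VDPAU runtime pieces on this machine.
# Presence only means VDPAU *could* work; actual decode profiles
# (H.264, VC-1, ...) depend on the GPU and the installed driver.
import ctypes.util
import shutil

def vdpau_hints():
    return {
        "libvdpau": ctypes.util.find_library("vdpau"),  # None if absent
        "vdpauinfo": shutil.which("vdpauinfo"),         # None if absent
    }

if __name__ == "__main__":
    for name, path in vdpau_hints().items():
        print(f"{name}: {path or 'not found'}")
```

Running `vdpauinfo` itself (when present) lists the decoder profiles the driver actually exposes.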

I suspect NVIDIA would probably win the majority of the market share based on the 3 options above. The problem is that on Windows the drivers are fairly even, and NV just can't match ATI's hardware.

So why do you still use Nvidia then? Since you want to be an ATI fanboi so much, go and buy their DX11 card now. I'm sure you'll enjoy using their Linux drivers.

The failing GPUs are only the old 80nm parts like the 8600 (G84) and 8400 (G86), which are long discontinued, and you believe what the biggest ATI asskisser Charlie writes? LOLOL, dumber than dumb indeed you are. Most of Charlie's BS lies were debunked on the Beyond3D forums.

Loss of the chipset market is irrelevant; Nvidia is replacing that with revenue from the Tegra SoC, which will be far better. Intel is well known to be greedy, wanting to have its chipset-market cake and eat it too. VIA should know that; they got driven out by Intel's greed.

Oh, I know about Charlie. Like anyone, his word should be taken with a grain of salt, but that doesn't change the fact that even he is right on the money sometimes. As for Beyond3D, I wouldn't know; they aren't one of my usual haunts. I prefer HardOCP, Guru3D, MadShrimps, Jonny Guru, Xtreme Systems and OCF for my tech info.

Loss of the chipset market is by no means irrelevant, especially since it leaves you with only a delayed series of GPUs and the hope that the next-gen Gameboy doesn't flop. Not everything Nintendo touches is gold: remember that the N64 did nowhere near what was expected, and that the Virtual Boy is the bastard stepchild they keep locked under the basement stairs. Let's also not forget that the 800 lb gorilla in the room, Intel, is looking to take the GPU market by releasing their own GPU for the low to mid range. I wouldn't expect Larrabee to take the high end until at least its 3rd generation, but Intel having a GPU in the market that is anything better than an IGP means they will be able to further leverage themselves until they force everyone else out.

VIA was driven out by failing at just about everything they did since they made S3TC. Though that's not to say that, with a 3rd-party chipset and GPU, a scaled-up Nano couldn't get them back into the low-end desktop market.

Who says I'm not planning on going with ATI/AMD on my next build? I'm looking for a buyer for this box first.

Originally Posted by GT220

Since you want to be an ATI fanboi so much, go and buy their DX11 card now. I'm sure you'll enjoy using their Linux drivers.

LOLOL, dumber than dumb indeed you are.

So... does this mean I win by default or something? Oh, and before I forget: http://wiki.x.org/wiki/RadeonFeature - just waiting on the prerequisites to fall into place, and the driver should come together quite quickly.

This card has 48 cores??? Shouldn't it be getting higher benchmarks?
Maybe the drivers or benchmarking software are not taking advantage of this? Not a bad price for a new low end card.
I have a GTX 285 now and I'm pretty sure I'm not using both cores in Jaunty Jackalope with the 185.18.36 driver.
Can't wait to see what the Nvidia open source drivers can do!

Nvidia is starting to look worryingly similar to 3dfx when they tried to release the Voodoo 6000 (albeit vastly larger and more efficient). I'm sure some people are rubbing their hands with glee now - Nvidia acquired a lot of bad karma and ill-will when they bought 3dfx...

Originally Posted by smitty3268

Originally Posted by GT220

GT 220 is competitive enough. Even on Windows, ATI sucks badly in the driver department; Nvidia's DXVA decoding support and compatibility are far superior to ATI's. There are plenty of H.264 encodes that ATI's shitty UVD/UVD2 cannot decode but that Nvidia's first-generation VP2 can handle with the latest drivers.

Proof? I haven't had any problems... The best thing the 220 has going for it on Windows is the brand name - lots of people will buy it just because it says NVIDIA on the box, without looking at what they're actually getting.

Come on, why are you even paying attention to this shill? He registered just to advertise the GT220; you won't find any truth in his words that's not twisted to hell and back.

My personal experience is that NV drivers were beyond shitty on Vista for over a year after its release (Ati drivers worked pretty much fine since Vista RC1). Even now, multi-monitor support is a crapshoot with Nvidia's drivers forgetting or setting the wrong resolution time after time (Ati's drivers work correctly in the same setup).

Originally Posted by cliff

This card has 48 cores??? Shouldn't it be getting higher benchmarks?

Simply put, no. That's why you shouldn't listen to anything that comes out of a shill's mouth without a huge barrel of salt (1024 threads, oh wow!).

This is a low-end card with low-end performance. For comparison, Ati's flagship (the 5870) has 1600 cores, each one more capable than this card's paltry 48. Of course, even that doesn't tell the whole story: the 5870 uses GDDR5 memory (vs GDDR2 or GDDR3 for this card) and its architecture is vastly different, both more efficient and more capable (shader model 5 vs 4.1).
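The raw-throughput gap behind that comparison can be sketched with back-of-the-envelope arithmetic. The reference shader clocks used here (roughly 1.36 GHz for the GT 220, 850 MHz for the 5870) and the figure of 2 FLOPs per core per clock (one multiply-add) are assumptions, and "cores" aren't directly comparable across vendors, so treat the result as an order-of-magnitude illustration:

```python
# Rough theoretical peak shader throughput, in GFLOPS.
# Assumes reference clocks and one multiply-add (2 FLOPs) per core per clock.
def peak_gflops(cores, shader_clock_ghz, flops_per_clock=2):
    return cores * shader_clock_ghz * flops_per_clock

gt220 = peak_gflops(48, 1.36)     # ~131 GFLOPS
hd5870 = peak_gflops(1600, 0.85)  # ~2720 GFLOPS

print(f"GT 220:  {gt220:.0f} GFLOPS")
print(f"HD 5870: {hd5870:.0f} GFLOPS")
print(f"Ratio:   {hd5870 / gt220:.0f}x")
```

Roughly a 20x gap on paper, before architecture, memory bandwidth, and driver quality even enter the picture.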

As I said, you won't find any truth in a shill's words that's not twisted beyond any recognition.

The bottom line is that this card is a marginal improvement over Nvidia's previous low-end hardware. It's mostly an attempt to increase margins for Nvidia and tide them over to the GT3xx release. There's no point in upgrading if you are already using a 8600/9500/9600 and there's little point in putting this card into an HTPC as long as it has a fan.

I'd only recommend this card if (a) you can find it fanless and (b) you want to build a low-powered Linux box. In all other cases, there are better choices than the GT220:
- Nvidia's 9500 for a fanless, low-power Linux box.
- Ati's 4670 for a fanless, low-power Windows box that can also play some games.
- Ati's 5xx0 series for more serious gaming.

GT 220 is competitive enough. Even on Windows, ATI sucks badly in the driver department; Nvidia's DXVA decoding support and compatibility are far superior to ATI's. There are plenty of H.264 encodes that ATI's shitty UVD/UVD2 cannot decode but that Nvidia's first-generation VP2 can handle with the latest drivers.

You are a cute fanboy. On my system I have no problems with DXVA on ATI for H.264 and VC-1. Or is it possible that I'm doing something wrong, because it works?