44 Comments

Many readers of these tech sites want to know the full capabilities of the cards, yet, sadly, reviewers at AnandTech and every other tech site ignore the video capabilities of video cards. Even in the reviews of the new 6600 AGP, no reviewer has tested the video side, despite the problems with the 6800. Never mind the fact that EVERY review of these cards is about the 3D aspect and is nearly identical: run Halo, Doom 3, HL2, etc., and list the performance. Yet no tests of DVD playback or other video features are conducted, which does a HUGE disservice to readers.

Surprisingly, my 865G with Intel Extreme Graphics 2 can run the Doom 3 beta at default settings. It still crashes, but when it does run I get barely playable frame rates: around 20 fps at best and less than 10 at worst. I'd think the GMA 900 should be much better, but maybe its DX9 support really is that weak.

Don't two cards cost more than one?
And what's the difference between two 6600 GTs and one 6800 GT, in price and in performance?

I think this kind of "edge" could come in the future, like it did with the Voodoo2: the card was getting old, people were getting rid of it, and some picked them up cheap just to keep their PCs going as long as they could.

I'm sure the core of the 6600 will overclock very well, but the memory all depends on the particular chips used and might not have any real headroom. That could be its main problem: it's an 8-pipe, 300MHz core, so there's plenty of power there, but only a 128-bit, 500MHz (effective) memory bus, which is probably what's holding it back. If that's the case, overclocking the core may not help very much.

It's a pity no overclocking was attempted in the review, but then again, overclocking results from cards sent out by the manufacturer are always suspect, as the manufacturer could have hand-picked the best samples.
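For what it's worth, the comment's core-vs-memory argument is easy to put in numbers. Here is a quick back-of-the-envelope sketch using the figures quoted above (8 pipes at 300MHz, 128-bit bus at 500MHz effective); the specs are taken from the comment, not verified:

```python
# Back-of-the-envelope throughput figures for the plain 6600,
# using the specs quoted in the comment above (treat as assumptions).

def fillrate_mpix(pipes, core_mhz):
    """Peak pixel fillrate in megapixels/s: pipelines x core clock."""
    return pipes * core_mhz

def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes x effective clock."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(fillrate_mpix(8, 300))    # 2400 Mpix/s of fillrate...
print(bandwidth_gbs(128, 500))  # ...fed by only 8.0 GB/s of bandwidth
```

With 2400 Mpix/s of fillrate against 8 GB/s of bandwidth, the card can only sustain a few bytes of memory traffic per pixel, which is why raising the core clock alone may not help much.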

In my opinion ATI beats all nVidia cards except at the $200 price point, where the 6600GT wins. But we can't forget that the 6600 has great overclocking potential, and street prices should be lower than the X700's because of the slower memory.
As already mentioned, you can find the 6600 for $135 already.

I also wish to thank you for keeping up the fight to unravel the mystery behind the video processor. That feature really got me excited when I first heard about it, yet site after site reviewed these cards without even touching on the subject.

I'm assuming the 6200 you tested was a 128-bit version? You don't seem to mention it at all in the review, but I doubt nVidia would send you a 64-bit model unless they wanted to do badly in the benchmarks :)

I don't think the X700 has appeared in an AT review before, only the X700 XT. Did you underclock your XT, or have you got hold of a standard X700? I trust those X700 results aren't from an X700 XT at full speed! :)

As #11 and #12 mentioned, with the exception of Doom 3, the X600 Pro is faster than the 6200:

So the X600 Pro is slower than the 6200 (128-bit) in Doom 3 by a significant amount, but it's marginally faster in three games, and it's significantly faster than the 6200 in the other three games as well as the HL2 Stress Test. That makes the X600 Pro the better card.

The X700 absolutely thrashed even the 6600, let alone the 6200, in every game except, of course, Doom 3, where the 6600 was faster, and Halo, where the X700 beat the 6600 by only a small margin.

Given the prices of the ATI cards (X300 SE at $75, X300 at $100, X600 Pro at $130, X700 at a $149 MSRP), the 6600 is going to have to be priced under its $149 MSRP because of the far superior X700 at the same price point. Let's say a maximum of $130 for the 6600.

If that's the case, I can't see how the 6200 could have a street price of $149 (128-bit) and $129 (64-bit). How can the 128-bit 6200 even share a price with the faster 6600? It's also outperformed by the $130 X600 Pro, which makes a $149 price ridiculous. I think the 6200 will have to be priced more like the X300 and X300 SE, at $100 and $75 for the 128-bit and 64-bit versions respectively, if it is to be successful.

Maybe most 6200s will end up being cheap 64-bit cards sold to people who aren't really bothered about gaming, or who mistakenly believe the amount of memory is the most important factor. You just have to look at how many 64-bit FX 5200s are sold.

The P.T. Barnum theory, wilburpan. There's a sucker born every minute, and if they're willing to drop $60 on the 64-bit version of a card when they could have had the 128-bit version, so much the better for profits. The FX 5200 continues to be one of the best-selling AGP cards on the market, despite the fact that it's worse than a Ti 4200 at playing games, let alone DX9 games.

"The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, they are not going to be distinguishing cards equipped with either a 64-bit or 128-bit memory configuration."

This really bothers me a lot. If there are two versions of this card, I definitely want to know which one I'm buying.

I think this game likes the number of vertex processors on the X700, plus its advantage in fillrate and memory bandwidth. Could you please test The Sims 2 on the high-end cards from both vendors when you get a chance? :P

xsliver: I think it's because ATi has generally cared more about optimizing for DirectX, and only more recently about optimizing per API. OpenGL was never really big on ATi's list of supported APIs... However, factor in Doom 3 and the requirement of OpenGL on non-Windows systems, and OpenGL is at least as important to ATi now as DirectX. How long it will take to convert that priority into performance is unknown.

Also, keep this in mind: nVidia specifically built the GeForce architecture from the ground up to power John Carmack's 3D dream. nVidia has specifically stated that they design their cards based on what Carmack says. Whether that is right or wrong I will leave up to you to decide, but it goes a long way toward explaining the disparity between id games and other games, even under OpenGL.

What do you mean, "who has the right games"? If you want to play Doom 3, look at the Doom 3 graphs. If you want to play Far Cry, look at the Far Cry graphs. If you want to play CoH, Madden, or Thunder 04, look at HardOCP's graphs. Every game handles the cards differently. I really don't see anything wrong with AnandTech's current group of test programs.

And Newegg now has five 6600 non-GTs in stock, ranging in price from $148 to $175. But remember that it takes time to test and review these cards. When Anand went to get a 6600, it's very likely that was the only card he could find. I know I couldn't find one at all a week ago.

Furthermore, the games you pick for a review make a large difference to the conclusion. Because of that, HardOCP has the 6200 outperforming the X600 by a small margin. So I would like to know who has the right games.

And #2:
The X700/X800 is similar enough to the 9800 to compare them on pipelines and clock speeds. Based on that, the X700 should perform about the same.
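The pipelines-times-clock comparison this comment describes can be sketched in a few lines. The pipe counts and core clocks below are nominal, from-memory figures and should be treated as assumptions, not verified specs:

```python
# Rough pipelines-x-clock comparison of the 9800 Pro and the plain X700,
# as suggested in the comment above. Clocks and pipe counts are assumed
# nominal values, not confirmed specifications.

cards = {
    "9800 Pro": {"pipes": 8, "core_mhz": 380},
    "X700":     {"pipes": 8, "core_mhz": 400},
}

for name, spec in cards.items():
    fillrate = spec["pipes"] * spec["core_mhz"]  # peak fillrate, Mpix/s
    print(f"{name}: {fillrate} Mpix/s")
```

On fillrate alone the two come out within a few percent of each other, which supports the "about the same" guess. Note the sketch ignores memory bandwidth, where the 9800 Pro's 256-bit bus has an advantage over the X700's 128-bit bus.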

1) The X300 was omitted from the Video Stress Test benchmark because CS: Source was released before we could finish testing the X300, so we no longer had access to the beta. We will run the cards on the final version of CS: Source in future reviews.

2) I apologize for the confusing conclusion; that statement was meant to follow the preceding line about the X300. I've made the appropriate changes.

3) No problem regarding the Video Processor; I've literally been asking about this thing every week since May. I will get the full story one way or another.

4) I am working on answering some of your questions about comparing other cards to what we've seen here. Don't worry, the comparisons are coming...

Here's my question: which is better, a GeForce 6800 or a GeForce 6600 GT? I wish there were a GeForce round-up somewhere. And I saw some benchmarks showing that SLI does indeed work, but those only used 3DMark. Does anyone know if there are any actual game tests of SLI out yet?

Also, to address another issue some of you have brought up: this new line of cards beats the 9800 Pro by a huge amount, but it's not worth the upgrade. Stick with what you have until it no longer works, and right now a 9800 Pro works just fine. Of course, if you do need a new graphics card, the 6600 GT seems the way to go, if you can find someone who sells them.

Oh, and to address the pricing: nVidia only offers suggested retail prices. Vendors can raise the price on new parts so that they can still sell the inventory they have of older cards. In the next couple of months we should see these new graphics cards drop to MSRP.

Thanks for the tidbit on the 6800's PVP. I'd like to see AnandTech take on a video card round-up aimed at video processing and what these cards are actually capable of. It would fit in nicely with the media software/hardware Andrew's been looking at, and it would let users know what to actually expect from their hardware.

Anand, thanks so much for updating us on the PVP feature in the NV40. I think it's high time somebody held nVidia accountable for a "broken" feature. Do you know if the PVP is working in the PCI Express version (NV45)? Any information you can get would be great. Thanks, Anand!

That's an odd conclusion... "In most cases, the GeForce 6200 does significantly outperform the X300 and X600 Pro, its target competitors from ATI."
But looking at the results, the X600 Pro is _faster_ in five of the eight benchmarks (sometimes significantly), ties in two, and is slower in only one (Doom 3, by a significant margin). Not to disregard Doom 3, but if you base your conclusion entirely on that one game, why bother with the other titles?
I just can't see how that alone justifies "...overall, the 6200 takes the crown".

There are some other odd comments as well, for instance on Star Wars Battlefront performance: "The X300SE is basically too slow to play this game. There's nothing more to it. The X300 doesn't make it much better either." Compared to the 6200, which gets "An OK performer;..." but is actually (very slightly) slower than the X300?

Thanks Anand! I've been going on about the PVP problems with the NV40 for months now, and have become increasingly frustrated with the lack of information and/or progress from nV. Now that a major site is pursuing this with vigor, I can at least take comfort in the knowledge that answers will be forthcoming one way or another!

Again, thanks for making this issue a priority and emphatically stating that you will get more information for us. It's nV vs. Anand, so "Rumble, young man! Rumble!" :-)

If you ask me, all these low-end cards make no sense if you have a PCIe motherboard. Who would get one of these crappy cards after spending all that money on a brand-new PCIe computer? These cards would be perfect for AGP, since AGP is now starting to become the lower-end platform.

OT: I wonder about the outcome for us 6800 owners and the VP... nVidia shouted about this new feature, and I bought the card. Will this end in a class action, or perhaps some kind of voucher for people who bought the 6800 specifically for this highly touted feature?