
262 Comments

I believe a few of these benchmarks are misleading. The Flight Simulator 2004 results are a perfect example: it's obvious that the frame rates are capped at the refresh rate of the selected resolution, which differs between cards (e.g. 125 Hz, 75 Hz, and 60 Hz). I would suggest fixing your benchmarking method, or removing the games with invalid results from your suite, since they misrepresent the NVIDIA card's true performance. I would have thought the stepped frame rates on cards of WIDELY varying performance would have clued you guys in on the problem, but I guess not.
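For what it's worth, the "stepped" rates the poster describes are exactly what vsync produces: a card that can't finish a frame within one refresh interval waits for the next one, so the delivered rate snaps to refresh/n. A minimal sketch of that quantization (function name and figures are mine, purely illustrative):

```python
# Hypothetical sketch: with vsync on, each frame is held until the next
# display refresh, so the delivered frame rate snaps to refresh_hz / n
# for some integer n >= 1.
import math

def vsync_fps(raw_fps: float, refresh_hz: float) -> float:
    """Frame rate actually displayed when vsync is enabled."""
    if raw_fps >= refresh_hz:
        return refresh_hz
    # Each frame occupies a whole number of refresh intervals.
    intervals = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / intervals

# A card rendering 90 fps on a 60 Hz display is held to 60 fps;
# one rendering 55 fps drops all the way to 30 fps (2 intervals/frame).
print(vsync_fps(90, 60))   # 60.0
print(vsync_fps(55, 60))   # 30.0
print(vsync_fps(25, 60))   # 20.0
```

This is why cards of very different raw speed can report identical "stepped" numbers when the benchmark forgets to disable vsync.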

Anand has refused to give ATi the credit they deserve. In every bench where ATi prevailed, they managed to find some fault with the benchmark rather than find fault with nVidia's cards/drivers. Those 5 "marginal" benches belong to ATi... but Anand can't very well award them or it looks too one-sided. The summing up recommends ATi, but not because the cards are better; apparently it's because you can't predict what the future holds (???). It's time Anand got its hand out of nVidia's pocket and told it like it really is... the current crop of nVidia FX cards suck as much as they blow. They can't produce the goods without cheating and favourable reviewers. To those who would choose nVidia over any other manufacturer no matter what they produce: you just got screwed over.

The 9800 XT is already on my upgrade list. Thank you, AnandTech, once again for helping me make an informed decision. One game I would like to see is Papy's NASCAR Racing 2003, as it can be a real bear to run.

AnandTech has published an extremely extensive article on the performance and image quality of the current high-end graphics cards, the Radeon 9800 XT and GeForce FX 5950 Ultra (NV38). Besides benchmarks of 18 games, the image quality tests made with each of those games are well worth mentioning. AnandTech uses Catalyst 3.7 on the ATi side and Detonator 52.14 on the nVidia side to compare image quality. In contrast to the findings of our latest driver comparison, AnandTech didn't notice any general image quality differences between Detonator 52.14 and 45.23, and therefore praises the new driver rather effusively.

This does not necessarily contradict our own findings, however. nVidia's "optimizations" of the anisotropic filter on texture stages 1 through 7 in Control Panel mode (only a 2x anisotropic filter is used, regardless of any higher setting) can only be found by searching for them specifically; besides, most of AnandTech's image quality comparisons were made without the anisotropic filter, so it's impossible to spot any differences in those pictures. The general forcing of the trilinear filter into a pseudo-trilinear filter by Detonator 52.14 is likewise nearly impossible to see in still images of real games, because the trilinear filter exists almost solely to prevent the MIP banding that is visible in motion.

Thus it can be stated that the identified "optimizations" of Detonator 52.14 won't be recognized from screenshots unless you look for them explicitly (why AnandTech awards driver 52.14 finer filter quality than driver 51.75 is a mystery to us, since the only difference between them is a correctly working Application mode in 52.14). nVidia's "optimizations" are, in other words, not really visible.

Has anyone else noticed that the NVIDIA 5600 Ultra gets whooped by the ATI 9600 Pro in this article? This is in sharp contrast to AnandTech's last review of the two cards here ==> www.anandtech.com/video/showdoc.html?i=1821. What's going on here??

I have a problem viewing the test results the way they are now posted. Would it be possible to present the graphs the way they used to be, i.e., without Flash? I access the site through a public network, and Flash movies and similar things don't load, so I haven't been able to view the graphs for the past few articles now. Thanks.

I am only interested in 1600x1200 or higher with AA and AF turned on. I could not imagine spending >$500 and using 1024x768, etc. Perhaps AnandTech could include benchmarks at the high end of resolutions with AA and AF, especially considering they are reviewing the "best" or "most expensive" graphics cards available to the consumer.

What surprises me is that NVIDIA let AnandTech use and benchmark a card that hasn't even been announced yet. I haven't seen any reviews or previews of this card anywhere, and it's not even listed on NVIDIA's site! It's a bad time to own an NVIDIA card, so I guess I'll get rid of mine real soon.

Nah, Bigshit under whatever handle is an idiot. Most of the rest are just naive. I think it's fair to criticise this review in several areas - e.g. the constant use of 1024x768 - but give 'em a chance to get it right before you start the accusations.

That was a pathetic review: there were way too many variables, and the fact that Anand stated there were no valid premises to reach a conclusion should have been taken to heart before he decided to publish such a POS as this.

This info is simply unofficial, as DX doesn't want to stir up the industry more than has already been done. As some might recall, 3dfx was given the same ultimatum back in '99, yet the news wasn't even released until 2 years later.

So by all means, do not download the Detonator 50 drivers!!! Along with this, NV has been caught cheating on benchmarks, as they usually do, over at AnandTech. Notice that all of the real-world benchmarks perform better on ATi, yet all synthetic benchmarks perform better by a large margin on NV hardware. "These violations are inexcusable," said a DX employee, and I'd have to agree. So without the inside track on DX10, NV will not be able to optimize their cards as ATi can, and will probably fall into bankruptcy just as 3dfx did before them...

NVIDIA out of DX10? Discuss. There's an interesting link on Gearbox Software's forums that claims NVIDIA has been shunned by Microsoft's DirectX team for future versions of the API - Thanks SidiasX!

Nvidia's NV38 (along with the rest of the FX series) has been dubbed a substandard card by team DX. This means that DX will not include NV in its development range for DirectX 10. Team DX made the decision "as a favor to the graphics industry". Team DX claims that NV violated their partnership agreement by changing the DX9 code with their latest set of drivers, as caught by X-bit labs recently. This violates the licensing agreement and compromises DX's quality in order to make it seem as if ATi and NV cards alike display the same image quality (which would be really bad in this case). This can only be fixed by reinstalling DX9b.

Sources among Taiwanese mainboard makers state that, due to some major issues with Intel's strained-silicon 90nm fabrication technology, commercial availability of Prescott processors is expected only in the first quarter of next year. In December 2003, Intel is very likely to paper-launch its Prescott processors and supply only a handful of such chips to selected solution providers for gaming systems, just like AMD did with its Athlon XP 2800+ processor last year, sources claim.

Anand has benched a chip (Prescott) that will only come out at least 3 months from now???

I play games in 1024x768... with AA and AF turned up, that is (which makes the testing perfect for me). When you show me a card that runs 1600x1200 with everything on and gets good framerates in NEW games, then we'll talk.

#19, when next year arrives there will be better cards using the new GDDR memory, which is supposedly 2X faster than what you have now. The NV4X is a new design, and the R4XX series is 2X faster than the R3XX series. So it's not worth it as a future investment like you make it sound. Everything is outdated in 6 months when you go for the latest and greatest in computers.

I just called ATI after reading your post and they told me the same thing. The first person I talked to said they never had stock on the item, and the person I'm talking to now is checking for me. That's what happens when you give Canadians your money, I guess.

ATI has the worst customer service I have ever experienced. I ordered their 9800 XT off the website, which at the time said it was in stock. After 2 days of crap they told me that demand had been overwhelming and the order was backordered. This is after the website said it was in stock and I placed the order the day it came out. Then they explained their *crap* by saying the website said a limited amount in stock and that the site only updates the next day. What a bunch of crap. They were very rude on the phone when I questioned them about it, and especially when I asked if they could please cancel my order. Their response was "why would you want to, you can't get this anywhere else"... very disappointed...

I suggest you include IL-2: Forgotten Battles (www.il2sturmovik.com) and Lock On: Modern Air Combat (www.lo-mac.com) in your benchmark suite. They will both stress all the newest hardware to the max, and IL-2 at the highest detail level in particular will stress every part of the system. Furthermore, people interested in playing these games will get valuable information from your site; it's a lot more meaningful to say that system X runs LOMAC at 20 fps instead of 15 for system Y than it is to say that Quake 3 runs at 400 fps instead of 500.

Good article and a nice new testing suite. But look into adding SOE's PlanetSide to the mix; that game eats anything less than a 5600 for lunch, running at no more than 20 fps. My heavily overclocked 5600 (350/550) never gets over 70 or so.

I'd like to see Nascar Racing 2003 tested, rather than F1Challenge. Since F1C is CPU limited, it makes the results rather useless for GPU testing.

As #159 notes, starting from the back of a full-field AI race will definitely show what your hardware is capable of doing. But the AI calculations may eat up a lot of CPU cycles. (FWIW, NR2003 is multithreaded and MP-aware, so this scenario might make for a good CPU/system test.)

However, one could create a _replay_ of a full-field race. The replay is then repeatable on any system. And, although I haven't tested this, I imagine the replay might be more GPU-intensive since there's less real-time AI and physics processing happening.

I would've liked to see a Radeon 9500 and/or an OCSystems 9500 Radeon in the mix with these benchmarks. Even though the 9500 is discontinued, it still performs on par with a 9700 card. Overclock it or flash a new BIOS onto it and it might perform right up with these cards; for a third of the price, too.

I'd like to see my testicles running 2xAA. I have no doubt in my mind that ATi is even better than Nvidia's best! I've noticed the lag on my genitals when playing multiplayer games, such as BF1952 & TF2. I personally cannot wait for DX9. How about that HL2 source code leak? I have a good feeling about it. :-)

Maybe there should be another video card manufacturer like ATI, since NVidia is sinking, so the competition would be higher and new monster video cards would be developed for the good of gamers =) just like me

These scores are worthless. Clearly even the FS2004 tests were not set up properly, as FS makes adjustments according to the hardware it detects, and those settings have to be changed manually. You can also add that the NV cards are AAing the whole scene, as compared to the ATI cards, which do not AA the clouds, trees and other alpha textures.

Why no AA was used is also beyond logic; what are we, back in 1994? ROTFLOL!!!

#164, I'm not so sure about that. From what I've read about it, and seeing the screenshots, the 64-bit mode of Far Cry is probably real. The only question is whether it's all it's hyped up to be, but considering the screenshots and movies I've seen, it would be interesting even if the 64-bit mode turned out to be nothing special (not that I'd expect that, though).

Evan suggested I repost this here. I didn't do so originally b/c I doubted you'd slog through 180 other comments, but on the off chance you do... ;)

I'm curious why 9600P results were left out of the Homeworld 2 benches. I also thought ATi's framerate problems with NWN were known, because the devs coded the engine around nVidia cards (though I also know ATi seems to be working to fix this).

I wouldn't mind seeing Medieval: Total War in the test group. Maybe use a huge battle and replay it on all the different cards. Thanks for including Command & Conquer; it helped me pick my card.

Quote: The picture quality in UT2003 has been the reason for some of the criticism against nVidia recently. nVidia has chosen to lower the picture quality in this game where anisotropic filtering is concerned. nVidia's lower picture quality is most evident in the 51.75 driver when AF is activated in the control panel.

Quote: The major difference in picture quality in WC3 is the anti-aliasing. For some reason, it seems nVidia's cards have problems with horizontal edges in this game above anything else. See, e.g., the large log in the fire; it looks essentially un-anti-aliased.

Quote: It may be getting rather boring to hear (read), but we cannot lie; ATi's high-quality FSAA wins over nVidia's once more. As we can see, Detonator 51.75 is totally out of the game, since AF seems to be disabled.

Quote: Once more ATi takes the lead; the 9800 Pro holds up this time too. Detonator 51.75 lowers (once more) the quality of texture filtering.

Quote: Radeon kicks some ass in BF1942, nVidia doesn't stand a chance. Once again, Detonator 51.75 lowers the texture quality so much that we don't find the results comparable.

Quote: ATi rips nVidia into pieces in Tomb Raider. If this is an indication of upcoming DirectX 9 performance (and it seems like that, looking at the Half-Life 2 tests), nVidia won't have a lot to say for the coming 6 months. Detonator 51.75 increases performance, but also causes some strange bugs.

nVidia's present drivers don't allow floating-point precision for render targets etc., resulting in lower quality. Detonator 51.75 also makes the Depth of Field effect run amok: the main character is repeatedly blurred out, despite the fact that the effect is supposed to blur things the farther away they are. As expected, AA works better on ATi's card. Anisotropic filtering also looks better on ATi's card, since this kind of filtering causes "texture aliasing" (floating pixels) on nVidia's cards.

Subjective analysis: There's no question about it, nVidia's card isn't even close to being playable. Despite that, ironically, Tomb Raider is part of nVidia's so called "The Way It's Meant To Be Played" program. I noticed no differences between XT and Pro when playing the game.

Quote: It's a bit strange to see that ATi's image quality and performance increases when we apply AF through the application instead of from the control panel. On the mountain in the very middle of the picture it's clear that the image quality once again is better with ATi's card.

Subjective analysis: I am very confused by the test results from Jedi Academy. The strange thing is that the game runs a lot better (and looks better) on the ATi card, but in the performance tests it seems as if nVidia beats the 9800 Pro, which doesn't reflect the impression we got when actually playing the game.

Quote: If we take a look at the 3DMark03 performance with Detonator 51.75, we can see that it has received a good increase. However, we strongly suspect that these optimizations are more or less exclusively application-specific, and of a nature not applicable to a real game. Therefore we have chosen to give 51.75 a gray bar, because we don't believe it gives an accurate result.

With Detonator 44.03, which is "approved" by Futuremark, you can see that nVidia really doesn't have a chance against ATi's cards. As said, nVidia calls it misleading, but all of the DX9 games so far show that Futuremark actually succeeded in creating a pretty good prophecy with 3DMark03.

If we take a closer look at the even more synthetic Pixel Shader 2.0-test we see that the FX-card's weak spot is DX9.

Quote: We got very weird results in AquaMark3. nVidia's Detonator 44.03 shows very low performance, but the driver generates completely acceptable output. Detonator 51.75, on the other hand, has noticeable losses of picture quality in a couple of areas: mist/smoke has been heavily reduced in at least one scene, texture filtering is a lot worse, some things look like they are rendered with lower precision, and finally many lighting effects are a little suspect or missing entirely. Under these circumstances we can only disregard the Detonator 50 results, and under those circumstances ATi has an unbelievable lead over nVidia.

If it is a Prescott, it isn't any better than the current Northwood P4. I base this on the Athlon 64 article: Anand used a Radeon 9800 Pro 256 MB in it, and the only benchmark I see in both is the UT2003 Flyby. For both articles he used Catalyst 3.7. So taking the numbers:

I'm excited to get my 9800 XT. I got a 4200 in Sept 2002 because UT2k3 and BF1942 ran like crap on my first-gen GeForce 3. It only cost me $178, and I modded it for better cooling and ran it at near-4400 speeds. Now I'm quite tired of the slowdowns in BF1942 and UT2k3, as I want higher res and my effects cranked up, and I also want to run Half-Life 2 in all its glory. I work my ass off at work and I feel like getting a $500 card; that is my choice, period.

I think Anand is too worried about creating benchmarks that compare to benchmarks done by other review sites, which is why they had "trouble" benchmarking certain games.

I agree, Morrowind would be a good game to benchmark with... I've used it recently to show the differences of AA and AF along with FS2004.

I think what needs to be done in some games like Morrowind is just play the game for 15 minutes... then tell us what the minimum frame rate was, the average, and the high. Who cares if it's not replicated EXACTLY each time... after 15 minutes, the average along with the lows and highs should paint a pretty accurate picture.

Also, in my opinion, FS2004 is THE BEST software for comparing AA and AF differences between video cards. All you have to do is disable weather and ATC and save a flight, then load that flight every time you want to take a screenshot. Also, pressing Shift+Z twice puts your frame rate on the screen, so there's no need to use FRAPS.
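The min/avg/max idea from the Morrowind suggestion above is easy to sketch. Assuming you have a per-frame time log in milliseconds (the kind FRAPS can dump), a rough summary might look like this (function name and sample numbers are hypothetical):

```python
# Hypothetical sketch: summarize a play session from a per-frame
# time log (milliseconds per frame).
def fps_summary(frametimes_ms):
    """Return (min, avg, max) frame rate over a session."""
    fps = [1000.0 / ft for ft in frametimes_ms]
    # Average over wall-clock time, not over per-frame fps values,
    # so long stutters are weighted correctly.
    avg = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    return min(fps), avg, max(fps)

# e.g. three frames at 20 ms each, then one 50 ms stutter:
lo, avg, hi = fps_summary([20, 20, 20, 50])
print(round(lo), round(avg), round(hi))  # 20 36 50
```

As the poster says, the session won't be exactly repeatable, but over 15 minutes the lows, average and highs should still paint an accurate picture.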

I suggest adding Tiger Woods 2004 to the suite. Turning up the eye candy is more demanding than one might think, so it would be a good test. But my main motivation is that there appear to be serious driver-related image quality issues with ATI (!) cards (e.g. water reflections).

What I would also like to see is test results from ATI and Nvidia in DCC packages such as 3D Studio Max and Maya. I would like to know if these high-end gaming cards can also handle some animation rendering. Maybe they can't, but it's one man's dream...

"Sony's PS2 and Xbox never have graphics card issues" (because they're purely game consoles, idiot). Yeah, I know that, but the game programmers also write their games for that particular console. My question is: why do Nvidia and ATI have to constantly adapt their drivers to PC games, instead of the games being made compatible with the graphics cards?

I don't intend to join a flame war, but I would like a few points cleared up. First of all, I have a GF4 4600 and am looking to upgrade, but I still have concerns about ATI's drivers. In this review I noticed a few discrepancies on both sides. Unless something has changed, ATI's 9x00 series has serious fps problems in both SimCity 4 and Neverwinter Nights with shadows (not to mention problems in Morrowind). Just pop over to Rage3D's forums if you want to find out more. Additionally, I believe the current Catalyst 3.7 has flickering menu issues in FS2004; not show-stoppers, but definitely irritating. Lastly, I wish someone would mention the R3x0 series' slow frame buffers, since they are of major concern for people who use PSX emulators.

On the nVidia side, I am fairly confident that the NV38 isn't giving AA in Homeworld 2 unless they are using 4xS in OpenGL (not offered in current drivers). Just check the forums at Relic and you will find that none of the GF3+ cards work with AA in OpenGL unless you use QuinCunx/8XS.

I realize that you can't expect everything in a review, but I wish just a few review sites would mention/research the known bugs for the games they test.

Please don't respond with replies about how you don't play these games, thus you don't care if they work well. The point of a new GFX card is an upgrade for all software, not just to get 200+ fps in UT2003.

Just a view from someone that loves the IQ from ATI's R3x0 series, but dreads the driver issues. Guess I am either waiting for the NV40 or the magic Catalyst 3.8.

For the guys worried about which is the best way to blow $500 on games that haven't been released yet, get a life.

There's no need for Anand to go over what everyone already knows about DX9 - NVidia blows bigtime with its current chips. Do the ATi fanboyz just want to grind the NVidia fanboyz' faces in the dirt about this again?

The question is not whether to buy an ATi or an NVidia card, but whether it's worth upgrading your current card to a 9800XT when there's a next-generation card only 6 months away. IMO only guys that reply to the "make your penis bigger" spams would think it's worth shelling out $500 at this point in time...

Just set it up for the maximum number of players, enable all details and start a single race without qualifying. That leaves you behind a full field of cars and gives a realistic impression of frame rate. Hit "F" to display the frame rate, or use another tool to record it.

#154 here, just wanted to add that people shouldn't flip out just because their favorite company won/lost a benchmark. Just play the damn game, who cares if you're looking at 3fps less, seriously.

#153: The XT is supposed to have a more powerful GPU (VPU? Damn companies using special names), so in theory the 9600XT could compete with the 9700 Pro if the VPU/RAM speeds were high enough. Of course, communism works too, in theory.

Oh, and I run all my games at 1024x768 at 32-bit depth with 4xAA and 8xAF (64-tap) using a PNY Ti4200, slightly overclocked (read: to the limit of the card, at 265/545); it runs everything but DX9 fine (a whopping 425 marks in 3DMark03 with AA/AF on; it looked pretty as hell chugging at 3 fps). I like seeing a benchmark that uses a resolution I'm actually using, instead of these pin-sized 1600x1200+ resolutions that only the $500 21" CRT freaks can use without going blind. Yes, it taxes a card, but I don't plan on taxing my overclocked card so hard it fries the GPU; particularly a $500 one, thanks.
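The "in theory" clock-scaling point in the #153 reply above can be sketched with back-of-the-envelope peak numbers. The formulas are the standard fill-rate and memory-bandwidth approximations; the specific clocks, pipeline counts and bus widths below are illustrative assumptions, not verified specs:

```python
# Rough sketch of why a higher-clocked 4-pipe part can close on an
# 8-pipe part: peak throughput scales roughly linearly with clock.
def fillrate_mpix(core_mhz, pipelines):
    """Peak pixel fill rate in Mpixels/s."""
    return core_mhz * pipelines

def bandwidth_gbs(mem_mhz_effective, bus_bits):
    """Peak memory bandwidth in GB/s."""
    return mem_mhz_effective * (bus_bits / 8) / 1000.0

# Illustrative figures: a 9600 XT-like part (500 MHz core, 4 pipes,
# 128-bit bus at 600 MHz effective) vs a 9700 Pro-like part
# (325 MHz core, 8 pipes, 256-bit bus at 620 MHz effective).
print(fillrate_mpix(500, 4), fillrate_mpix(325, 8))      # 2000 2600
print(bandwidth_gbs(600, 128), bandwidth_gbs(620, 256))  # 9.6 19.84
```

On these assumed numbers the fill rates are close, but the halved memory bus keeps the cheaper part from truly competing at high resolutions with AA, which is consistent with "in theory, if the clocks were high enough."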

#112, just because Doom 3 is OpenGL and not DX9, it doesn't change the fact that this review completely sidestepped the issue of future performance in games. #92 makes perfect points apart from the discrepancy over Doom 3 using DX9, which ultimately doesn't matter, since the shaders of its OpenGL path are similar to DX9 anyway.

YOU are the only person that looks stupid if you think this review hasn't glazed over or sidestepped important issues; most benchmarks were totally CPU-limited, and an unreleased nvidia driver was used which might not even see the light of day.

I'm glad that you point out to everyone that IQ will be covered in later articles; it's always great to see reviews posted claiming a certain level of performance without backing up the scores legitimately! That would never give people false impressions, now would it?

- Why not try more distributed forms of the review process? 30 hrs in a row is quite bad, and it's obviously going to have some impact on any sensible decisions made during the benchmarking and comment-making.

- Dunno if you're allowed to answer this, but is the Prescott a hot chip compared to a P4? If it's got HT2 and things, again it could be painting an inaccurate picture. I'm sure most people here have an Athlon 2xxx and that's what you should've benchmarked with. Also, you left out too many old cards - what use is a comparison when your card ain't on there?!

- Benchmark at 1280x1024 with 4x AA; it's what these cards are designed for, especially with regard to DX8 titles. With DX9, same thing but without AA. I'm sure most of us are running our CRTs/17-18" LCDs at that.

A lot of people don't have the time to spend hours reading hardware reviews; for those people, reviews such as this one can be very misleading when such important details are glazed over or completely missing.

Ummm, wtf are you using 1024x768 for?! Did you not notice the scores being so close, especially the 9600 being close to the 9800 XT? Omg. What crap. Use 1280x960 or 1600x1200 AT LEAST IN ADDITION to 1024x768. Almost all of the graphs look CPU-limited.

How about we at least TRY to stress the $500 cards next time, OK???

And use the non-cheating Detonators next time too, will you? As in 43.30 or whatever they are. I want to see how badly the NV38 burns when you take out the cheats.

Why not state whether these games are OpenGL, DirectX 8 or DirectX 9 games? Looks like Nvidia is ahead in OpenGL games (Homeworld 2 and Neverwinter Nights, for example) and ATI in DirectX games...

The idea to do all tests at 1024x768 was just PLAIN STUPID. The difference between the cards is way higher at higher resolutions... The results are much different at the other sites. Reading this review, one could conclude that the 9800 XT is only 20-30% faster than the 9600 Pro. This is crap, Anandtech. Either shape up or you've lost another reader.

Quake 3 is used because it's CPU-limited, not limited by video cards... so if a new CPU can run the game faster, that's an indication of its performance. New games like HL2 and Doom 3 tend to be more video card dependent than CPU dependent.

I think it is becoming clear that ATI cards are superior in DX9 games, and are the way to go. Sad for us Geforce owners. Hopefully though, Nvidia will address this and come to market with their next gen cards (not this new batch...the next) having a lot better DX9 performance--for the entire DX9 spec.

If I were buying a card now, it'd be an ATI....but I don't think Nvidia is going to sit still, their new cards will be competitive.

I think that the new reviews should include Half-Life 2... when available. Also, when UT2k4 comes out toward the end of the year (or is available to Anand), UT2k3 should be replaced as a benchmarking tool. It seems likely that the graphics engine will be tweaked and better looking, and UT2k4 will include very large levels as well.

This is what I want to see used for CPU articles. Your old crap tests suck (well, Unreal 2003 is still used). This is MUCH more useful to someone trying to find out how the latest games will run on their new CPU. Why use Quake 3 in CPU articles when you can use a bunch of games like these? Do people care more about Quake 3 or the batch of games you're using here for testing vid cards? The very same games apply to picking a new CPU. NOT Q3. That game is DEAD.

You guys should really indicate what API each game uses -- DX8, 8.1, 9 or OpenGL. That would help out a lot in determining whether a company optimizes for an API, a single game, or everything... not everyone follows the game industry enough to know which games are programmed against which graphics API...

"This is the first installment of a multipart series that will help you decide what video card is best for you, and hopefully it will do a better job than we have ever in the past.

The extensive benchmarking we’ve undertaken has forced us to split this into multiple parts, so expect to see more coverage on higher resolutions, image quality, anti-aliasing, CPU scaling and budget card comparisons in the coming weeks. We’re working feverishly to bring it all to you as soon as possible and I’m sure there’s some sort of proverb about patience that I should be reciting from memory to end this sentence but I’ll leave it at that."

Worth repeating, since at least 3/4 of the whiners seem not to have noticed it. That leaves about 1/4 for the driver 'issues', which aren't mentioned but still might be (hopefully are) intended for coverage, although I'd assume that to take at least as much time as the entire rest of the roundup.

Yeah, hopefully parts II-III will include something to give more of an indication of DX9. With a bit of luck it'll be the HL2 bench - the delay of which may be the reason there's so little in the way of DX9?

nVidia can compete only in DX7 and DX8 or OpenGL 1.2 games due to the wrong strategy of their CEO, Mr. Hu Ho Ha. The NV35 architecture has failed. Do you really think that nVidia can force Microsoft to include nVidia's custom shader language [code] in DX9?

While I can appreciate the work it took to generate all these benchmarks... what a complete and utter waste of time! Less than 10% bumps in clock speed? Zzzzzzz. I'd have sent it back to ATI and told them to call when they had something interesting.

It would be much more helpful if you included an older video card for reference, like a GeForce 4200 or 4600. I am sure there are several users like myself who bought one of these cards in the past year or so and would like to see how it compares to what is new, to see how beneficial an upgrade would be.

How can anyone take your post seriously when you state 'facts' that you just got from the top of your head. Read your post, read the facts and think about how stupid you look.

btw, great review Anand and to other whiners who keep harping on about IQ and Resolutions being low, it's part one as mentioned at start of review. Read the whole review and you won't look so silly LMAO

BAD, BAD, BAD review. Why did you choose to use only a 2.8 GHz CPU? It's THE limiting factor preventing the XT from showing its full capability. I'd bluntly say this review(er) is favoring the FX by not using a higher-clocked CPU and by using a beta driver for the FX.

Anand/Derek mentioned Nvidia being better at Doom 3. Y'all sneaky son-of-a-guns must be beta testing it in the background or sumthin. I know Carmack said it ran better but I betcha y'all got your hands on a copy. Go ahead. Admit it! Quit holding out on us. We wanna see the benchmark! I got a shiny nickel with your name on it if you put it out there...

Overall great review. I sorta agree that 1024x768 is kinda like the 640x480 of yesteryear now, but most of us can gather what 1280 will run at. For the fanboys/girls, "You should've included counterstrike and hexen 2. waah!" Honestly, I know how long it can take to set up and benchmark those tests in a _controlled_ environment. Do you guys use automated software testers?

Question though: even though FFXI ran slowly, is it still playable? I don't want to believe that it runs that slow all the time.

OK, I'm going to ignore everyone who's bitching because they didn't read it and thus haven't twigged that IQ comparisons will be in part 2.

Re the PCI slot thing, doesn't that apply equally to two-slot cards? If putting a PCI card next to the AGP slot of a one-slot card is bad, surely putting a PCI card in the first slot after a two-slot card isn't exactly smart either? You still lose an extra PCI slot over what you would have with a one-slot card.

Please use Battlefield 1942 in benchmarking in the future! It's an awesome game and has some very nice and demanding mods like Desert Combat. Please use Desert Combat in benchmarking too. Try flying around, blowing stuff up, and checking whether the frame rate ever drops to unacceptable levels. Gamers rarely care about average or maximum fps; whether a game is running at 50 fps or 150 fps doesn't matter, but if it ever runs as sluggishly as <10 fps in the heat of battle, it is very annoying.

Just tell us in your own words which graphics cards deliver playable frame rates!
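The commenter's point about worst-case frame rate mattering more than the average can be made concrete with a small sketch. This is illustrative only, with made-up frame-time data; `fps_stats` is a hypothetical helper, not part of any benchmarking tool mentioned here:

```python
# Sketch: average fps can look fine while a single slow frame ruins the
# experience. Frame times below are invented for illustration.

def fps_stats(frame_times_ms):
    """Return (average_fps, minimum_fps) for a list of frame times in ms."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)          # the single slowest frame
    return 1000.0 / avg_ms, 1000.0 / worst_ms

# 99 smooth frames at ~60 fps plus one 200 ms hitch in "the heat of battle":
times = [16.7] * 99 + [200.0]
avg_fps, min_fps = fps_stats(times)
print(round(avg_fps, 1), round(min_fps, 1))  # prints 54.0 5.0
```

The average still reads as a comfortable ~54 fps, but the one hitch means a worst case of 5 fps, which is exactly the kind of sluggishness the comment complains about.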

Overall a sad review: using drivers that are not out for everyone to use, no IQ tests to see if the drivers are cheating at all, and then comments like "From these two graphs, it seems like NVIDIA is the clear winner, but in watching this demo run so many times, we noticed that the NVIDIA cards were running choppier than the ATI cards, and we again had some image quality questions we need to answer".

So that pretty much does it for me. I'll take this with a grain of salt until they rip apart the drivers and make sure NVIDIA isn't up to any "optimizations". I've lost all trust in NVIDIA. I hope the NV40 can turn this around.

#69, just because you run games at 1280x1024 doesn't make you the majority representation of gamers. Most gamers run at 1024x768. Most computer users' resolution is 1024x768, like 55% or something like that.

#81 and anyone who thinks it isn't important to benchmark DX9 performance:

When you have two of the most highly anticipated games due for release over the next few months (Half-Life 2 and Doom 3, for those who are asleep at the wheel), both of which include DX9 features, why in God's name would you buy a $500 card that _doesn't_ support DX9 features effectively? You would obviously have to be someone who isn't interested in having the best quality visuals you can get, and that is the exact opposite of the reason anyone would spend that amount of money on a video card in the first place. People want to see the best quality with the best performance! I simply cannot understand why someone would buy one of these expensive cards expecting to buy another equally or more expensive card as soon as such new DX9 titles appear... the simple truth is that you would have to be a fool with money to burn if you are prepared to pay $500 for a card that cannot perform well in new games that arrive after a couple of months.

It's not a matter of whether DX9 features should be benchmarked, it's a matter of how. I will avoid the whole benchmarking fiasco going on with regard to cheats, but why do you think people put such weight in programs designed to predict the performance of hardware on future games? People want cards that will perform well with _future_ games! It is just a pity there aren't more tools available that can provide a RELIABLE prediction of how hardware will perform in those games.

If you're serious enough to inquire about BF1942... why not test it using TFC? I personally enjoy the soles of my Air Nikes, but come on, Reebok? I think I'd rather buy a pair of New Balance at my local shoe warehouse. ATi has proven itself capable of running today's graphically intense games just as smoothly as STEAM is running HL. Thx for the tests, ATech.

No discussion of image quality for each game? Other sites now do this as a matter of course, as it is clear that NVIDIA is taking shortcuts on quality to maximize speed. Hey, I own a 4600 Gold Sample so I am not an ATI zealot, but I know where my next card purchase this year is going; that is clear from all the reviews I have seen.

#73 makes a good point... but at the same time I've made a few observations on that note. I've seen a lot more motherboards with a gap between the AGP slot and the PCI slots, and while some people would be led to believe it is just for NVIDIA cards, this is most likely not the case. Graphics cards in general put out a lot of heat, and it's never a good idea to put a big card right next to your graphics card anyway; you're just begging for heat problems. For the most part it's just the NVIDIA reference design that takes up two slots. The board partners usually use their own cooling anyway, and plenty of cards are available that only use one slot.

What it all boils down to is that it's not the size it's how you use it. :)

Now as far as ON topic ;) I thought the benchmarks did what they should: they showed performance in today's popular games and some signs of what is to come. For those of you crying because there are no significant DX9 entries... guess what: DX9 games aren't available in any kind of quantity and won't be any time soon. Granted there will be some, but the bulk of games released in the next 6 months will be built on DX8 with some DX9 features. By the time the publishers start churning out DX9 titles, guess what: the new chips will be ready for release, and they will run full DX9 titles better.

Coincidence? Not at all. Does Nvidia or ATI want you to buy their 500 dollar card now and use it for the next two years...hell no. They want you to buy bleeding edge technology for now, then buy another new one in a year or less...and so on and so forth. There's a reason they release a whole line of cards at once (performance, mainstream, budget), that's so they can tackle the whole market with each release. If they make a card too good now you won't need to buy their next one...welcome to the world of trying to make money :).

For those of us with 17"+ LCD monitors, 1280x1024 results would be more useful, since this is the most common native resolution for these monitors. The games look great; just keep some games other than action and FPS titles in the suite. I mostly play strategy games and RPGs, so it's good to see Warcraft and NWN on the list. Keep up the great work!

NV38 is still not a 'finished' design; by finished I mean there is still no publicly available set of drivers supporting the card. The card itself is not even publicly available, much less on the OEM market, so it is rather difficult to fully benchmark this product. Likewise, to a certain extent the 9800XT is not a finished design even though it's on the market, as the Overdrive (ATI-supported overclocking) feature is unavailable until the Catalyst 3.8 drivers become publicly available in the next week or so.

The point of this rant is that the information presented here in Anandtech's review is PRELIMINARY. Regardless of Anandtech having engineering samples or final products, beta drivers or publicly available drivers, they can only work with what they have available at the present time, and when reading this review you HAVE to take that sort of non-explicitly-stated information into account to gauge credibility.

Personally, I believe that given what is available at the present time, Anandtech has done a very good job of providing a sample gauge of what to expect from the newest 'refresh' video cards, which are still incomplete in regards to being all that they can be (special application optimizations notwithstanding, of course). While I would like to see them gauge these cards against older cards, as someone mentioned earlier in this thread, to see if upgrading is worth it, I don't see the point of doing so until these products are fully completed (i.e., they're readily available in stores and they have publicly available drivers).

So perhaps after the NV38 truly comes to market would be a better time to insist on seeing an all-out battle of the GPUs. Just my two cents on the matter; have a good day.

I just came here from [H]ardOCP to read this article and I noticed something so glaringly obvious I'm surprised no one has mentioned it. How many of you play games at 1024x768? I know I don't; I play them at 1280x1024 or higher. And where has NVIDIA's biggest drawback been lately? That's right: increase the screen size and NVIDIA jumps off a cliff, whereas ATI walks down a step. Being a gamer who doesn't use 1024x768 means this review is of no use to me. The drivers used are questionable, the image quality is inferior, the setup is poor, and the results DO NOT compare to other sites (I've checked 4 sites so far, not including the NV38 part). Also, after looking over this site I have seen not one advertisement for ATI, yet I have seen a few concerning NVIDIA. Anandtech, from what I remember, used to be impartial; that's something I don't think they are anymore.

I'm most interested in hearing about MMORPG performance. I know you included Final Fantasy XI in this suite, but I was hoping you'd select an established, popular game: MMORPG DX9 titles like Star Wars Galaxies or Asheron's Call 2, and MMORPG DX8 titles like Dark Age of Camelot or Anarchy Online. These games represent more closely where MMORPGs are headed in graphics engine development, ahead of upcoming titles like Middle-Earth Online (Turbine), D&D Online (Turbine), Everquest 2 (Sony Entertainment), and Mythica (Microsoft).

I find it odd that with a Radeon 9800 Pro in my rig, running an Athlon 2600+ with 1 GB of RAM, I usually get 15 fps in SimCity 4 with the updated patches from EA. I've been searching around the internet about why SimCity 4 just plain sucks with Radeon cards, and everyone on the forums says it's EA's fault for the way they programmed it. Anyway, why is it that Anand's benchmark of his Radeons is all the way up at 52 fps when most of his system setup is close to my specs?... Anand? What drivers and patches are you using?

I know that this is all about the newer technology, but it would have been nice if you had thrown a couple of the older cards in for comparison's sake (and for those without the cash to purchase new cards every 6 months), like the GeForce4 Ti 4600 or 4200 and the Radeon 8500.

I think I remember that Tron 2.0 asked me to install DX9, so it probably uses some DX9 functions, and it's an existing game, so why not try to build a benchmark on it? Anyway, since we're dealing with unreleased Det50 drivers here... (I'd rather prefer the THG way of dealing with that.)

BTW, I think there's a massive misunderstanding about whether a game is DX8/8.1/9; it can be all of those at the same time. You can use DX8 pixel shaders and DX9 pixel shaders in the same game.

It's just that as soon as you start using DX9 functions you lose compatibility with DX8/8.1 cards. It's up to the developer to replace those convenient DX9 specifics with DX8/8.1-compatible pixel shaders, for instance. So DX9 is really an extension to DX8.1, DX8.1 is an extension to DX8, and so on.
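The fallback idea described above can be sketched in a few lines. This is purely illustrative: the effect-file names and the exact version tuples are hypothetical, not from any real engine, but the pattern (ship several shader versions, pick the best one the card reports it supports) is the one the comment describes:

```python
# Illustrative sketch: a game ships multiple versions of an effect and
# selects the most advanced one the card's pixel-shader version supports.
# File names and versions are made up for this example.

EFFECT_VARIANTS = {
    (2, 0): "water_ps20.fx",   # DX9-class shader
    (1, 4): "water_ps14.fx",   # DX8.1 fallback
    (1, 1): "water_ps11.fx",   # DX8 fallback
}

def pick_effect(card_ps_version):
    """Choose the most advanced variant the card can run."""
    for required in sorted(EFFECT_VARIANTS, reverse=True):
        if card_ps_version >= required:
            return EFFECT_VARIANTS[required]
    return "water_fixed_function.fx"  # pre-shader hardware

print(pick_effect((2, 0)))  # a DX9 card gets the PS 2.0 path
print(pick_effect((1, 1)))  # a DX8 card falls back to PS 1.1
```

A DX8 card still gets *an* implementation of the effect, just a simpler one, which is exactly why "is this game DX8 or DX9?" is the wrong question.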

It would be a good idea to include at least FIFA 2003 (and if possible, NFS HP2 or PU), mainly for the same reason why C&C was benched. These are really popular games and people would like to know how they "feel" running with these new cards and drivers. Also, FIFA 2004 is reportedly coming up with even more impressive graphic quality and AI (the latter could be a reason to CPU bench it, perhaps?).

I'm in the same boat as you here, #57: very excited about the 9600XT, although I think it's more a case of the cards not being available than a lack of desire for a review on AT's part. Can't wait to see how they go =)

I was hoping to see benchmarks for the 9600XT. $500 for a new card is rather high for someone on a budget. I've been interested in the 9600 Pro cards for a while, and I'm disappointed none of the 9600s were shown. Not everyone can afford the high-end cards, and I for one would like to see more coverage of the cards that many more people are likely to have or buy. It's great to see the flagship cards and what they can do, but don't forget some of us just can't go that route. And we'd like to see benchmarks for the cards that we have or want to purchase.

Prescott will come out at 3.2 and 3.4 GHz later this year; lower versions (3.0/2.8...) will follow afterwards. So it's for sure no Prescott here. And if it were, I wonder why there is no test or word at all about it.

#47 - So what you are saying is that people who invest in a $450-500 video card shouldn't worry about how their card will perform in future games? Are they supposed to buy a new card for each new game that's released implementing new features? That does not seem very wise to me, and I would expect that most people paying such a high price for a card would EXPECT it to have some decent lifespan during which it can perform well in the latest games (6 months? a year?). Of course there is going to be a point where you simply need new hardware to run new games, but suddenly finding that your card doesn't make the cut when a new crop of games comes out wouldn't be my idea of celebrating a $450-500 purchase.

Funny how after reading this review and seeing all the very marginal gains my extra $150 could buy me, the only thing that got me interested was the fact that they used a Prescott to benchmark. As long as they keep "tweaking" the current crop of cores, these new cards exist just to keep the performance crown, and by what, 0.3 fps to 5 fps?

"Games such as Command & Conquer Generals: Ground Zero and Simcity 4: Rush Hour are examples where ATI clearly has the lead over NVIDIA and the argument could be made that ATI holds the lead because they optimize for all games, while NVIDIA just optimizes for benchmark titles. However, looking at games like Homeworld 2 and Neverwinter Nights you could make the exact opposite argument."

Except that Command & Conquer is an EA title, the company which officially works with nVidia...

Hey I'm disappointed. This isn't a real flame war, it's more like handbags at 30 paces.

What the review says is that even with a top processor most current games are CPU rather than GPU limited if you have one of the better cards and that for these games there's not much to choose on framerate between ATi and NVidia. IQ is a different matter though. It certainly suggests that while NVidia does have some advantages they are generally outgunned by ATi unless they "cheat" by lowering IQ.

No point in worrying about future games - when they come out is the time to make a decision on that. However at the moment it looks like the card you will be buying will be made by ATi, unless you are some kind of masochist.
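The "CPU rather than GPU limited" claim above is usually checked by sweeping resolution: if the frame rate barely moves as the pixel count grows, the CPU is the bottleneck. A rough sketch with invented numbers (the `looks_cpu_limited` helper and the 10% tolerance are assumptions for illustration, not a standard metric):

```python
# Sketch: distinguishing CPU-bound from GPU-bound results by how fps
# responds to resolution. All numbers below are made up.

def looks_cpu_limited(fps_by_resolution, tolerance=0.10):
    """True if fps varies by less than `tolerance` across resolutions."""
    rates = list(fps_by_resolution.values())
    return (max(rates) - min(rates)) / max(rates) < tolerance

cpu_bound = {"640x480": 120.0, "1024x768": 118.0, "1280x1024": 115.0}
gpu_bound = {"640x480": 120.0, "1024x768": 80.0, "1280x1024": 55.0}

print(looks_cpu_limited(cpu_bound))  # True: flat fps, the GPU is waiting
print(looks_cpu_limited(gpu_bound))  # False: fps falls with resolution
```

On the flat curve the graphics card is idling, which is why a faster card changes nothing there; the second curve is where the cards in this roundup would actually separate.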

- No contrast between AVAILABLE NVIDIA Detonators and the UNRELEASED drivers used to benchmark.
- It was unclear which games were using DX8 or DX9 features for these benchmarks without prior knowledge.
- More focus was placed on performance with older existing games, i.e. DX8. Aren't top-of-the-line $500 graphics cards intended for use with future games, i.e. DX9?
- Image quality issues were not adequately discussed (major issues glazed over, leaving false impressions of performance on 'some' cards). They may be covered in an upcoming article; on its own, however, this particular article may be misleading.

OK, you didn't mistype it, you just made it totally unclear and impossible for anyone to really understand. No big deal.

However, to imply that Anandtech should have spent time doing IQ testing instead of NV38 testing is nothing short of ridiculous. No one (except you, apparently) wants to see IQ testing instead of NV38 testing.

I'd like to see Q3 in your benchmarking suite again. It's still the only Q3-engine game where a brand new graphics card can run smoothly with FSAA and AF on the highest quality settings. It's also important that the timedemo is full of action and filled with a lot of players.

Okay, "on other sites" is not a subject, but "they" cannot refer to "reviews" (the context forbids that). So hopefully you understand now that "they" can only refer to "the other sites", since I did not mention Anandtech there. :)

#32 Okay a second time and I will take you by the hand (please respond with a joke about me being gay):

This is what I wrote (the whole paragraph):

"Perhaps YOU are clueless. I don't need to wait for complete reviews on other sites. And yes, they might have had more time as they did not benchmark NV38. However that they did not get NV38 makes this review even more suspicious."

I wrote about reviews on other sites there. You see? From that moment on, every "they" automatically refers to "the other sites" until I come up with a new subject. Okay? That is grammar; it's hard to understand, but you will get there.

You missed the point #23, and it makes you look stupid. Since you admit to mistyping your previous response, as I said before, Anandtech JUST received the NV38 within the last 24 hours. To claim that this is somehow suspicious makes you look ignorant based on that fact alone, and the fact that there's nothing "else" to make this review look suspicious at all. I think Anand did a great job, and certainly better than every other web site except perhaps B3D.

And if you can't wait for a review you're just a freaking whiner who probably isn't even going to be purchasing a high-end video card and so really has nothing better to do with his time than bitch and moan. Guys like me, who are seriously considering a 9800XT (well actually, I'm thinking a 9800 non-Pro now), are who Anand is mostly writing for. You, well, you're just a whiner apparently.

Also, #25, I disagree with you on the unreleased driver bit. The 52 beta drivers are supposed to resemble the final Det 50s very closely, according to Anandtech, so it makes zero sense to imply that using 52-series beta drivers is somehow not right. Also, exactly what DX9 titles is Anandtech going to test with? They don't have HL2, just like every other site, and what other DX9 games are out currently, even for web sites? None. Tomb Raider is the ONLY real game that AT should have included, which is an odd omission I admit, but nothing more. DX7/DX8 is the most prevalent standard used in games today; it would be idiotic not to include mostly these types of games in a review.

However, that does not mean it's impossible to compare ATI and NVIDIA in Doom 3 directly. You can do that by forcing the NVIDIA card to run the same generic code path that will be used by all non-NVIDIA cards. Carmack has done that himself, and indicated that the FX cards then achieve about half the performance of the Radeons.

That's completely in line with the other new games and benchmarks.

I like the idea of taking more games for benchmarking. However, it has become very clear by now that GeForceFX cards do great in DX8 but are lousy in DX9 games. Therefore, you need to be very careful about the games you choose to benchmark. It seems that the games tested are mostly DX8-like. That simply gives a wrong picture to people who want to buy a new card now, at a moment when all new games have many DX9 features. They will be disappointed when they start to play these new games.

Furthermore, we simply know that NVIDIA has been cheating in almost all (benchmark) games with their drivers. The last review at THG for Aquamark was very clear: only drivers that were released before the benchmark came out rendered correctly. ALL drivers after that reduce image quality, or simply don't render stuff at all. And Aquamark isn't the only one. This isn't an incident; this happens in almost all games. And image corruption in beta drivers certainly isn't limited to benchmark games. Lots of other games suffer too.

At this point in time, you simply CANNOT ignore these facts. It is totally unacceptable to use unreleased NVIDIA drivers without an extremely thorough investigation to check where the new cheats are.

Do any of you really think that a driver optimization can produce double performance? Dream on!

Just look at Aquamark. The only driver that produces correct images has half the performance of the Radeons (which is exactly what you'd expect from experience with all other DX9-like games). The 45.23 has a convenient 'bug', and just happens to get double the performance. The 51.75 has another, less obvious 'bug', and also just happens to get the same double performance. Do I have ANY reason to believe that somehow the unreleased beta 52.14 drivers don't have 'bugs' that happen to double the performance? Come on... nobody can be that naive...

If there's anything we've learned in the last half year, it's that: 1) NVIDIA is at this moment completely unreliable, and 2) driver optimizations that give more than a few percent performance increase don't optimize, but simply cheat.

If you want to show benchmark results on FX cards, you use drivers that you can be really confident don't cheat. It is totally unacceptable to use unreleased beta drivers. It's NOT enough to say you're going to look into image quality in the future!!!

#22 For you again: "I don't need to wait for complete reviews on other sites. And yes, they (THE OTHER SITES) might have had more time as they (THE OTHER SITES) did not benchmark NV38. However that they (THE OTHER SITES) did not get NV38 makes this review (ANANDTECHS) even more suspicious."

And yes, Shadermark 2.0 is no game. But it might have shown that the NV38's Pixel Shader 2.0 is inferior to the Radeon XT's. And this WILL be important for performance in many DirectX 9.0 games like Half-Life 2.

And I am not Natoma. I do not even post regularly on hardware sites. And I am no idiot or fanboy either.

#19, did you even read this review at all? You look more idiotic by the second; Anand DID INCLUDE NV38.

And if you are honestly that whiny to suggest that you can't wait for Part 2 for a free service such as Anandtech...then just get a life.

And no, SM2.0 cannot single-handedly predict future game titles; that's just ignorance on your part. If you knew anything about programming you'd know that there are so many different variables affecting a game that it would take multiple code-testing programs (like SM2.0) to get even a relatively accurate picture of future game performance. Unfortunately, no web site in the world is going to spend their whole day doing that crap; they wouldn't be able to get other games benchmarked.

(Btw, if you mistyped your comment about NV38, since your next comment seemed to imply that AT is somehow biased because they got NV38 and no one else did, you are simply a paranoid dope with nothing better to do than bash a big web site. Christ, you don't even know how dumb you sound; just today I was told by an editor through PMs that AT's 9800XT review was delayed because they received NV38 at the last minute. Yeah, that clearly shows NVIDIA had this AT review on a biased leash all along.)

Haha, I just noticed over at Beyond3D that you're Natoma, yes? Haha, no wonder, you're one of the least knowledgeable guys there. Lol

Doom 3 is an OPENGL game, not DirectX. And Carmack himself said they had to write specific code paths for Nvidia (to use lower precision), so you can't really compare ATI and Nvidia in Doom 3 directly.

Perhaps YOU are clueless. I don't need to wait for complete reviews on other sites. And yes, they might have had more time, as they did not benchmark NV38. However, that they did not get NV38 makes this review even more suspicious.

AND: you should also want that Shadermark 2.0 "crap" if you are interested in playing some of the games already on the horizon and most games that will be released over the next year. Some of these games may be fillrate intensive like Aquamark3, but they are not that Pixel Shader 2.0 intensive.

#11, you're an idiot. What loser who wants to buy a 9800XT or NV38 wants to see Shadermark 2.0 crap? Jesus, I certainly don't, and I'm one of many people who want to buy a high-end video card. Tomb Raider, sure, but he included 15 games total, you idiot. And if you had actually READ the review, you would have noticed Anand say he will do IQ testing in Part 2 of this review.

Jesus, are there really this many clueless Anandtech readers? lol

I really wonder what happened to Anandtech. I once liked and trusted their reviews so much that I did not read any other ones.

Now I see the first review of the NV38 and do not see it benchmarked in any way that would interest me. No Tomb Raider: AOD, no Shadermark, no AA/AF, no image quality comparisons and no Half-Life 2 (okay, this might not be Anandtech's fault).

This means no DX9.0 title that is demanding when it comes to Pixel Shader 2.0 power (no, Aquamark isn't). So please do not bench a ton of CPU/memory-limited games, especially without AA/AF.

"The performance crown under Doom3 is still in NVIDIA’s camp apparently". Doom3 is mainly DirectX 8-level technology. Period.

"ATI is still ahead in Half Life 2. The numbers we’ve seen indicate that in most tests ATI only holds single digit percentage leads (< 5%), although in some cases ATI manages to pull ahead by double digits." What does that mean? Is this only with the NV30-optimised (degraded IQ) code path? If so, too bad for them.

Finally, what I'd like to know is whether NVIDIA required Anandtech to benchmark this way...

It was great to see so many games represented, not the least of which is one of my favorites: Neverwinter Nights.

One game that I would be thrilled to see is Star Trek Armada II. The game is a blast to play, and in situations with many ships (ESPECIALLY multiplayer) it can slow to a crawl even on high-end systems. I would hazard a guess that this game is more CPU bound, but a graphics analysis wouldn't hurt anything.

I thought it was a little ridiculous that almost every benchmark had the stipulation that "AA didn't seem to be applied. We'll investigate later." or "Image Quality wasn't up to snuff. We'll investigate later." and yet you still included the results for the Nvidia cards.

After the article from Lars Weinand at THG, where he states that if a driver reduces image quality to gain framerate they gray out its results, I expect the same thing from Anandtech. Especially since the drivers you used are unreleased for public consumption and may never even reach the public.

At this point image quality is indeed king. Who wants to spend $500 on a video card that will not provide top notch image quality? I know I don't.