Just ordered mine Friday, should be here Wednesday.
I was lucky - I ordered it from Newegg at $239.99, and it seems totally worth it after reading these comments.
Oh and I checked again today and it seems Newegg is all sold out =]]

It'd be awfully nice to have the axes of the graphs labeled. For the first set, I can guess that screen resolution is on the horizontal axis and frames per second on the vertical, but I could be wrong, since there are no labels.

I also couldn't follow the page comparing the 8800 GT to the 8800 GTX. Your conclusion seems to state that "the 8800 GT doesn't touch the GTX." However, I can't draw that conclusion from the graphs. They look roughly comparable in most of the tests you've shown, with only a slight advantage to the GTX at very high resolutions.

On the "Out with the old..." page, there is a typo in the second paragraph. In the last sentence, "fasted" should be "fastest".

From what I have read here and elsewhere, this seems to be THE card to get. Before I make a purchase, though, I would very much like to see more data comparing the cards offered by the different manufacturers.
Is Anandtech going to be doing an 8800 GT roundup any time soon? Do I have to beg?

How much of an improvement would you get if you SLI'd an 8800 GT with a GTX? I know mixing and matching is not optimal, but the price difference makes me wonder. Would it fall between two GTs and two GTXs? I don't have any experience with SLI; I've avoided it because it's never been a decent upgrade path.

I bought 2x 7800GTX's (just before the 7900GTX's came out) at £660 for the pair and this card just blows them away.

On my 24" dell monitor at 1920x1200 with 4xAA on an opteron 175 @ stock 2.2ghz i get average 125fps in Team Fortress 2. Everything else I've tried has been very smooth - world in conflict, bf2142. A *very* noticable performance increase at a billiant price!

If you're considering an upgrade, buy one of these NOW and play today's games at awesome speeds. Then get a nice new Intel 45nm quad-core + X38 + DDR3 in January, when the products are released and prices will be lower. Then, if needed, sell the graphics card and buy whatever NVIDIA is offering in January - if you really need to (which I doubt you will).

Most online retailers have pulled these items off their websites entirely; I'm sure these cards have been picked up ravenously by gamers wanting what seems to be The Holy Grail of video cards.

My question is, "Why so cheap, and why now?" NVIDIA could have priced it at least $50 higher for its injection into the market (which most retail shops have already done to capitalize on its popularity) and still had a product that sells like crazy. This makes me wonder what is next, and whether a better product is in the works that makes them want to clear this inventory as quickly as possible before the next big thing comes out. It may be (and yes, I'm reaching) that this card is on the low side of NVIDIA's new product line, and they can clear inventory at a price premium now as opposed to when the full line is released. They have little reason to throw out their best until AMD has shown their hand, and they are playing the same game that Intel is with their 3.0GHz processor that can easily be clocked higher.

With this in mind, I plan on holding on to my money for now, partially because I can't even find one in stock yet, and partially because having this card at this price point doesn't seem to make much sense unless a full line refresh is coming and this card is the weakest link - which is an incredible thing to think about, considering how good this card appears to be.

"A G92-derivative will appear later this year with even more shader units. According to company guidance, the new G92 will launch in early December and feature 128 shader units as opposed to the 112 featured on GeForce 8800 GT. ... In addition to the extra shaders, the new G92 will also feature higher core frequencies and support for up to 1GB GDDR3." Reply

Awesome, I've been waiting for something like this to come around. Right now the cheapest I've found is $260 with $6 shipping. I'll wait for it to drop to around the $199 mark & then I'll be all over it.

How long before we start seeing something like this in a laptop? I think there was a brief mention that it might be possible to make one with passive cooling, so that makes me hopeful. The 8600 series in laptops doesn't really impress me.

The text of the article goes on as if the GT doesn't really compare to the GTX, except on price/performance:

quote:We would be out of our minds to expect the 8800 GT to even remotely compete with the GTX, but the real question is - how much more performance do you get from the extra money you spent on the GTX over the GT?

quote:But back to the real story, in spite of the fact that the 8800 GT doesn't touch the GTX, two of them will certainly beat it for either equal or less money.

Yet all the graphs show the GT performing pretty much on par with the GTX, with at most a 5-10fps difference at the highest resolution.
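
To put that gap in rough perspective, here's some quick arithmetic with made-up but plausible numbers (these are not readings from the article's graphs):

# Hypothetical frame rates in the 5-10fps gap range the graphs show
gtx_fps, gt_fps = 65, 57
print((gtx_fps - gt_fps) / gt_fps * 100)  # ~14% lead for the GTX

A ~14% lead is real, but it's nothing like the gulf the article's wording implies.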

This is obviously an amazing card, and I hope it sets a new trend of getting good gaming performance in the latest titles for around $200 like it used to be, unlike the recent trend of having to spend $350+ for high end (not even ultra high end). However, I don't get why a GT part is higher performing than a GTS; isn't that going against their normal naming scheme a bit? I thought it was typically Ultra -> GTX -> GTS -> GT -> GS, or something like that.

I've been hearing rumors about an NVIDIA 9800 card being released in the coming months... is that the same card with an outdated/incorrect naming convention, or a new architecture beyond G92?

I guess if NVIDIA has a next-gen architecture coming, it would explain why they don't mind wiping some of their old products off the board with the 8800 GT, which seems as though it will be a dominant part for the remaining lifetime of this generation.

After lurking on Anandtech for two layout/design revisions, I have finally decided to post a comment. :D
First of all, hi all!

Second, is it okay that NVIDIA decided not to introduce a proper next-gen part in favour of this mid-range offering? Okay, so it's good and whatnot, but what I'm wondering - and something the article does not talk about - is what the future value of this card is. Can I expect it to play upcoming games (Alan Wake?) at 1600 x 1200? I know it's hard to predict, but industry analysts like you guys should have some idea. Also, how long can I expect this card to keep playing games at acceptable framerates? Any ideas, anyone?
Thanks.

From context, I'm thinking 512. Since 512MB cards are the only ones available in the channel, and Derek was hypothesizing about the pricing of a 256MB version, I think you can be confident this was a 512MB test card.

Simply put? TechReport is doing some funny stuff (like HardOCP often does) with their benchmarking on this one. I have a great idea: let's find the WORST CASE SCENARIO for the 8800 GT vs. the 8800 GTS 640 and then ONLY show those resolutions! 2560x1600 4xAA/16xAF? Even setting aside that 16xAF isn't noticeably different from 8xAF, and that 4xAA is hardly necessary at 2560x1600, there are just too many questions left by the TR review. They generally come to the same conclusion that this is a great card, but it's almost like they're struggling to find ANY situation where the 8800 GT might not be as good as the 8800 GTS 640.

For a different, more comprehensive look at the 8800 GT, why not try the FiringSquad review (http://www.firingsquad.com/hardware/nvidia_geforce...)? They test at a variety of resolutions with a decent selection of GPUs and games. Out of all of their results, the only situation where the 8800 GTS 640 comes out ahead of the 8800 GT is in Crysis at 2xAA/8xAF at 1920x1200. Granted, they don't have 2560x1600 results, but how many midrange people use 30" LCDs? For that matter, how many high-end gamers use 30" LCDs? I'm sure they're nice, but for $1300+ I have a lot of other stuff I'd be interested in purchasing!
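
For anyone wondering why that one resolution changes the picture so much, here's some pure pixel arithmetic (nothing from either review, just math):

# Pixels per frame at each resolution
pixels_30in = 2560 * 1600  # 4,096,000
pixels_24in = 1920 * 1200  # 2,304,000
print(pixels_30in / pixels_24in)  # ~1.78

That's nearly 80% more pixels to shade every frame, which is exactly where a narrower memory bus hurts most.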

There are a lot of things we don't know about the testing methodology in all of these reviews. What exact detail settings are used, for example, and, more importantly, how realistic are those settings? Remember Doom 3's High Quality and Ultra Quality? Running everything with uncompressed textures to artificially make 512MB cards appear better than 256MB cards is stupid. Side-by-side screenshots showed virtually no difference. I don't know what the texture settings are in the Crysis demo, but I wouldn't be surprised if a bunch of people are maxing everything out and then crying about performance. Being a next-gen title, I bet Crysis has the ability to stress 1GB cards - whether or not it really results in an improved visual experience.

Maybe we can get some image quality comparisons when the game actually launches, though - because admittedly I could be totally wrong and the Crysis settings might be reasonable.

Well, clearly this must be a graphics issue. But I read that NVIDIA's 169.xx drivers were made to optimize performance while lowering graphics quality.
This was borne out when the water in Crysis (among other things) looked worse with 169.04 and 169.01 than with the previous 163.xx drivers.

It's hard to tell what you are getting when you compare the results from one article to those of another. Ideally, you would like to assume that the testing was done in an identical manner, but this isn't typically the case. As was already pointed out, look at the drivers being used: the earlier tests used NVIDIA's 163.75 drivers, while the tests in this article used NVIDIA's 169.10 drivers.

Also, not enough was said about how Unreal 3 was being tested to know, but I wonder if they benchmarked the game in different manners for the different articles. For example, were they using the same map "demo"? Were they using the game's built-in fly-bys, or were they using FRAPS? These kinds of differences could make direct comparisons between articles difficult.
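
Even the way "average fps" gets computed can differ between tools. A toy illustration with made-up frame times (not data from either article):

# Same capture, two common ways to report "average fps"
frame_times_ms = [10, 10, 10, 40]  # hypothetical per-frame times
fps_from_totals = 1000 * len(frame_times_ms) / sum(frame_times_ms)  # ~57.1
fps_mean_of_rates = sum(1000 / t for t in frame_times_ms) / len(frame_times_ms)  # 81.25
print(fps_from_totals, fps_mean_of_rates)

Two reviews could run the identical demo and still publish noticeably different numbers just from the reduction method.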

To blacken: I am a big AMD fan, but right now it's almost laughable how they're getting stepped on and kicked around by the competition.

AMD's ideas are great for the long run, and their 65nm process was just a mistake since 45nm is right around the corner. They simply do not know how to compete when the heat is on. AMD is still traveling in 1st gear.

Well, the 8600 GTS was a mistake that never should have seen the light of day: overpriced and under-featured from the start. The 8800 GT is the card we were expecting back in the spring, when NVIDIA launched that 8600 GTS turd instead.

I may be a bit misinformed on this, but I'm getting the impression that Crysis represents the first game to make major use of DX10 features, and as a consequence, it takes a major bite out of the performance that existing PC hardware can provide. When the 8800 GT is used in a heavy DX10 game, does the resulting performance fall into the class we would typically expect from a $200 part? In other words, to use the Ti-4200 comparison, is performance only acceptable at moderate resolutions and medium settings?

We've seen something like this before: when DX8 hardware was new and people were still playing DX7 games on it, performance was very good. Once true DX8 games started to show up, the hardware that first supported DX8 features (like the Ti-4200) struggled to actually run them.

Basically, I'm wondering whether Crysis (and other DX10 games that presumably will follow) places the 8800 GT's $200 price point into a larger context that makes sense.

I ran Vista for about a month before switching back to XP due to Quake Wars crashing a lot (no more crashes under XP). I ran a bunch of demos during that month, including Crysis and BioShock, and I swear I didn't see much visual difference between DX10 on Vista and DX9 on XP. Same for TimeShift (does it use DX10?). And all games run faster on XP. I really see no compelling reason to go back to Vista just for DX10.

Just wondering, though, whether you were able to test the cards at the same clock speeds. The GT by default has a ~100MHz advantage on the core over the GTS, which is a common reason the GTS falls so far behind in head-to-head testing. I expect the GT to have more overclocking headroom than the GTS anyway, but it would be nice to see an apples-to-apples comparison to reveal the impact of some of the architecture changes from G80 to G92. Of note, the GT has fewer ROPs and a smaller memory bus, but gains 1:1 address/filter units and 16 more stream processors.
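
For what it's worth, here's a rough sketch of the paper bandwidth trade-off, using the commonly published reference clocks (treat the exact figures as my assumptions):

# Back-of-the-envelope GDDR3 bandwidth: bus width (bits) x effective data rate (MT/s)
def bandwidth_gb_s(bus_bits, effective_mts):
    return bus_bits / 8 * effective_mts / 1000  # bytes per transfer -> GB/s

print(bandwidth_gb_s(256, 1800))  # 8800 GT:      57.6 GB/s
print(bandwidth_gb_s(320, 1600))  # 8800 GTS 640: 64.0 GB/s
print(bandwidth_gb_s(384, 1800))  # 8800 GTX:     86.4 GB/s

So on paper the GT actually gives up bandwidth to the GTS while gaining shader and texturing throughput, which is why an apples-to-apples clock comparison would be so interesting.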

Also, I saw an early review that showed massive performance gains when the shader clock was raised on the GT - much bigger gains than from significant increases to the core/memory clocks. Similar testing with the GTS/GTX doesn't yield anywhere near that much gain when the shader clock is bumped up.

Lastly, any idea when the G92 8800 GTS refresh is going to be released? Coming from a 640MB GTS, an 8800 GT seems more of a lateral move, although a refreshed GTS with 128 SPs and all the other enhancements of G92 should undoubtedly be faster than the GTX... and maybe even the Ultra once overclocked.

Thanks for the reply.
This card looks to run pretty cool, and when not running 3D-intensive apps I'm sure power consumption and noise are really low.
So it might be nice to be able to play a little on a 52" LCD!

Also, if you go with a less powerful card for an HD HTPC, you'll want at minimum the 8600 GTS -- which is not a good card. The 8800 GT offers a lot more bang for the buck, and Sparkle is offering a silent version.

Nothing like cherry-picking the games... I don't understand why games like Stalker and Prey weren't tested, as the 2900 XT has superior performance on those titles, as well as others. Seems like a biased test.

Well, first, if G92 has those units disabled, then it can't claim them.

Second, NVIDIA would not confirm that the G92 as incarnated in the 8800 GT has units disabled, but it is fair to speculate that this configuration was chosen to work out yields on their first 65nm part.

Based on benchmarks and price, this card is finally in the sweet spot for me, which means I can finally ditch my ATI X300! I have only one question remaining, and it concerns the noise level. How does it compare to the 8800 GTS? Why was this omitted from your review?

This game seems really demanding. If it's getting 37fps at 1280 x 1024, imagine what the frame rate will be with 4X FSAA enabled combined with 8X anisotropic filtering. I think I will wait till NVIDIA releases their 9800/9600 GT/GTS and combine that with Intel's 45nm Penryn CPU. I want to play this beautiful game in all its glory! :)

But there were issues... not with the game; we just shot ourselves in the foot on this one and weren't able to do as much as we wanted. We had to retest a bunch of stuff, and we didn't get to Crysis.

Yes, I am glad that instead of purchasing a video card, I switched motherboard/CPU from AMD to Intel. I still like my AM2 Opteron system a lot, but the performance numbers and the effortless 1GHz OC on the ABIT IP35-E (at $90 USD!) were just too much to overlook.

I can definitely understand your 'praise', as it were, now that NVIDIA is lowering their prices, but this is where these prices should have been all along. NVIDIA and ATI/AMD have been ripping us consumers off for the last 1.5 years or so, so you will excuse me if I do not show too much enthusiasm when they finally lower their prices to where they should be. I do not consider this much different from the memory industry overcharging and the consumer getting the shaft (as per your article).