But on a more serious note, nice update. I love reading AT's articles! :)

By the way, was there a reason the original article was posted at exactly 9:00 and the SLI update was posted at 10:00? I looked at a few other articles by you and others and most were not posted exactly on the hour. Just curious, thanks! :)

It's nothing in particular. When I put an article stub in the system it gives the article the time the stub was created, which means when I publish it the system calls it several hours old when it's not. So I rewrite the article's time to match up with the current time or something close to it.

For the GTX 580 article the time listed on it is when the NDA expired and it went up, while for this article it went up a few minutes after the hour so I just put it down on the hour. I could have published this article at any time.

Well you posted this in the original GTX580 review as well. Thought I'd reply once again here.

The GTX480 minimum framerates, as high as they are at the lower resolutions, are likely a measurement error or anomaly. One only needs to look at the GTX470 to compare. There is no good reason that the GTX480 should outpace the GTX470 by 100% at 1680x1050.

Ok I see you were referring to the SLI results. I looked in the single-GPU review.

Anyway, 2fps out of 55fps is kinda within the margin of error. And even in that Crysis minimum framerate graph, the single-GPU GTX580 is faster than the single-GPU GTX480. It looks like all the dual-GPU cards are starting to get bottlenecked by the CPU/non-GPU parts of the system, which is why the results are so clumped together.

As for the Techgage results, I am more than a little skeptical. As explained before, there is no reason why a GTX480 should beat a GTX470 twofold, let alone a GTX580. That super-high minimum framerate may very well be a bug. It's the only video card in Techgage's graph that's far and away faster than anything else. It doesn't make sense at all.

And I believe you are doing a disservice by being so fixated on this minimum framerate issue without applying basic statistical analysis to the problem.
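To make the margin-of-error point concrete, here's the kind of basic statistical check I mean, sketched in Python. All the framerate numbers below are made up for illustration; they're not taken from any review:

```python
# Flag a minimum-framerate result as a probable outlier if it sits more
# than ~2 standard deviations away from the rest of the field.
import statistics

def is_outlier(value, peers, threshold=2.0):
    """True if `value` is more than `threshold` standard deviations
    from the mean of the peer results."""
    mean = statistics.mean(peers)
    stdev = statistics.stdev(peers)
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold

# Hypothetical minimum framerates at 1680x1050 (illustration only):
peers = [31.0, 33.5, 30.2, 32.8, 34.1]  # other cards in the field
print(is_outlier(62.0, peers))  # a 2x result gets flagged: True
print(is_outlier(32.0, peers))  # a 2fps spread does not: False
```

A 2fps gap sits well inside the noise, while a result twice as fast as everything else gets flagged, which is exactly the GTX480-vs-GTX470 situation being questioned above.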

I only read the reviews here for entertainment purposes. I enjoy seeing how the reviewers struggle to balance favoring Nvidia over AMD against not showing their bias blatantly for everyone to see. However, they seem to be less and less successful at it...

You have managed to find two GTX 590s pretty much at launch date but couldn't get hold of a second 5970, which was released a year ago? Really? Well, on the other hand it's understandable: since Anand doesn't take any money from any company, it might just be over their budget. Remember, this is a high-end part!

And do you remember that huge article from a while back showing the very poor scalability of multi-GPU solutions when you get into three- and four-way GPU setups? I'm sure dual 5970s are not included because the performance would be severely unimpressive (not to mention all the screen tearing and the like) for the money you're throwing at it.

That has NOTHING to do with the fact that they did not run HD 5970 CrossFireX; they "only" have one. I think AT needs to get another ASAP and test it too, and stop ignoring AMD's top-of-the-line solution.

I've never had to install game-specific drivers to take advantage of SLI in the games I play, and I've had an SLI rig for nearly three years (2x 8800GT). I just update my vid drivers once every few months. It's true that there are often performance tweaks for individual games in a given driver version, but I've never found a game that just doesn't work under SLI with whatever driver version is at hand. When did you last try?

As for Crossfire... my "guest" PC is all-AMD (Athlon II X4 620, 4870, 785 chipset) and is a fine machine. Every time I consider going Crossfire on that rig, I check the various tech sites and game support sites, and see issues with Crossfire being reported far more frequently than with SLI. This points to a situation that has existed for a while: AMD makes faster hardware for the money, but nVidia overall does a better job with drivers, particularly in multi-GPU scenarios, and, from what the game developers I know tell me, seems more interested in working closely with game devs.

Which is why when friends ask me about gaming builds, my usual answer (depending upon the products both vendors have at the time) is "Single vid card, go with AMD; dual vid card, go with nVidia." There have been exceptions: the 8800GT in its day was just plain the best, and the GTX 460 until very recently was also the best single-GPU solution in its price bracket. The overall trend seems pretty steady with regard to single-GPU vs. multi-GPU, though.

Who drops in another card after 2 years if there is a new card available that's not only about 100% faster but also brings new features to the table (e.g. DirectX 11, Tessellation, Eyefinity, etc.)?

Um, I got two 5850s for less than the price of a single GTX 580, which they consistently outperform. Dual-GPU is a legitimate solution, in the short term at least.

You're right that it becomes a less sensible option after a fair amount of time, assuming the tech has moved on significantly. However, expect PC GPU tech to stagnate for a while (as evidenced by the very marginal improvements displayed by the 6xxx and 5xx series), at least until the next round of consoles are out.

You DO know that the marginal improvements from 58xx to 68xx stem from the fact that the new top-of-the-line 69xx is yet to be launched?

Yes, GPU tech will stagnate because all they have to master are some third-rate console ports that only turn out so few fps because the process of porting them over to the PC is done as quickly and cheaply as possible?

If there were such a thing as a native PC game anymore, you would probably see all those DX11 features put into practice.

Right now it's simply ridiculous. An HD4870 or a GTX580 will play any console-ported crap you throw at it... more performance has become irrelevant, as there is no game to demand it.

Oh, there is Crysis, right. And it's been out since when, exactly? I'm really not in the mood to pick up this vegetation benchmark in disguise and look at it again...

And then there are games that run at 200+ fps instead of 60+ fps. *yawn* Please wake me up when you reach 500+ fps with your GTX580 SLI so I can walk over to my bed for some real deep sleep...

Yep, 8800 GT SLI does run rather well, though as my results show they fall behind for newer games at higher res/detail.

Summaries I've posted elsewhere show that if one is playing older games at lesser resolutions, then using a newer card to replace an older SLI setup (or instead of adding an extra older card) will not give that much of a speed boost, if any (look at the 4890 data vs. 8800GT). For older games, newer cards only help if one also switches to a higher res/detail mode. Newer cards' higher performance is focused on newer features (e.g. SM3, etc.); performance levels for older features are often little changed.

I know this is a bit off topic regarding the GTX, but as someone who recently received a GTX480 as a gift, and has no space in the case for a 3-slot air cooling solution, I was wondering if it would be possible to fit the new GTX580 cooling solution on a GTX480.

If this is possible, it would be great if you guys could retest the temperatures, noise and power draw for such a modded GTX480.

I see no reason why you would want to keep CIV5 in the benches, since it's obviously not reliable for testing graphics card speeds. I would like you to add another strategy game in its place, something like the Total War series, you know, with large-scale combat and all that good stuff.

And why isn't there a StarCraft 2 benchmark? It's basically one of two PC-only titles, and yet you don't include it?

While it's clear from this review, as well as others, that the GTX 580 is a significant improvement over the GTX 480 in power usage, temps and noise, I realized that what I've actually taken away from reading it is how totally awesome the Radeon 6870 is. In CrossFireX especially.

I reckon that's not quite what Nvidia had in mind but meh, it seems the absolute extreme-end cards still aren't a winning proposition.

Aside from the odd issue with minimum frame rates, Barts is a notable improvement over Cypress for Crossfire. If I were to paraphrase a quote from the 460 launch, this would be Evergreen done "right", at least in terms of multiple-graphics-card setups: better scaling and better power usage essentially means less waste, and it's not as if Evergreen was hungry to start with. Just a shame that AMD has effectively limited Barts to two cards at once.

As for the 580: very well done. However, I think we're going to have to wait for another redesign before power usage is properly tackled.

It would be nice if you could include 5770 CF scores in the future. I have a Sapphire Vapor-X running at 960/1350 and it's very quiet and performs OK, and I could get another one for about $140 on Newegg. The 5770 has similar or faster tessellation performance to that of the 5870 (due to faster clock speed), and in CF has similar shading/texturing abilities. It would be nice to see how two of these at stock speeds compare to the other cards. Overall this is a great article, as always!

Look how quickly he did an SLI review when the 580 was just released yesterday, yet when the 6870 and 6850 were released there wasn't any Crossfire review the next day. He could also have done an overclocked 460. Sometimes I wonder why AMD even sends you guys cards to review.

Well, the most simple explanation for these different heat results is that the review part was cherry-picked (like they normally are, so that you can get better overclocking results, etc.) and the Asus one is the normal version that you can normally expect to get. When the production run gets better, I think we will see more chips that are like the early cherry-picked one...

That's a slippery slope to start going down. Ultimately, it's up to Anandtech to decide what they can/can't and will/won't review, not the whims of some chipmaker (even though there are only two chipmakers these days).

Reviewing a pre-built gaming system takes a lot of time and effort to do properly, and anything less degrades the reputation of the review site, even if it's just to jump through hoops for some snooty vendor: "If you're willing to jump through THIS hoop to do what we want here, can we withhold a new card from you on the next launch unless you give us complete editorial power over that review?"

Ultimately, the separation between the review site and the vendor is extremely important to help ensure unbiased, thorough reviews (something that I've appreciated from Anandtech over the years) complete with "here's a bunch of numbers, but what does this all mean in context" (aka "analysis"). Part of that is not bowing to a vendor's wishes (outside of a standard and acceptable product launch NDA).

I'd like to request that y'all start testing Eyefinity and Vision Surround setups and resolutions for the games that support them. Granted, there is a vast difference in how it is supported in hardware, but it'd be nice if any hardware capable of this could be tested.

With the GTX 580, 5970, and the 68xx series as exceptions, I've run every other config on that GPU load temp chart, and have lowered temps drastically using a combination of a mobo that allows card spacing, a proper case, decent airflow, TIM replacement, and an Afterburner fan profile. I'm not talking jet engine; there's a tolerable middle ground with any card/system.

As an example, I have GTX 470s right now loading at 70-73C (both cards) in Crysis and Metro 2033, and GTX 480s loading at 78-81C. I don't game on FurMark, so it's irrelevant.

It's a shame to see such high numbers knowing that reviewers have done ZERO to find a tolerable medium. I have respect for what they do; but geez, people around the world see this and take it as gospel. You could at least show that temps can be improved.

They must be under new management as of this year, because the best deals there these days are on watches and coffee makers, LOL. Seriously, when Best Buy starts to beat you on components and not just on laptops (BB has been beating Newegg on laptop prices for at least 2 years now), something is wrong. Newegg is not the company we have grown to love anymore, and it's sad. I haven't bought anything there this year (but continue to buy tech items)... and this comment made me laugh: "When we first saw Newegg post their GTX 580s for sale our jaw dropped as they were all $50-$80 over NVIDIA's MSRP" and "However after checking out MWave, Tiger Direct, the EVGA Store, and others, we saw at least 1 card at MSRP at each store." So it's official: NEWEGG = FAIL.

Looking to upgrade a folding rig, and I have been watching the new Nvidia cards with interest since the 4xx rolled out. I read an early review of the 580, and one thing mentioned was that something like 380 million circuits (?) were disabled to get the heat down and gaming perf up. If the main concern is folding perf, heat and power consumption be damned, would the full-bore, un-gimped 480 be theoretically better for this? It seems that the price currently for either version "superclocked" is somewhat close, so $ isn't really a dealbreaker; I just want the best perf. Thoughts?

I think the last page shows that the Stream Processors perform the same whether in a 580 or a 480. As Ryan pointed out, Folding and SmallLux performed 6-7% higher, the same increase in SPs that the 580 has. To get an index of performance for any Fermi part, just multiply the number of SPs by the core clock.
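For what it's worth, that rule of thumb is easy to sanity-check. In the sketch below the SP counts (480 vs. 512) are the real figures; the shared clock value is just a placeholder to show that at equal clocks the index scales purely with shader count:

```python
# Rule of thumb from the comment above: index = number of SPs x clock.
def fermi_index(shaders, clock_mhz):
    return shaders * clock_mhz

clock = 1400  # placeholder clock in MHz, same for both cards
gain = fermi_index(512, clock) / fermi_index(480, clock) - 1
print(f"{gain:.1%}")  # 6.7%, in line with the Folding/SmallLux gains quoted
```

At identical clocks the clock term cancels out and the index ratio is just 512/480, i.e. the ~6.7% compute gain mentioned above.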

The problem is that not only is "oh, we didn't have a second 5970" (a card that has been out for how many months?) a rather strange excuse, but it still doesn't justify claiming nVidia set a "new record" in a dual config.

Assuming we don't drop it entirely the next time we refresh our suite, we'll probably go to all Enthusiast for at least 1 resolution. Up until now, only a couple of cards have been fast enough to run it at anything resembling a playable framerate.

Let's wait and see. If the model numbering is as skewed as with the 68xx series, the 69xx series may not be as fast as you think... though the dual-GPU card should perform better thanks to the improved Crossfire performance of the new cards. Again, let's wait and see. However, as two 6870s are quicker and less power-hungry than one 580, and noting that the 5970 uses much less power than two 5870s (granted, it's technically two 5850s, but that doesn't explain all the difference), AMD could have a good dual-GPU card on the way.

They are unlikely to do so. The 5970 is being phased out of production completely to make way for the 6000 series version that is likely to come soon or to encourage people to instead buy 2x 6870s. Not to mention it's a 700 dollar card.

I don't see what the big deal is. Yes, most likely 2x 5970s will beat 2x 580s.

If that's all you want to hear: there it is. Just picture an added graph bar with higher fps above the 580 SLI in your mind. It works just as well.

There's really not much point in trying to prove it at this time as ATI doesn't intend to keep selling them much longer. They have no incentive to push it.

Again, for most of the 5970s currently being sold, buying two would cost someone $1360, whereas buying two 580s would be $1000, so it's expected that the 5970s should have more total power. Switch this up to 3x or 4x 580s and both price and performance probably go higher for the 580s.

The 6870 CF and 470 SLI are still the BEST bang for the $$$ here when you consider the performance and money spent. You get better performance than either AMD's or Nvidia's TOP cards, and at less money than either single-card solution. Here's hoping that AMD's new cards drive down prices even more! Competition is great.

Since the 580 would be more bandwidth-bottlenecked than the 480 at the same clock speeds, the bandwidth of the 580 should also be 6.67% higher than that of the 480, in order to keep it in line with the 6.67% more shaders and TMUs.

Then we could subtract exactly 6.67% from the overall performance gains, to compare it directly to GTX 480.

I've averaged the normalized gains (excluding Civilization V, since it's CPU-bound) and subtracted 6.67% from the average, plus 2% for the bandwidth penalty, for a grand result of 3%. The 2% penalty is just an estimate of how much it would have gained with 6.67% more bandwidth (with the memory running at 3942MHz to keep the bandwidth perfectly linear with the core muscle). If a 2% penalty is still too much, please also consider the ROP penalty, as there are still only 48 ROPs for the full 512 SPs, compared to 48 ROPs for 480 SPs; but then the ROP penalty should be very slight, given that 48 ROPs are already plentiful and hardly ever reach 100% usage with 4x AA.

The grand result of 3% is a bit lackluster for doubled FP16 texturing power along with minor z-culling and other architecture optimizations.
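For anyone who wants to follow the arithmetic, here it is spelled out. The ~11.67% average gain is the figure implied by the numbers above (3% + 6.67% + 2%), and 3696MHz is the GTX 480's stock effective memory clock:

```python
# Average 580-over-480 gain, minus the extra-unit scaling and the
# estimated bandwidth contribution, leaves the architectural improvement.
avg_gain = 11.67        # % average normalized gain (implied: 3 + 6.67 + 2)
unit_scaling = 6.67     # % extra SPs/TMUs: 512/480 - 1
bandwidth_guess = 2.0   # % estimated gain from matching bandwidth
architecture = avg_gain - unit_scaling - bandwidth_guess
print(round(architecture, 1))  # 3.0

# The "matched" memory clock mentioned above: scale the GTX 480's
# 3696MHz effective memory by the same 512/480 unit ratio.
print(round(3696 * 512 / 480))  # 3942 (MHz effective)
```

Note the 2% bandwidth figure is the poster's own estimate, not a measured number, so the 3% residual should be read as rough rather than exact.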