What's the point of playing games at resolutions larger than 1600x1050?

In fact, I'd say that 720p is probably the best resolution to play games at, because it tends to be easier to follow; pixels effectively move faster, and you get more precision and a smoother gameplay experience.

Exactly, some of us have panels with native 2560x1600. I _could_ game at some miserable 1600x1050 resolution, or I could play at my native resolution. I choose 2560x1600 and ignore all review results at inferior resolutions. Damn you Crysis, damn you!

Um...no. For one, HUD elements tend to shrink in physical size as resolution rises, meaning that games with a lot of HUD (WoW comes to mind) benefit by letting you see more of what's going on, which means that 720p is pretty friggin' awful. For two, 1920x1080 has become the standard for most monitors over 21" or so, and a lot of gamers get 1920x1080 displays, especially if they're also watching 1080p video or doing significant multitasking. Non-native resolutions look like ass, and as such, 1600x1050 is right out as you won't want to play at anything but 1920x1080.

Now, you can say that there isn't much point going above that, and right now, that may be so as cost is pretty prohibitive, but that may not always be the case.

What good is it if you can't buy it? Nvidia cherry-picked the GPUs for this card, and they could only release a little over 1000 units. It is now sold out in the US and available only in limited quantities in Europe.

Couldn't agree more, but since we know from the 1.5GB 580 that the nVidia cards do poorly at higher resolutions, AnandTech will probably never test any such setup. Expect 12x10 instead, as nVidia tends to do better at low resolutions than AMD. 19x12 is already irrelevant with these cards.

I don't understand this either. There is no need for anything fancy, heck you don't even need to have them actually outputting anything, just fool the drivers into THINKING they are driving multiple monitors!

I don't entirely agree. While it doesn't matter much for the simple average-FPS benches Anandtech is currently doing, those fall well short of the maximum-playable-settings testing done by sites like HardOCP.

I would have thought that the marketing departments of companies like Asus, BenQ, Dell, Eizo, Fujitsu, HP, LaCie, LG, NEC, Philips, Samsung and ViewSonic would cream their pants at what is really very cheap PR.

Supply sets of 3 or 5 1920x1080/1920x1200 displays and 3 or 5 2560x1440/2560x1600 displays in exchange for at least a full year's advertisement on a prominent tech website.

If we use Dell as an example, they could supply a set of five U2211H and three U3011 monitors for a total cost of less than 5,900 USD per set. The 5,900 USD is what us regular people would have to pay, but in a marketing campaign it's really just a blip on the radar.

Now, excuse me while I go dream of a setup that could pull games at 9,600x1080/5,400x1920 or 7,680x1600/4,800x2560 :D
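For anyone checking the arithmetic behind those dream setups, here's a quick sketch; the panel counts and resolutions are just the ones from the comment above, and the helper function is made up for illustration:

```python
def combined_resolution(n, w, h, portrait=False):
    """Total desktop resolution for n identical panels placed side by side."""
    if portrait:
        w, h = h, w  # each panel rotated 90 degrees
    return n * w, h

# Five 1920x1080 panels: landscape vs. portrait
print(combined_resolution(5, 1920, 1080))                 # (9600, 1080)
print(combined_resolution(5, 1920, 1080, portrait=True))  # (5400, 1920)

# Three 2560x1600 panels: landscape vs. portrait
print(combined_resolution(3, 2560, 1600))                 # (7680, 1600)
print(combined_resolution(3, 2560, 1600, portrait=True))  # (4800, 2560)
```

The numbers line up with the comment: five portrait 1080p panels really do give 5400x1920, and three portrait 30-inchers give 4800x2560.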

I'd just like to note that advertising is handled separately from editorial content. The two are completely compartmentalized so that ad buyers can't influence editorial control. Conversely, as an editor I can't sell ad space.

Well, there is advertising and advertising. Asking multiple monitor manufacturers to supply sets of screens in order to test them for multi-monitor setups, where the best sets will be used for a longer test period to see how they hold up over time, doesn't interfere with your journalistic integrity, but for the manufacturers the cost can be written off as PR/advertising costs. In the end your readers will get the information they want, your integrity will be intact, and we will also find out what the best monitors are for multi-monitor setups. Shouldn't be that hard now, should it?

Once you are dealing with framerates above about 80fps at the highest resolutions and quality settings, multi-monitor is one option, or it is time to start cranking up the anti-aliasing as the results are otherwise irrelevant.

With one card doing 100fps and another managing 140fps in a game, the best card for that game is whichever is cheapest, even if only slightly, unless you can do something to bring the framerates down in a useful way; higher AA than 4x is the way to go.

I agree as a general principle; however, 8x AA really doesn't do anything for IQ. I'd rather turn on SSAA/TrAA/MLAA, but AMD and NVIDIA do not have a common AA mode beyond DX9 SSAA. So for DX10/11 titles, MSAA is still the lowest common denominator.

100% agree. I own an AMD card too, and I felt like AnandTech was extremely positive about the 5800 series when I bought that card.

It also looks like they are leaning towards the 6990 vs. the 590 in this very article. To paraphrase: the 6990 is going to be better for current new games and future games, but the 590 seems to do a bit better with older titles and new games played at lower resolutions.

I don't understand how anyone could think that makes them Nvidia-biased. It boggles the mind.

And for your question I would like to answer with another question: why buy a faster CPU if you don't need one at the moment? Granted, there are no limitations like DX compatibility, but you can still use your graphics card for quite a while. I'm still using a 3870X2 card (3 years now), mainly because there are so few DX11 games.

For the "hype" they were throwing out with that video preview, I thought they were going to pull a fast one and launch a GeForce 6-series. But a teaser video for a super-enthusiast GPU using existing tech that'll be so rare I'll probably never physically look at one in my lifetime?

... are you guys not testing these cards at multi-monitor resolutions? You said yourself in the review that these are marketed towards the high end. Well, the high end isn't just 30" monitors. Multi-monitor setups offer higher resolutions, and I think it's important to know how well these dual-GPU monsters perform at the level they are marketed towards.

I got sick of buying the latest video card when they hit $399 years ago. Around 60fps you really don't notice any difference in speed, so getting 100fps or some other number doesn't do it for me anymore.

That just goes to show that you play the wrong games, then. The new top-of-the-line games really can push the $400 cards fairly hard at 1920x1080 and full details. With DirectX 11 support, these new games really push the limit. Then you have things like Eyefinity, driving 5760x1080, and you want more than a $200 card.

Yeah I don't get it either. The last review of the 6990 was fantastic with how it was color-coded. Now you have a DIRECT competitor in both price and performance and the 6990 is never highlighted in the charts!?!? It made it a real PITA to always go hunting for the 6990 bars when they should have been labeled from the get-go.

I've changed Ryan's colors. Green is for the stock and EVGA clocks, yellow is for OC 590, red for 6990, and orange for 6990 OC. For the color blind, I apologize but using green/red for the primary cards is pretty standard.

"After talking to several other reviewers, this does not seem to be an isolated case, and many of them have killed their cards with similar testing, which is far from being an extreme test."

"I most strongly advise anyone to stay away from overclocking this product and use extremely conservative settings, maybe up to 650 MHz and no voltage adjustments."

Looks like it is no OC champ.

For high res action, H got their review up. 6990 even beats the 590 at Civ V. Conclusion is damning too:

"We truly thought the GTX 590 was going to make the Radeon 6990 look bad, but the fact of the matter is that NVIDIA made the 6990 look that much better. The GTX 590 is not the "World's Fastest Single Card Solution" as stated on our page 1 slides; the Radeon HD 6990 is very much retaining that title. Hail to the King, baby! "

2x 6950 on water looks to be my next buy. Must buy wider desk though :-)

Fanboy much? Anyone who is an AMD/ATi fanboy shouldn't be crowing about nVidia driver quality. That's one reason I left AMD and went full nVidia-only GPUs after a 4870 purchase. nVidia cards also play a lot nicer on Microsoft FSX, but not everyone cares about that just like not everyone cares about Civ5.

Yup. But who doesn't expect voltage tuning today? Read the back of the ASUS box: "50% faster with voltage tuning." Nice way to say: This card can't be oc'ed. It will catch fire. But of course, you are overvolting so it is your own fault.

But the only reason I posted it was to warn people who invest $700 in a card and have it fry on them. I'm not arguing about who should take the blame.

Thanks, man. Already have one EVGA 580. The only reason I was considering it was the Step-Up program EVGA offers (a 590 for around $200). I have until the first part of June to think about it, but I'm leaning towards adding another 580 once the price comes down in a year or two.

Crysis: Warhead will always be the benchmark. Crysis 2 isn't nearly as demanding. It's been dumbed down for consoles, in case you haven't heard. There are no advanced settings available to you through the normal game menu. You have to tweak the CFG file to do so.

I thought like you, originally. I was thinking: Crysis 2 is gonna set a new bar for performance. But in reality, it's not even close to the original in terms of detail level.

I know I can be annoying, but I checked it myself and the built-in benchmark in Civ V really favours nVidia cards; in a real-world scenario the situation is almost the opposite. I got a reply from one of the reviewers last time saying it provides quite accurate scores, but I thought that just for the heck of it you'd try it and see for yourself that it doesn't at all. I know it's just one game and that benchmarking is slow work, but to keep up the good work you're doing, you should at least take the advice and confront the benchmark with reality, to stay objective and provide the most accurate scores that really mean something. I don't mean to undermine you; I find your articles to be mostly accurate and you're doing a great job. Just use the advice to make this site even better. A lot of writing for a comment, but maybe this time you will see what I'm trying to do.

"It really doesn’t seem like it’s been all that long, but it’s been nearly a year and a half since NVIDIA has had a dual-GPU card on the market. The GeForce GTX 295 was launched in January of 2009, the first card based on the 55nm die shrink of the GT200 GPU."

Well, shit. I thought Jan 2009 was TWO and a half years ago. I MUST GET BACK TO THE FUTURE!

"....However the noise results are nothing short of remarkable – if NVIDIA can dissipate 350W+ of heat while at the same time making 5-7dB less noise, then it starts to become clear that AMD’s design has a serious weakness. The ultimate question is what did NVIDIA do right that AMD did not?...."

I can't see anyone tolerating the noise level of the 6990. But the 590 is barely noisier than a 580. So an easy win for nVidia if you really need/can afford one of these monsters.

Ya, even the 6970/6950 are hot cards. Very disappointing after the very cool and silent 5870. I think AMD had a problem with the chips and never intended for them to run so hot. Maybe they had to crank up the power to get them to run right? idk...

Major bias with the OC listings in the charts. As if including the OC version weren't enough, a 20+% overclock is listed against the other cards' standard configs, and the 6990 only gets its lousy flip-switch OC mode, not the ~940/1400+ that other sites have attained, which offers 6970-CF-plus performance. Why don't they show 5870/6950/6970 CF and 470/480/570/580 SLI etc. with appropriate 20%+ overclocks to put these cards in their place, especially on price vs. performance? The 2GB 6950s can also be flashed into 6970s; not bad CrossFire for the price.

The second fastest single card out there... but still a beast, and it's kept its idle power within reason. I think it's time for 28nm tech ASAP, as the carbon taxes on these bad boys will be horrendous... lol

When you do the 3x monitor review, can you please include last generation's top-end card (5970) for comparison? Eyefinity and co. is really where it's at with these monster graphics cards, and in my experience the 5970 just doesn't have the horsepower to play well at 5760x1200. I would really like to see how much difference these new cards and their increased RAM actually make.

50% more performance than last generation at 2560 is OK, but do they pull even further ahead at higher resolutions?

Ryan, I commented last week on the 550 review. Just to echo that comment here: how are you getting the "nodes per day" numbers? Have you considered switching to a points per day metric? Very few people can explain what nodes per day are, and they aren't a very good measure for real world folding performance.

(also, it seems like you should double the number for this review, since I'm guessing it was just ignoring the second GPU)

Last year NVIDIA worked with the F@H group to provide a special version of the client for benchmark purposes. Nodes per day is how the client reports its results. Since points are arbitrary based on how the F@H group is scoring things, I can't really make a conversion.

Good to see that a $700 card finally has a decent cooler! Why would somebody spend $700 and then have to spend another $40 on an aftermarket cooler??? Nvidia and AMD really need to just charge $750 and have an ultra-quiet card; people in this price range aren't gonna squabble over an extra $50, for Pete's sake!!!! It makes no sense to skimp on the cooler at this price. This is the top of the line, where money isn't the issue!

Let's get this straight, nVidia. Slapping two of your existing GPUs together does not make this a "next-generation card". Saying that you've been working on it for two years is also misleading; I doubt it took two years just to lay out the PCB to get two GPUs on a single board.

SLI and Crossfire still feel like kludges. Take Crysis 2 for example. The game comes out, and I try to play it on my 295. It runs, but only on one GPU. So I go looking online; it turns out that there's an SLI profile update for the game, but only for the latest beta drivers. If you install those drivers *and* the profile update, you'll get the speed boost, but also various graphical corruption issues involving flickering of certain types of effects (that seem universal rather than isolated).

After two goes at SLI (first dual 285s, next a 295), I've come to the conclusion that SLI is just not worth the headache. You'll end up dealing with constant compatibility issues.

For the record, I've had three goes at CrossFire (2 x 3870, 4870X2, and now 2 x 5850). I'm equally disappointed with day-of-release gaming results. But, if you stick to titles that are 2-3 months old, it's a lot better. (Yeah, spend $600 on GPUs just so you can wait two months after a game release before buying....)

"For the GTX 590 launch, NVIDIA once again sampled partner cards rather than sampling reference cards directly to the press. Even with this, all of the cards launching today are more-or-less reference with a few cosmetic changes, so everything we’re describing here applies to all other GTX 590 cards unless otherwise noted.

With that out of the way, the card we were sampled is the EVGA GeForce GTX 590 Classified, a premium GTX 590 offering from EVGA. The important difference from the reference GTX 590 is that GTX 590 Classified ships at slightly higher clocks—630/864 vs. 607/853.5—and comes with a premium package, which we will get into later. The GTX 590 Classified also commands a premium price of $729."

Are we calling overclocked cards "more-or-less reference" cards now? That's a nice way to put it, I'll use it the next time I get stopped by a police officer. Sir, I was going more or less 100mph.

Reference is ONE THING. It is the basis and does not waver. Anything that is not it is either overclocked or underclocked.

Bad example, as in the US at least your speedometer is only required to be accurate within 10%, meaning you can't get ticketed at less than 10% over the speed limit. This card is only overclocked by 4%. More importantly, they a) weren't sent a reference card, and b) included full tests at stock clocks. Would you rather they not review it since it isn't a reference card?

Maybe reject the card, yes, but that is not going to happen. Nvidia is just showing who is boss by sending a non-reference card. AT will have to swallow whatever Nvidia feeds them if they want to keep bringing the news.

Can you explain how you can nullify the modified/improved cooling system of the EVGA compared to reference cards?

A reference card at stock voltages/frequencies may still run worse than this EVGA card downclocked, for example because the voltage regulators on the reference card may heat up more and throttle the card more often.

Otherwise... I'm not an ATI fan, but it's painfully obvious this review isn't focusing on things that make this card look bad, like playing on 2 x 1920x1080 monitors or something like that.

I hate coming to AnandTech sometimes because so many of the comments are from wannabe editors. It's annoying and takes away from the excellent content.

Why not have an "email correction" button vs taking it out in the comment section? It's weird, I don't see this anywhere else - you guys must be a particularly anal group...

To go back O/T, I miss the days of powerful sub-$250 graphics cards. There's a market for it but all the action's on the high end. Remember when the affordable TNT2 would play every recent game at playable framerates?

For me, the gaming differences between AMD and Nvidia at this level would not get me to change allegiance. In my case I bought two EVGA Classifieds, but they may never see a game; I use them as low-cost replacements for Tesla cards. For parallel processing I held off until we had powers of two, i.e. 256, 512, 1024 cores. I did not like the 240, 432, 480 alternatives; it's just easier to work with powers of 2. The lower noise and lower power were big winners. Even if the power supplies can handle them, you still have to get the heat out of the room; my 4 nodes of 1024 CUDA cores each are a pain to air-condition in my den. I really love you gamers: you buy enough GPUs to keep the competition going and lower the price of impressive computational power. Thanks, all!

I thought you guys learned your lesson the last time you pitted a factory-overclocked reference nVidia card against a stock AMD card. If you're going to use an overclocked card in an article that's meant to establish the performance baseline for a series of graphics cards (as this one does for the GTX 590 series), then you really should do at least one of:

1. Downclock the overclocked "reference" card to stock levels.
2. Overclock the competing AMD card by a comparable amount (or use a factory overclocked AMD card as the reference point).
3. Publish a review using a card that runs at reference clocks, and then afterwards publish a separate review for the overclocked card.
4. Overclock both cards to their maximum stable levels, and use that as the comparison point.

Come on now, this is supposed to be AnandTech. It shouldn't be up to the readers to lecture you about journalistic integrity or the importance of providing proper "apples to apples" comparisons. Using a factory overclocked card to establish a performance baseline for a new series of GPUs is biased, no matter how small and insignificant the overclock may seem. Cut it out.

Anybody else remember the era when Pentiums just kept getting bigger and hotter every year? I wonder when they'll start making GPUs smaller, cooler, and quieter like they finally did with CPUs.

No offense, AnandTech, but this card is aimed squarely at bleeding-edge consumers, much like the AMD 6990. To that extent, any video card can only add as much performance as the CPU side of the system can deliver, and the base test system is a basic rig, nothing spectacular now. The GTX 590 and the AMD 6990 would both perform better, and your results would say more about the limitations and capabilities of those cards, if they were run on the platforms they are targeted for. An example of what I mean: Tom's Hardware used a test platform based on an Intel i7-990X overclocked to 4GHz paired with an Asus Rampage III Formula motherboard, vs. AnandTech's older i7-920 clocked at 3.33GHz paired with an Asus Rampage II Extreme. In Tom's Hardware's review, Nvidia's GTX 590 and AMD's 6990 both performed far better than on Anand's rig, but still similarly overall. Personally, I think I'll stick with my water-cooled SLI GTX 580 OC setup for performance and get a CPU upgrade instead; neither the 6990 nor the 590 in any configuration is worth the expense for the minuscule gain on the graphics side.

Another important review from AT, another biased review from AT. GRRR.

- AT chooses NOT to overclock the HD 6990 BUT presents un-overclocked results as "HD 6990 OC". Yeah, it could embarrass our masters if AMD's built-to-overclock card were shown to actually be overclockable.

- "The GTX 590 simply embarrasses the 6990 here; it’s not even a contest." Yeah, 4dB is no contest, an embarrassment, of course. It is AMD's card after all. (It is louder, no question there.)

PR mercenaries at their best. Let's brace ourselves for another round of PR warfare when BD and Llano launch...

Don't you know that sound intensity goes up by a factor of 10 with a 10dB increase? Do the math, and you will find the 6990 is about twice as loud as the 590. Indeed, it's no contest. You can check out Linus Tech Tips' video review on YouTube; the 6990 is definitely much, much louder than the 590.

The article itself isn't biased. The 6990 and 590 are in a similar win-some-lose-some situation, just like most cards at similar price points (570 vs. 6970, 560 vs. 6950, 460 vs. 6850, etc.). Darn, it's impossible to have a real card king these days when both NV and AMD are paying developers for optimization.
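For anyone who wants to check the decibel math in the comment above: sound intensity scales as 10^(ΔdB/10), while perceived loudness is commonly approximated as doubling every 10 dB. A quick sketch (the 5-7 dB gap is the range quoted in the review, and the loudness doubling rule is only a rule of thumb):

```python
def intensity_ratio(delta_db):
    """Ratio of sound intensities for a given dB difference."""
    return 10 ** (delta_db / 10)

def loudness_ratio(delta_db):
    """Rough perceived-loudness ratio (doubles every 10 dB, rule of thumb)."""
    return 2 ** (delta_db / 10)

# A 10 dB gap: 10x the intensity, roughly twice as loud
print(intensity_ratio(10))  # 10.0
print(loudness_ratio(10))   # 2.0

# The upper end of the 5-7 dB gap quoted between the 6990 and 590
print(round(intensity_ratio(7), 2))  # 5.01
print(round(loudness_ratio(7), 2))   # 1.62
```

So a 7 dB gap is about 5x the acoustic intensity but "only" around 1.6x the perceived loudness; "twice as loud" is roughly right once the gap approaches 10 dB.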

Interesting article; however, it would be useful to know what power supplies you recommend.

I am interested in using a pair of these cards to run an OpenCL application flat out, and I want to know if my Tagan 1100W supply is sufficient; it's running a GA890-GPA-UD3H/Phenom X4 955 with two unlocked 6950s at the moment.

Looks like it might just do it with a whisker to spare, but I'd like to avoid expensive pyrotechnics.
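A rough sanity check on that 1100W question. The wattages below are back-of-the-envelope assumptions, not measurements: ~365W board power per GTX 590, ~125W TDP for a Phenom II X4 955, and a guessed ~100W for motherboard, RAM, drives, and fans:

```python
# Back-of-the-envelope PSU headroom estimate.
# All wattages are rough TDP/board-power assumptions, not measurements.
loads = {
    "GTX 590 #1": 365,
    "GTX 590 #2": 365,
    "Phenom II X4 955": 125,
    "Motherboard, RAM, drives, fans": 100,
}

psu_watts = 1100
total = sum(loads.values())   # 955 W estimated load
headroom = psu_watts - total  # 145 W to spare

print(f"Estimated load: {total} W, headroom: {headroom} W")
```

"A whisker to spare" indeed: a sustained OpenCL workload can hold both cards near their power limit for hours, so that ~145W margin also has to cover PSU efficiency droop and capacitor aging.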

I'm not really into giving all my money to the same manufacturer every year just to buy a new video card that does nothing except increase FPS by 3-10 fps; sometimes it doesn't increase it at all. If what you have already owns everything out there, I don't see the point. I only see the point if what you have DOESN'T do what you want it to do. My AMD 6870 is running every game I've played so far maxed out, without issue, at the highest resolution. We are in the console era.

Unless you're playing ARMA 2 or Total War: Shogun 2, but even then that's more CPU-related, and an AMD 6870 combined with a high-end CPU pretty much gets the job done at high res. NOT 30", though. But my eyes are screwed up enough from staring at a monitor that I don't even want to go to 30".