I wish there were a separate button to point out this sort of thing so they could silently correct it. Don't get me wrong, I think it's good to have accurate information; it just clutters things up a bit.

If you look at accumulated benchmarks across the web, the Nvidia 680 cards beat the AMD 7970 cards by a much higher percentage at 1920x1080 (17.61% ahead) than they do at 1920x1200 (10.14% ahead). This means AnandTech always tests at 1920x1200 to give the AMD cards a prettier-looking review, instead of testing at 1920x1080 (the most commonly available 1920-wide resolution, which they could easily set their 1920x1200 monitors to). Hence their tests here are likely also biased in AMD's favor at higher resolutions. http://translate.google.pl/translate?hl=pl&sl=...

Sadly, it is a very uncommon resolution for new monitors. Almost every 22-24" monitor you buy today is 1080p instead of 1200p. :(

Not mine. I'm running a 1920x1200 IPS. 1920x1200 is more common in the higher-end monitor market. A quick glance at Newegg shows 16 1920x1200 models at 24" alone (starting at $230). Besides, I can't imagine many buy a $1000 video card and pair it with a single $200 display.

It makes more sense to me to check 1920x1200 performance than 1920x1080 for several reasons:

1) 1920x1200 splits the difference between 16x10 and 25x14 or 25x16 better than 1920x1080:
1680x1050 = ~1.7MP
1920x1080 = ~2MP
1920x1200 = ~2.3MP
2560x1440 = ~3.7MP
2560x1600 = ~4MP

2) People willing to spend $1000 for a video card are generally in a better position to get a nicer monitor. 1920x1200 monitors are more common at higher prices.

3) They already have three of them around to run 5760x1200. Why go get another monitor?

Opinionated side points: Movies transitioned to resolutions much wider than 1080p long ago, so a little extra black space really makes no difference. 1920x1200 is a perfectly valid resolution; if Nvidia is having trouble with it, I want to know. When particular resolutions don't scale properly, it is probable that there is either a bug or shenanigans at work in the more common resolutions. I prefer using 1920x1200 as a starting point for moving to triple-screen setups. I already think 1920x1080 looks squashed, so 5760x1080 looks downright flattened. Also, 3240x1920 just doesn't look very surround to me (3600x1920 seems borderline surround).
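The megapixel figures quoted in point 1 above are just width × height; a quick sketch (note that rounding to one decimal lands a couple of values slightly off the comment's looser rounding):

```python
# Pixel counts for the resolutions compared in the comment above.
resolutions = [(1680, 1050), (1920, 1080), (1920, 1200), (2560, 1440), (2560, 1600)]

for w, h in resolutions:
    mp = w * h / 1e6  # megapixels
    print(f"{w}x{h} = ~{mp:.1f}MP")
```

This also makes the oft-quoted gap concrete: 1920x1200 pushes about 11% more pixels than 1920x1080 (2.304MP vs. 2.0736MP).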

There are only 18 models available in all of Newegg with 1920x1200 resolution. Only 6 of those are under $400, and they are all over $300. There are 242 models available in 1920x1080, with nearly 150 models under $300. You people are literally a bad joke when it comes to even a tiny shred of honesty.

"Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't guarantee that we will do gaming in 1080p then:)...."

I have to totally disagree with that. Anyone that pays $500+ for a video card is a certain "type" of buyer. That type of buyer will NEVER wait 5-6 years for an upgrade. That guy is getting the latest and greatest of every other generation, if not every generation of cards.

You shouldn't "totally disagree".......meet me...."the exception"....i am the type of buyer who is looking for the "long run"....but i must confess....if i could....i would be the type of buyer you describe....cya

I bought two (2) HD 7970s on the premise that I'm not going to upgrade them for a good long while. At least four years, probably closer to six. I ran from 2005 to 2012 with a GeForce 7800GT just fine and my single core AMD CPU was actually the larger reason why I needed to move on.

Now granted, I also purchased a snazzy U2711 just so the power of these cards wouldn't go to waste (though I'm quite CPU-bound by this i7-3820), but I don't consider dropping AA in future titles to maintain performance to be that big of a loss; I already only run with 8x AF because, frankly, I'm too busy killing things to notice otherwise. I intend to drive this rig for the same mileage. It costs less for me to buy the best of the best at the time of purchase for $1000 and play it into the ground than to keep buying $350 cards to barely keep up every two years, all over a seven-year duration. Since I now have this fancy 2560x1440 resolution and want to use it, the $250-$300 offerings don't cut it. And then, don't forget to adjust for inflation year over year.

So yes, I'm going to be waiting between 4 and 6 years to upgrade. Under certain conditions, buying the really expensive stuff is as much of an economical move as it is a power grab. Not all of us who build $3000 computers do it on a regular basis.

P.S. Thank you consoles for extending PC hardware life cycles. Makes it easier to make purchases.

Keep laughing, this card cannot hold a solid 60fps with v-sync at that "tiny panel" with only 4xAA in the AMD fans' revived favorite game, Crysis. Can't do it at 1920X, guy. I guess you guys all like turning down your tiny cheap cards' settings all the time, even with your cheapo panels? I mean, this one can't even keep up at 1920X; gotta turn down the in-game settings, keep the CP tweaked and eased off, etc. What's wrong with you guys? What don't you get?

Currently the only native 120Hz displays (true 120Hz input, not 60Hz frame doubling) are 1920x1080. If you want VSYNC @ 120Hz, then you need to be able to hit at least 120fps @ 1080p. Even the GTX690 fails to do that at maximum quality settings on some games...

You must be talking about minimum fps, because on Page 5 the GTX690 is clearly averaging 85fps @1080p.

Tom's Hardware (love 'em or hate 'em) has benchmarks with AA enabled and disabled. Maximum quality with AA disabled seems to be the best way to get 120fps in nearly every game @ 1080p with this card.

You must be ignoring v-sync and stutter with frames that drop below 60, and forget 120 frames a second. Just turn down the eye candy... on the 3-year-old console ports that are "holding us back"... at 1920X resolutions. Those are the facts, combined with the moaning about ported console games. Ignore those facts and you can rant and wide-eyed spew like others. Now not only is there enough money for $500 card(s)/$1000 dual, there's extra money for high-end monitors, when current 1920X pukes out even the 690 and CF 7970 on the old console port games. Whatever, everyone can continue to bloviate that these cards destroy 1920X, until they look at the held-back settings benches and actually engage their brains for once.

You are correct, I don't own one... I own three in triple screen. Dell U2412m's.

I really am at a loss as to what you are on about. It is well known that 16:10 is preferred amongst enthusiasts and professionals for a few reasons. If you want 16:9, fine, go for it, but don't act like it's weird that AT are benching with 16:10 just because you went with cheap ass 16:9 screens.

Are you being sarcastic, or an idiot? From my experience, 1920x1200 24" monitors are the MAJORITY. My work has roughly 50 24" monitors, all in that resolution. My HP ZR24W is 1920x1200 as well. The only 1080p 24" monitor that I have even seen is the TN panel that came with an HP computer.

If you are talking about computer monitors, 1920x1200 is the dominant resolution. If you are talking about TVs, then obviously 1080p is the norm.

There are 242 - count them, well over 200, nearly 250 - 1920x1080 monitors at the egg. In your great experience, there are 16 that fit your 1920x1200 pipe-dream FUD scenario at the egg, with most of them, well over half, over $400 each, while the 242 common monitors you all buy as you pinch every penny and whine about a $10 difference in video card prices are well under $200 each a lot of the time. So now suddenly, you all spend way over $300 to $400-plus for 11% more pixels... ROFL HAHAHHAHHA, instead of $150 or $200... I guess that's why this place is so biased; the little bloggers are just as whacked when it comes to being honest.

Good grief... resorting to personal attacks isn't exactly a good way to get people to listen to you.

I'm not going to argue that 1080p isn't more common (from what I've read, no one is), because it is more common; you are quite correct there. However, I must point out that your logic to arrive at that conclusion is faulty: you're contending that 1080p is more common (it is) because there are more models available on Newegg, but just knowing how many models are available doesn't tell us how many units those move.

If, for example, each of those 22 models of 1920x1200 monitors moves 10 times as much stock as each of the 1920x1080 models, nearly as many of the 1920x1200 will have been sold as the 1920x1080 ones. Now, I don't think that's likely, and I do agree with you that 1080p is more common nowadays (see next point), but your argument is invalid, even though you have come to the correct conclusion.

Consider this: there are currently two models of iPhone available, compared to dozens of Android phones. By the same logic as you're using, I could say that the iPhone is incredibly rare - I'd get laughed out of town if I tried to make that argument.

The second point is that 1920x1200 hasn't been nearly as rare in the past as it is today. When I bought my previous monitor and my laptop (both 1920x1200), 1080p monitors were almost unheard of. Since monitors tend to last a while, it's not at all unreasonable for a disproportionate amount of people to be using them compared to their current sales.

Thirdly, there is a point of diminishing returns. Notice the complete lack of any benchmarks at or below 1680x1050? These cards are so fast that comparisons at those resolutions are pointless - they're all fast enough for anything you could do to them at that screen res - even Crysis. 1920x1080 almost falls into that category, heck, even 1920x1200 almost falls into that category. Benchmarks are only about who wins if there is some advantage to winning. Below 2560x1600, which card you're using is almost completely irrelevant, so why does it even matter whether they used 1920x1080 or 1920x1200?
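The models-vs-units point a few paragraphs up can be made concrete with invented numbers (all figures below are hypothetical, purely to show why model count alone doesn't determine installed base):

```python
# Hypothetical: fewer models selling in higher volume can rival
# a larger catalogue of low-volume models. All numbers are made up.
models_1200p, units_each_1200p = 22, 10_000
models_1080p, units_each_1080p = 242, 1_000

total_1200p = models_1200p * units_each_1200p  # 220,000 units
total_1080p = models_1080p * units_each_1080p  # 242,000 units

print(total_1200p, total_1080p)  # comparable totals despite 11x fewer models
```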

Hm, odd. Not only do I have a 1920x1200 monitor on my desktop, I have TWO laptops with 1920x1200 screens. I'm using one right now. Yes, they're rarer than 1080p screens, but this is a site for enthusiasts; therefore, it is more likely here.

The truth is a bit more simple than that. 5760x1200 is because our choice in monitors for multi-monitor testing was based on my personal monitor, which is another PA246Q. NVIDIA's limited display flexibility (same res + same sync) meant that it was easiest to just pair the PA246Q with some more PA246Qs. Consequently, it's easier to just test these monitors at their native resolution when we're using NVIDIA cards.

1920x1200 was very common for several years. Until a few years ago, they were much more common than 1920x1080. I even have an old laptop that's 1920x1200. Looking at what's available to buy new, today, doesn't tell the whole story. Because people don't replace their monitors every day.

Anandtech has always recommended spending up and getting a quality monitor. You see it in nearly every review. So, I think the readers here are more likely than the average guy on the street to own less common screens. I've had the same 2560x1600 monitor through 3 computers now, and I spent more on it than I've ever spent on any computer.

Yes, you're all super premium monitor buyers, and moments ago you were hollering the videocards are way too expensive and you cannot possibly afford them unless you are an idiot with too much money. I love this place, the people are so wonderfully honest.

1920x1200 is only rare now. I've gone through enough monitors to know what I like, and cheap 16:9 TN panels are not it. If that's good enough for you, then enjoy.

As for your other comment about v-sync and 4xAA: guess what, some of us don't care to have 8xAA and 16xAF running all the time.

I would rather play at 1200p at high settings with AA and AF off if it means playable fps and an enjoyable experience. This isn't [H]; I'm not gonna spend $1000 on a GPU so I can meet your approved settings for playing games, dude. Get a clue!

No... I couldn't afford one, but I very much wanted to buy one. It is much prettier than 16:9 for workstation purposes. New ones are being released all the time. You just have to pay more, but it's worth it.

Oh, so someone who almost wants to be honest. So isn't it absolutely true that a $500 video card is much easier to buy when your monitor doesn't cost half that much, let alone twice that much or $2,000-plus? You don't need to answer. We all know the truth. Everyone in this thread would take a single 680 or 7970 video card and a 1080p panel for under $200 before they'd buy a $450 1200p monitor and forfeit the 680 or 7970 for a $200 video card instead. It's absolutely clear, no matter the protestations. In fact, if they did otherwise, they would be so dumb they would fit right in. Oh, look at that; why, maybe they are that foolish.

Oh? A little over a year ago, I had some money for an upgrade and I wanted to upgrade either my monitor or my video card.Now, I have (and play) Crysis, which can only now, just barely, be handled by a single card, so obviously I could have used the GPU upgrade (still can, for that matter). I also had a decent (though not great) 22" 1920x1200 monitor.

However, despite that, I chose to buy a new monitor, and bought a used 3008WFP (30" 2560x1600). I have not regretted that decision one bit, and that was a lot more money than your $200-300 upsell for 1920x1200. Now, admittedly, there were other factors that were a consideration, but even without those, I would have made the same decision. Putting money into a good monitor which I'll use ALL the time I'm on the computer vs. putting money into a good video card that I'll use some of the time is a no-brainer for me. If all of my electronics were taken and I were starting from scratch, I'd get another 2560x1600 monitor before I even bought a video card. I'd suffer through the IGP as long as I needed.

Now, that's my choice, and everyone's needs are different, so I wouldn't demand that you make the same decision I did, but, by the same token, you shouldn't be expecting everyone to be following the same needs that you have. ;)

You've jumped from 1920 to 2560, so who cares; not even close. In your case you got no video card. ROFL - further proving my point, and disproving everyone else's who screamed that if you get this card you have another two grand for monitors as well - which everyone here knows isn't true.

I never demanded anyone follow any needs, let alone mine, which are unknown to you despite your imaginary lifestyle readings, and contrary to the sudden flooding of monitor fanboys and the accompanying lies.

You are right that the average Joe doesn't have a 1920x1200 monitor, but this is an enthusiast web-site! Not a single enthusiast I know owns a 1080p display. 1920x1200 monitors aren't hard to find, but you will need to spend a tad more.

Nope, 242 vs 16 is availability; you lose miserably. You all didn't suddenly have one, along with the "friends" you suddenly acquired and whose monitor sizes you've instantly memorized as well. ROFL - the lies are innumerable at this point.

It's either 1920x1200 @ 60Hz, or 1920x1080 @ 120Hz. I prefer smoother gameplay over 120 extra rows of pixels. Also, I know quite a few gamers that like using their TV for their PC gaming, so this would also be limited to 1080p.

I'd be more worried about AMD's performance going down in certain games due to Crossfire than something as trivial as this. As a 4870X2 owner, I can tell you this is not at all uncommon for AMD. I still have to disable 1 GPU in most games, including BF3, because AMD's drivers for any card more than 12 months old are just terrible. As you can see, even the 6990 is being beaten by a 6970 in games as modern as Skyrim - their drivers are just full of fail.

Except that nVidia wins in the article and all of the accumulated benches here, even at 1920x1200 (on which this card would be a complete waste...), so what exactly are you complaining about? It's bias if they say that the AMD cards are better when they're not, but in the benchmarks and in the conclusions (here and elsewhere), nVidia is consistently ahead, so any claims of bias are completely groundless...

Read my first post instead of asking; or, having already read it, attack like you just did and continue to be a jerk, who cares, right? You obviously are all seriously upset about the little facts I gave in my very first post. You're all going bonkers over it, and you all know I'm correct; that's what really bothers all of you. Continue to be bothered; you cannot help it, that's for sure.

I guess all of you got very upset that my point was made; you're looking at an AMD-biased set of benchmarks. I'm sure most of you are very happy about that, but angered that it has been effectively pointed out.

On the one hand, if they are trolling just for the reaction, it's fascinating. What kind of weird creature lies behind the internet persona? In most cases, we all know it must be a sad figure of a person with all sorts of interesting personality problems.

But on the flip side, if this person actually means and believes what they say is some sort of honest analysis, it's just as fascinating. What kind of thick bastard must then lurk behind the keyboard in question?

I think that these creatures are Nvidia fanboys; they always react the same way. CeriseCogburn reminds me of one of them from a little while ago; I can't remember his name. He was convinced that the 7970 pricing was the worst thing to ever happen to humanity since the birth of Justin Bieber, or at least it looked a lot like that. Sure, the price wasn't attractive, but there's some limit you must not cross to stay in the real world.

So, as weird a creature as they can be, I believe they are a result of Nvidia funding them to spread insanity in forums discussing video cards. They can't be natural things, after all; they just don't make sense. Their closed minds are second to none. Or else they could only have the ability to type insanities, and a filter when reading the replies to stop any information entering their brains.

Do you really think ATI and nVidia would pay these weird, sad little trolls to piss off readers every time one of their products is reviewed? It's an embarrassment and a distraction. No, I think they would pay someone like that NOT to talk about their products if they could. I'm sure that employees do write comments on product reviews, but guys like this are bad for business. Nobody wants someone like that on their side. If I were nVidia, I'd pay that guy to become an AMD fan!!!

I'm certain they would pay none of you, since not a single one can be honest nor has a single argument to counter my points. You're all down to name-calling trolls, and you all have to face the facts now that your clueless ignorance kept out of your minds for some time. Have fun buying your cheap 1080p panels and slow, cheapo AMD cards - LOL. Oh sorry, you all now buy premium flat panels...

No, actually I expected a lot more from the people here. I expected a big thank you, or a "thanks for the information, we'll keep that in mind; it helps our purchasing decisions." Instead we got a flood of raging new monitor owners and haters and name-callers. Next time, just thanking me for providing very pertinent information would be the right thing to do, but at this point I don't expect any of you to ever do the right thing.

I'm curious why the 680 and 690 trail AMD cards in Crysis and Metro, seeing as those seem to be the most GPU intensive games, while they win in most other tests. Would it be shading performance or something else?

My mind is pretty blown that we have cards that can run Crysis and Metro at 5760x1200 at very comfortable framerates now; that's insane. But barring that resolution, or 2560 for some games, I'm sure most of us don't see the appeal here; it will sell in a very, very small niche. For normal monitor resolutions, I doubt games will get much more demanding in large numbers until we have new consoles out.

Oh, wow, they also are so biased toward AMD that they removed the actual most demanding game, Shogun 2: Total War, because I kept pointing out how the Nvidia 680s swept that game across the board - so now it's gone! ROFL. (Before you attack me, I note the Anand reviewer stated S2TW is the most demanding; it's right in the reviews here - but not this one.)

Oh, I see it was added, because the patch broke the Nvidia cards - but in AMD's favor again; the tester kept the breaking patch in, instead of providing results. Wow, more AMD bias. Glad my epic fails are so productive. :-) U still mad? Or madder and raging out of control?

So, if they failed to add it, it'd have been AMD bias, but considering they DID add it... it's AMD bias.

And you're the one talking about rage, trollboi?

Had you just merely mentioned that the patch doesn't provide favourable results for NVIDIA cards, Ryan might have been tempted to reinstall the game and retest. Well, he might have - can't speak for the guy. Doubt he will now, though.

It's a very pertinent topic, because despite the protestations, the vast majority of readers are likely to have a 1080p monitor. Apparently they all have amnesia as well, since this came up in a very, very recent article - one of the last few on the video cards - where it was even pointed out by Anand himself that one should extrapolate the fps data with the 11% pixel difference, something every last one of you either never read, completely forgot, or conveniently omitted.

Unfortunately, Anand isn't correct, period. He should know it, but of course he has to have an answer for the complainers, so he gave one. What should be clear, even by just looking at the 1920X benches here, is that results below and above that resolution don't always jibe with it - and personally, I've noticed nVidia has issues with 1600x1200.

So, all the hundreds of readers with 1080p monitors can thank me deep down in their hearts, as they now know Nvidia is supreme at that resolution, by 17%+ on average, more so than at 1200p, and thus "saving money" on an AMD card using this site's 1920x1200 stats is, for many, incorrect.
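For context, the extrapolation being disputed is a simple pixel-count scaling (a naive sketch; it assumes fps scales inversely with pixels rendered, which is exactly the assumption the commenter argues real cards don't follow):

```python
# Naive extrapolation of a 1920x1200 benchmark result to 1920x1080,
# assuming fps is inversely proportional to pixel count.
pixels_1200p = 1920 * 1200   # 2,304,000
pixels_1080p = 1920 * 1080   # 2,073,600

fps_at_1200p = 60.0                          # example benchmark figure
scale = pixels_1200p / pixels_1080p          # ~1.111, the "11% difference"
fps_at_1080p_estimate = fps_at_1200p * scale

print(f"~{fps_at_1080p_estimate:.1f} fps")   # ~66.7 fps
```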

And yet they specifically called out the fact that the patch broke performance on nVidia cards, went out of their way to state what performance was like before the patch (which is clearly better than any of the other cards), and finally stated that they're pretty sure that the game is at fault, not nVidia or their drivers...

Except in the 680 tests all you fools ignored S2TW, which I had to repeatedly point out was the hardest game in the bunch, not Crysis or M2033 - over and over again I had to tell all the blabbering fools - and now suddenly the game is "broken". ROFL - it's beyond amazing.

Are you using the Steam version? Your results differ from those of HardOCP, hardwareheaven, and hardwarecanucks. They get scaling you don't. Your version should be dated March 2012; that's when the patch was released.

This card has 2GB per GPU, not 4. Also, the lack of memory (!) will limit performance before the memory bus does. Compared to previous NVIDIA products, the 680 has far faster memory which mitigates having a narrower bus.

The few reviews I've seen have the 4GB GTX 680 card between 5% and 10% faster at high resolutions (starting at 2560x1440, up to 7680x1600). Adding, on top of that, some more memory bandwidth would have made the gaming card most people expected from nVidia. As it stands, the GTX 680 is good, but also very expensive (I can have the 7970 for 65€ less). The GTX 690 is a good product for people who want SLI but don't have the space, PSU, or SLI-enabled mainboard, or want 4 GPUs.

They're being held back like the "real 680" top nVidia core, because nVidia is making more money selling the prior launches and the new second-tier, now-top-dog cards. It's a conspiracy of profit and win.

For instance, the entire lot of 7870s and 7850s on the egg is outsold by a single GTX680 by EVGA - confirmed purchasers' reviews. So it appears nVidia knows what it's doing in a greatly superior manner to your tiny mind, spewing whatever comes to it in a few moments of raging rebuttal whilst you try to "point out what's obvious" - yet is incorrect.

Every time my Anandtech feed updates, the first thing I'm hoping to see is reviews for the more-reasonably priced, and less power-hoggy GTX 640 (w/GDDR5) and GTX 660 Ti. If we see a review, then at least we know it'll show up at retail very soon after.

All I want for Xmas is a mid-range NVidia card with a better idle-wattage-to-maximum-performance ratio than AMD (because NVidia > AMD wrt drivers, esp. under Linux).

The GTX680 has sold more cards by verified reviewers at Newegg than the entire lot of the 7870s and 7850s at Newegg combined, and that's with just ONE GTX680 sold by EVGA - check it out, my friend... ROFL. The GTX680 in one listing outsells the entire lineup of 7870 and 7850 COMBINED at Newegg, by verified owners count. HAHAHA. Yes, the supply is always "key". ROFL

I must confess that every logic I can think of says I don't need this GPU.....but.....I want it....I don't need it.....but damn it....I want it.....it's nvidia....it's aluminium....it's 4 GB VRAM....it's probably 5 times faster than what I have.......and I want to congratulate the team for the review, which I read from start to finish...but to be honest with you.....you don't need 19 pages to describe it...for me...."futureproof" says it all....

Simply put, NVIDIA has a superior software department in comparison with AMD. AMD is mainstream. Whenever they try to reach the high end, they fail miserably, in both the GPU and CPU camps: driver issues with Crossfire, TriFire and QuadFire, with or without Eyefinity, in numerous games (with Eyefinity, even more problems), etc. If they don't get their problems solved by Catalyst 12.5, buying AMD cards for high-end builds (anything multi-card related) is a waste of money. And that is sad.

Yes, and the reviewer is constantly trying to catch nVidia in a big lie - and it shows. He even states how he never believed a word nVidia said about this card, but had to admit it was all true. I have never, in many years, seen the same bad attitude given to AMD's GPUs. The bias in the write-up is so blatant every time, it's amazing they still get nVidia cards for review. The reviewer is clearly so pro-AMD he cannot hide it.

He did say that Crossfire was so broken that he couldn't recommend it. He's been pointing out flaws in both companies along the way. I think you should dial back the bias accusations a little bit.

Well, if you want me to point out like 10 blatant direct wordings in this article, I will. I'm not the only one who sees it, by the way. You want to tell me how he avoids totally broken AMD drivers when he's posting the 7970CF? Not like he had a choice there; your point is absolutely worthless.

Because you idiots aren't worth the time, and last review the same silverblue stalker demanded the links to prove my points and he got them, and then never replied. It's clear what providing proof does for you people; look at the sudden 100% ownership of 1920x1200 monitors. ROFL. If you want me to waste my time, show a single bit of truth-telling on my point on the first page. Let's see if you pass the test. I'll wait for your reply - you've got a week or so.

It is indeed sad. AMD comes up with really good hardware features like Eyefinity but then never polishes up the drivers properly. Looking at some of the Crossfire results is sad too: in Crysis and BF3, CF scaling is better than SLI (unsure, but I think the TriFire and QuadFire results for those games are even more in AMD's favour), but in Skyrim it seems that CF is totally broken.

Of course compared to Intel, AMD's drivers are near perfect but with a bit more work they could be better than Nvidia's too rather than being mostly at 95% or so.

Tellingly, JHH did once say that Nvidia were a software company, which was a strange thing for a hardware manufacturer to say. But this also seems to mean that they've forgotten the most basic thing which all chip designers should know: how to design hardware which works. Yes, I'm talking about bumpgate.

See, despite all I said about AMD's drivers, I will never buy Nvidia hardware again after my personal experience of their poor QA. My 8800GT, my brother's 8800GT, this 8400M MXM I had, plus a number of laptops, plus one nForce motherboard: they all had one thing in common - poorly made chips from BigGreen - and they all died way before they were obsolete.

Yep, that's true. They killed cards with a driver. They should implement hardware auto shutdown, like CPUs. As for the nForce, I had one motherboard, the best nForce they made: nForce 2 for AMD Athlon. The rest of mobo chipsets were bullshit, including nForce 680.

I don't think the QA is NVIDIA's fault, but the video card manufacturers'.

No, 100% Nvidia's fault. Although maybe QA isn't the right word; I was referring to Nvidia using the wrong solder underfill for a few million chips (the exact number is unknown). They were mainly mobile parts, and Nvidia had to put $250 million aside to settle a class action.

Although that wiki article is rather lenient towards Nvidia, since that bit about fan speeds is a red herring. More accurately, it was Nvidia which spec'ed their chips to a certain temperature, and designs which run way below that will have put less stress on the solder, but to say it was poor OEM and AIB design which led to the problem is not correct. Anyway, the proper exposé was by Charlie D. in the Inquirer and later SemiAccurate.

But in fact it was a bad heatsink design - thank HP, and view the thousands of heatsink repairs, including the "add a copper penny" method to reduce the giant gap between the HS and the NV chip. Charlie was wrong, a liar, again, as usual.

The problem was real, continues to be real and also affects G92 desktop parts and certain nForce chipsets like the 7150.

Yes, the penny-shim trick will fix it for a while, but if you actually read up on technicians' forums where they fix laptops, that plus reflows are only a temporary fix, because the actual chips are flawed. Re-balling with new, better solder is a better solution, but not many offer that fix since it involves hundreds of tiny solder balls per chip.

Before blindly leaping to Nvidia's defence like a fanboy, please do some research!

Before blindly taking the big lie from years ago, repeated above, to attack Nvidia for no reason at all other than that all you have is years-old misinformation, and then wailing on about it while telling someone else some more lies about it, check your own immense bias and lack of knowledge, since I had to point out the truth for you to find. You forgot the DV9000, DV2000 and Dell systems with poor HS design, let alone Apple and AMD console video chip failings, and the fact that payment was made and restitution was delivered, which you also did not mention, because of your fanboy problems, obviously in AMD's favor.

I have some issues with this article, the first of course being availability. Checking the past week, I have yet to see any availability of the 680 besides $200+ over-retail premium cards on eBay. How can you justify covering yet another paper-launch card - one that, for all intents and purposes, nVidia can't make for a very long time - without blaring bold-print caveats? There is a difference between ultra-rare and non-existent.

Is a card or chip really the fastest if it doesn't exist to be sold?

Second, the issue of RAM: that's a problem in that most games are 32-bit, and as such, they can only address 3.5GB of RAM total between system and GPU RAM. This means you can have 12GB of RAM on your video card and the best you will ever get is 3GB worth of usage.

Until games start being written as 64-bit binaries (which won't happen until the Xbox 720, since almost all PC games are console ports), anything more than 2-3GB of GPU RAM is wasteful. We're still looking at 2014 before games even START using 64-bit binaries.
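The arithmetic behind that rough 3.5GB ceiling can be sketched in a few lines. The reserved figure below is purely illustrative; the exact usable amount varies by OS, driver, and large-address-aware flags:

```python
# Total virtual address space of a 32-bit process.
address_space = 2 ** 32            # 4 GiB

# Portion reserved by the OS kernel and hardware mappings
# (illustrative number only; the real amount varies by system).
reserved = 512 * 2 ** 20           # 0.5 GiB

# What remains must cover system RAM allocations *and* any
# GPU resources mapped into the same address space.
usable = address_space - reserved
print(usable / 2 ** 30)            # prints 3.5
```

This is why, on the commenter's premise, piling more memory onto the card cannot help a 32-bit title: the process simply has no addresses left to map it into.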

While I'm afraid we're not at liberty to discuss how many 680 and 690 cards NVIDIA has shipped, we do have our ears to the ground, and as a result we have a decent idea of how many have shipped. Suffice it to say, NVIDIA is shipping a fair number of cards; this is not a paper launch, otherwise we would be calling NVIDIA out on it. NVIDIA absolutely needs to improve the stock situation, but at this point this is something that's out of their hands until either demand dies down or TSMC production picks up.

The 690 is a stunning product... but I'm left wanting to see the more mainstream offerings. That's really where NVIDIA will make its money, but we're just left wondering about supply issues and the fact that AMD isn't suffering to the same degree.Reply

A single EVGA GTX 680 SKU at Newegg has outsold the entire lineup of 7870 and 7850 cards combined, going by verified-owner reviews. So if availability is such a big deal, you had better ask yourselves why the 7870 and 7850 combined cannot keep pace with a single EVGA 680 card selling at Newegg. Go count them up; have at it; you shall see. 108 sales for the single EVGA 680, more than the combined total across every 7870 and 7850 SKU in or out of stock. So when you people complain, I check the facts, and I find you incorrect and failing almost 100% of the time. That's what happens when one repeats talking points like a sad PR politician instead of checking available data.Reply

Have you considered using WinZip 16.5, with its OpenCL-accelerated file compression/decompression, as a compute benchmark? File compression/decompression is a common use case for all computer users, so it could be the broadest application of GPGPU relevant to consumers if there is an actual benefit. The OpenCL acceleration in WinZip 16.5 was developed and promoted in association with AMD, so it will be interesting to see whether it is hobbled on nVidia GPUs, how well it scales with GPU power, whether it scales with SLI/dual-GPU cards, and whether there are advantages to close IGP-CPU integration as with Llano and Ivy Bridge.Reply
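For the curious, the general idea behind making compression parallel-friendly is to split the input into independent chunks and compress them concurrently. Here is a CPU-side sketch of that pattern using only the standard library; WinZip's actual OpenCL pipeline is proprietary and may work quite differently, so treat this as an analogy, not a description of their implementation:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def compress_chunked(data: bytes, chunk_size: int = 1 << 16) -> list:
    """Split data into fixed-size chunks and deflate each one in parallel.

    Independent chunks are what make the work parallelizable at all,
    whether the workers are CPU threads or GPU work-groups.
    """
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(zlib.compress, chunks))

def decompress_chunked(blocks: list) -> bytes:
    """Inflate each chunk and reassemble the original byte stream."""
    return b"".join(zlib.decompress(b) for b in blocks)

payload = b"anandtech" * 100_000
assert decompress_chunked(compress_chunked(payload)) == payload
```

The trade-off is that per-chunk compression loses some ratio versus one long stream, which is part of why such schemes shine on big, highly parallel inputs.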

I actually don't know if it's AMD only. I know AMD worked on it together with WinZip. I just assumed that since it's OpenCL, it would be vendor/platform agnostic. Given AMD's complaints about use of vendor-specific CUDA in programs, if they developed an AMD-only OpenCL application, I would find that very disappointing.Reply

"WinZip has been working closely with Advanced Micro Devices (AMD) to bring you a major leap in file compression technology. WinZip 16.5 uses OpenCL acceleration to leverage the significant power of AMD Fusion processors and AMD Radeon graphics hardware (GPUs). The result? Dramatically faster compression abilities for users who have these AMD products installed!"Reply

Excuse me, but you're wrong again. "According to WinZip it only supports AMD GPUs, which is why we're not using it in NVIDIA reviews at this time." (Ryan Smith, Thursday, May 10, 2012, in the 670 release review.)Reply

I'm sure the gamer's-manifesto AMD company "ownz it" now, and I'm equally certain it has immediately become your favorite new benchmark, one you cannot wait to demand be shown here 100% of the time, it's so Gaming Evolved.Reply

Here's some research, Mr. Know-It-All: "According to WinZip it only supports AMD GPUs, which is why we're not using it in NVIDIA reviews at this time." (Ryan Smith, Thursday, May 10, 2012.) Congratulations on utter FAIL.Reply

First off, thank you for this review. If you didn't do this, we'd have no idea how these GPUs perform in the wild. It is very nice to come here and read a graph and make educated decisions on which card we should purchase. It is appreciated.

The one thing that I wanted to question is why you feel that you can't recommend the 7970. At the very least perhaps the recommendation of which card to get should be based on the game you're playing.

Reviewing the data you published, the average frame rates for the 5 top performers over all benchmarks are:

Also, the number of benchmarks (excluding the minimum-frame-rate benchmarks) in which the 7970 dipped below 60 fps without the 680 doing the same was 4. This is over 29 benchmarks, and some of the dips were minimal.

This, aligned with the price considerations, makes me wonder why one wouldn't consider the 7970?Reply
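The counting criterion in the comment above is easy to state precisely. A quick sketch with made-up frame rates (not the review's actual data) shows the idea:

```python
# Hypothetical (benchmark, 7970 fps, 680 fps) tuples -- invented numbers,
# used only to demonstrate the counting method, not real review results.
results = [
    ("game_a", 72, 78),
    ("game_b", 55, 63),   # 7970 dips below 60 while the 680 does not
    ("game_c", 47, 52),   # both dip, so this one is excluded
    ("game_d", 61, 59),   # only the 680 dips, also excluded
]

# Count cases where the 7970 is under 60 fps and the 680 is not.
solo_dips = sum(1 for _, amd, nv in results if amd < 60 <= nv)
print(solo_dips)  # prints 1
```

The commenter's figure of 4 such cases out of 29 benchmarks is exactly this tally applied to the published charts.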

"The one thing that I wanted to question is why you feel that you can't recommend the 7970. At the very least perhaps the recommendation of which card to get should be based on the game you're playing."

Under normal circumstances we would do this. For example, GTX 570 vs. Radeon HD 6970 last year; the two traded blows often enough that it came down to the game being played. However, the key was that the two were always close.

In 20% of our games, 7970CF performance is nowhere close to the GTX 690 because CF is broken in those games. It would be one thing if AMD's CF scaling in those games was simply weaker, but instead we see no scaling, or even negative scaling, in games that are 5+ months old.

For single card setups AMD is still fine, but I cannot in good faith recommend CF when it's failing on major games like this. Because you never know what games in the future may end up having the same problem.Reply
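To make the distinction concrete, the scaling Ryan describes is simply dual-GPU fps divided by single-GPU fps. The numbers below are hypothetical, chosen only to illustrate the healthy, broken, and negative cases:

```python
def scaling(single_fps: float, dual_fps: float) -> float:
    """Multi-GPU scaling factor: 2.0 is perfect, 1.0 is no gain,
    and anything below 1.0 means the second GPU actively hurts."""
    return dual_fps / single_fps

# Hypothetical fps pairs (single-card, dual-card) for the three cases.
samples = {
    "healthy":  (60.0, 112.0),   # ~1.87x, what good CF/SLI looks like
    "broken":   (55.0, 55.0),    # 1.0x, the second GPU sits idle
    "negative": (50.0, 43.0),    # <1.0x, worse than one card alone
}
for label, (one, two) in samples.items():
    print(label, round(scaling(one, two), 2))
```

A reviewer flagging CF as "broken" in a title is reporting the second or third row, not merely a weaker version of the first.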

I must say I found it quite odd and hilarious to see people accusing Anandtech of favouring AMD by using a monitor with a 1200 vertical resolution. 16:10 monitors are not that uncommon and we really should be showing the industry what we think by not purchasing 16:9 monitors.

Anyway, if anything this review seems Nvidia-biased, in my opinion. The 7970 CF does not do too badly; in fact it beats the 690 / 680 SLI in many games and only loses out in the games where it's "broken". I am not sure why you cannot recommend it based on the numbers in your benchmarks, since it hardly embarrasses itself. Reply

When AMD gets it right, CrossFire is absolutely blistering. Unfortunately, the sad state of affairs is that AMD isn't getting it right with a good proportion of the games in this review.

NVIDIA may not get quite as high scaling as AMD when AMD does get it right, but they're just far more consistent at providing good performance. This is the main gripe about AMD; with a few more resources devoted to the project, surely they can overcome this?Reply

No, I said AMD's drivers have issues with Crossfire, not that they suck in general.

I've also checked three random British websites and there's no issues whatsoever in finding a 1920x1200 monitor. I also looked at NewEgg and found eight immediately. It's really not difficult to find one.Reply

I'm with ya, bro. Forget these high-resolution-monitor nancies who don't know what they're missing. I'm rockin' games just fine at 60+ fps on my 720p plasma TV, and that's at 600Hz! Just you try to get 24xAAAA in 3D (that's 1200Hz total) on that 1920x1200 monitor of yours!

On page 2 of the review - where you have all the pictures of the card - we have no real basis for figuring out the cards true size. Could you include a reference in one of those photos? Say, a ruler or a pencil or something, so we have an idea what the size of the card truly is?Reply

Why did they go back to 256 bits when the GTX 590 had 384 bits?!?! Is it because they don't want too much of an advantage? Maybe the next GTX 790 will have 384 bits again and will be better than the GTX 690... come on!!!Reply

Wonder what the 7990 will look like next month. AMD clearly waited on purpose to see how the 690 was going to perform. They easily could have released a dual 7970 card already or at the very least sent specs to card manufacturers but they haven't.

We know they left a lot of headroom on the 7970 - some people have even suggested we'll get a 7980 at some point - wonder if now we'll get 2 x fully clocked 7970s on the same card ... will be interesting to see how they deal with that power consumption at load though.Reply

"Thus even four years since the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer when it comes to setups using a pair of high-end 28nm GPUs is “you better damn well believe it.”"

No, they actually cannot. At 1920x1200, even the CF 7970 or the 690 needs help from lowered settings in many of the games, and can't even keep up with the monitor's refresh rate set at a low 60Hz. Sorry, another fantasy for you perhaps. :)Reply

Hey guys! Thanks for the article, I enjoyed the read (although I am not in the market for dual-GPU configurations after trying the HD3870X2 and 2x8800GTS; I'm happy with one 7970 OC'ed to the max). But you seem to be missing the numbers for noise from the HD7970 in a CF configuration. I hope you can post them! :D -DAReply

This was mentioned on the Test page, but we don't have a matching pair of 7970 cards; what we have is a reference card and an XFX 7970 BEDD. Power and temperature are the same regardless, but it would be improper to list noise because of the completely different acoustic properties of the BEDD.Reply

Seems to me they should be saving money in the construction when compared to two 680s in SLI. Half the fans, half the connectors, half the circuit boards. They should have at least cut $50 off the suggested retail price.

Half the magnesium, half the aluminum, half the PLX chips, half the R&D, half the vapor chambers, half the chip binning, half the power circuits, half the copper PCBs... oh no wait, all those are added expenses, not reductions. I guess they should be charging $200 over the 2x$499 usual price. See how actually using the facts, instead of sourpuss emotion, delivers a different picture?Reply

These cards are sold out on Newegg at $1200 apiece. Talk about taking advantage with a 20% markup over MSRP. Hopefully AMD knocks the prices way down when they bring out their 7990; $800 sounds about right.Reply

Now I need a new keyboard, because I was drooling into mine as I read this review. I have a GTX 680, but I don't like to run SLI setups; I had a bad experience with my dual 560 Tis. This looks like a truly awesome card that would hold its resale value. Nevertheless, there is no way I'm spending a grand on a video card.Reply

Not precisely. That $350 performance point? It used to be a $200 performance point. Similarly, that $350 point will turn into a $400 performance point. So, assuming I maintain the price tier, graphics returns for my dollar are gradually tapering off. I look at the performance I was getting out of my 7800 GT at 1280x1024, and it wasn't worth upgrading to a newer card, period, because of Windows XP, my single-core CPU, and the fact that I was already maxing out every game I had and still getting decent frame rates. I think the key factor is that I do not care if I dip below 60 frames, as long as I'm above 30 and getting reasonable frame times.

I also know that consoles extend the life of PC hardware. The 7800GT is a 20-pipe version of the GTX, which is in turn the GPU found in the PS3. Devs have gotten much better at optimization in titles that matter to me.Reply

You spend well over $1,600 on a decent system. It makes no sense to spend all that money, then buy monitors the cards in question cannot successfully drive in the 3-year-old Crysis, let alone in well over half the benchmarks in this article, without turning DOWN the settings. You cannot turn up DX11 tessellation; keep it on medium. You cannot turn up MSAA past 4X, and had better keep it at 2X. You had better turn down your in-game view distance. All that despite the moaning about "all the console ports holding us back". I get it; the obvious problem is that none of you seem to, because you want to moan and pretend spending $1,000 or more on a monitor alone is "how it's done", while whining that you cannot even afford $500 for a single video card. These cards successfully drive 1920x1080 monitors in the benchmarks, but just barely, and if you turn the eye candy up, they cannot do it.Reply

You can use cards two generations back for that, but like these cards, you will be turning down most if not all of the eye candy and be stuck tweaking, clocking, jittering, and wishing you had more power. These cards cannot handle 1920x1200 in current "console port" games unless you turn them down, and that goes ESPECIALLY for the AMD cards, which struggle with extreme tessellation and have more issues with anything above 4xAA, and often with 4xAA itself. The 5770 is an Eyefinity card and runs 5760x1200 too. I guess none of you will ever know until you try it, and it appears none of you have spent the money and been disappointed turning down the eye-candy settings, so blabbering about resolutions is all you have left.Reply

After checking Newegg it would seem that, unfortunately for Nvidia, this will be another piece of vaporware. Perhaps they should scale the Keplers to 22nm and contract Intel to fab them, since TSMC has major issues with 28nm. Just a thought. Reply

I guess I should retract my comments about TSMC as other customers are not experiencing supply issues with 28nm parts. Apparently the issues are with Nvidia's design, which may require another redo. I'm guessing AMD will be out with their 8000 series before Nvidia gets their act together. Sad because I have used several generations of Nvidia cards and was always happy with them. Reply

The GTX 680 by EVGA, in a single SKU, outsells the combined total sales of the 7870 and 7850 at Newegg. nVidia "vaporware" sells more units than the proclaimed "best deal" 7000-series AMD cards. ROFL. Thanks for not noticing. Reply

I am a new reader and equally new to the subject matter, so sorry if this is a dumb question. The second page mentioned that NVIDIA will be limiting its partners' branding of the cards, and that the first generation of GTX 690 cards are reference boards. Does NVIDIA just make a reference design that other companies use to make their own graphics cards? If not, then why would anyone but NVIDIA have any branding on the cards?Reply

Anyone who sides with AMD or NVIDIA is a fool. Side with yourself as a consumer: buy the best card available at the time that is right for your NEEDS.

The fact is the 690 is trash regardless of whether you are comparing it to an NVIDIA card or an AMD card. If I'm buying a card like a 690, why the FUCK would I want anything below 1200p? Even if it is uncommon, it's still a trash $1000 card considering:

$999 GeForce GTX 690
$499 GeForce GTX 680
$479 Radeon HD 7970

and that SLI and CF both beat (or equal) the 690 at higher resolutions and cost less (by $1 for NVIDIA, but still, srsly, wtf NVIDIA!?, and by $40 for AMD)... WHAT!?

Furthermore, you guys fighting over bias when the WHOLE GFX community (companies and software developers alike) is built on bias is utterly ridiculous. GFX vendors (AMD and NVIDIA) have skewed results for games for the last decade-plus, and software vendors too. There need to be laws against specifically building software for a particular graphics card, let alone making it work worse on the other (this applies to both companies).

Hell, workstation graphics cards are a very good example of how the industry likes to screw over consumers. If you have ever BIOS-modded, not just soft-modded, a normal consumer card into a workstation card, you would know all that extra charge (up to 70% extra for the same processor) for a workstation card is BS, and if the government cleaned up its shoddy policies we consumers would be better for it. Reply

I know this is a really old review, and everyone has long since stopped the discussion - but I just couldn't resist posting something after reading through all the comments. Understand, I mean no disrespect to anyone at all by saying this, but it really does seem like a lot of people haven't actually used these cards first hand.

I see all this discussion of nVidia surround type setups with massive resolutions and it makes me laugh a little. The 690 is obviously an amazing graphics card. I don't have one, but I do use 2x680 in SLI and have for some time now.

As a general rule, these cards have nowhere near the processing power necessary to run those gigantic screen resolutions with all the settings cranked up to maximum detail, 8xAA, 16xAF, tessellation, etc....

In fact, my 680 SLI setup can easily be running as low as 35 fps in a game like Metro 2033 with every setting turned up to max - and that is at 1920x1080.

So, for all those people that think buying a $1000 graphics card means you'll be playing every game out there with every setting turned up to max across three 1920x1200 displays - I promise you, you will not - at least not at a playable frame rate.

To do that, you'll be realistically looking at 2x$1000 graphics cards, a ridiculous power supply, and by the way you better make sure you have the processing power to push those cards. Your run of the mill i5 gaming rig isn't gonna cut it.Reply

More than a year since it was announced. I hope new products will be better. My suggestions: 1. Add HDMI; it is standard. 2. Consider allowing us to add memory / SSD for better and faster performance, especially for rendering 3D animation and other work.Reply