$740 for the EVBot controller plus a card with a blower fan and a dinky heatsink (compared to the MSI Lightning 7970/680 and Sapphire TOXIC 7970) is overpriced imo. It's not like the extra 100MHz or 4GB of VRAM over 2GB 1230-1290MHz GTX 680s will make this card more future-proof for next-generation games.

If I wanted the best single card with bragging rights, I'd rather get the GTX 690 at that point. The 690 would be much faster and quieter and actually be good to go for next-gen games. This seems like a marketing exercise.

The thing is that I really don't consider a blower to be a negative thing here. This card barely passed 50dB under load, and at the same time it's fully exhausting. Open air coolers have their place, but not having to worry about additional case cooling is quite convenient.

Ya, I agree that this blower is one of the better ones. But even in your review, the blower was struggling to keep this card under 70°C with overvolting. Under regular overclocking it worked perfectly fine, but the card only reached 1211MHz (1301MHz with GPU Boost). Those clocks are nothing special, and plenty of $500-560 GTX 680s such as the Gigabyte Windforce 3X, Asus DirectCU II, Zotac AMP! and Galaxy KFA2 can reach those clocks. So the question is why is this card $660? 4GB of VRAM is a waste at 1080p/1200p, and at 2560x1600 with AA a 1.2GHz HD 7970 is faster.

But if you are going to overvolt, the cooler suddenly becomes a limitation, especially after buying the EVBot and bringing the total to $740. Suddenly you aren't too far off from a real special card - the GTX 690 - that's actually going to be fast enough to play today's and next-generation games. Going from 1301MHz to 1377MHz with a volt mod is not going to make the GTX 680 any better for newer games since that's still not good enough, especially once you consider that because it gets too hot, the delta is less than 76MHz in actual gaming.
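To put that delta in perspective, here's a back-of-the-envelope sketch using the clocks quoted above; the throttled figure is purely an assumed example, and clock scaling is already the best case for performance scaling:

```python
# Rough upper bound on what the volt-modded overclock quoted above can buy.
stock_boost = 1301   # MHz, regular overclock with GPU Boost
volt_mod    = 1377   # MHz, peak clock with the EVBot overvolt

best_case_gain = (volt_mod - stock_boost) / stock_boost
print(f"best case: {best_case_gain:.1%}")   # ~5.8%, and games rarely scale 1:1 with clock

# If heat trims the sustained clock (hypothetical figure for illustration),
# the real-world delta shrinks further.
throttled_clock = 1340   # MHz, assumed sustained clock under load
real_gain = (throttled_clock - stock_boost) / stock_boost
print(f"throttled: {real_gain:.1%}")        # ~3.0%
```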

I don't think you can call 56.3dB "keeping the noise down", though. That's approaching conversational levels of noise - more importantly the fans on blower heat-sinks sound subjectively worse; less like the broadband noise of a desk fan and more like the drone of a hair-dryer.

So, it's not all about "style". There are other legitimate concerns at play here.

" Supplying this power is a pair of 8pin PCIe power sockets, which means on paper the GTX 680 Classified can safely draw up to 375W. In practice it’s not clear whether GK104 can actually take that, at least with air cooling, so pushing this card much beyond 300W is mostly in the realm of hardcore water and liquid nitrogen overclockers. "

blah blah blah blah: is overpriced imo. It's not like the extra 100MHz or 4GB of VRAM

blah blah blah blah: To make it even more laughable

blah blah blah blah: Those clocks are nothing special

blah blah blah blah: This seems like a marketing exercise

We were discussing this card's value vs. the GTX 680 Lightning. It's worse than the Lightning in overclocking, price and noise levels. Compared to the 7970, it's ridiculously overpriced and will get beaten by a 1250MHz 7970. So there are at least 2 better options on the market: the GTX 680 Lightning and the Sapphire Vapor-X 7970 GE. Plus, at these prices, you can now get HD 7950 Crossfire or catch a sale on 2 GTX 670s for $340 each. Sorry, but $660 for the Classified is a rip-off.

Nice link to a deactivated product in which 4/9 of the people that bought it had trouble. I wonder why it was deactivated? Speaks to the quality of your entire post. "Dinky heatsink". Did you read any of the article at all?

Yeah, I'm going to be spending all my video card money on Galaxy products, for sure. I especially like that their card had one DisplayPort and 3 HDMI ports. I'm sure I'm going to run 3 TVs off it.

Alright, so you must realize that:

1) You will never see the full potential of this card with only air or normal liquid cooling.

2) Because of #1, anyone who isn't into phase change/LN2/sub-zero cooling thinks this is a waste.

You must realize that this thing has 18 power phases O.o like holy crap. The only real perk this card has is its amazing ability to reach extreme clocks at low voltages, provided you use sub-zero cooling. Anything else is just not adequate. So yes, for 90% of people this thing is a 660 dollar brick; for us extreme overclockers, it's a godsend! Just look at Kingpin and what he got out of this beast. For you gamers out there who only use at most normal liquid cooling, a 680 FTW+ is probably all you will want to pay for. Anything else (like this card) is useless to you. So yes, amazing card, small market target, but they made it anyway, which is rare for a company to do!

This isn't really intended for ordinary gamers, but rather overclockers using exotic cooling. In that case, the overclocking features this card provides make it a far more valuable card to them in comparison to reference cards.

As far as the cooling solution goes, that's just your opinion (actually, from what I've read it's wrong, because the Lightning gets warmer), and a lot of people aren't going to like MSI's because they want the warm air moved out of the case.

The big kicker for me though is the 4GB of memory; if you plan on running 3x 2560x1440, 2GB just isn't enough. I'm an MSI fan, but I can't use their product to fill my needs. If I want 4GB and "unlocked voltage" my only choice is the EVGA Classified.

Probably because stable memory overclocking is difficult to achieve when you are trying to drive double the VRAM. Since 4GB of VRAM seems to be overkill, keeping 2GB of VRAM and increasing memory clocks would probably have been more worthwhile, although it doesn't quite have the same marketing ring to it as "4GB".

I wondered where all the blabbering amd fanboys skittered off to in their constant 3GB ram drone psychosis....

Let me just share a quote:

"The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GBs of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixel resolution) there was just no measurable difference."

LOL

So now the blabbering jerks will yap about cost, complain about the 7970 6GB being "superior" and have the most enormous and gigantic brain fart concerning their endlessly godless and irritatingly stupid 3GB RAM superiority dance vs the 2GB 680/670.

It's a freaking TOTAL BLACKOUT at alcoholic blood toxic death level.

Just wait, because no amount of evidence will do it for the amd fanboy, and their masters at amd have known this for years, and have been playing them like a retarded out of tune fiddle gets played. A week or a day on they will be back at it, on some other article, any webspot they land... and the brain fart will be what they are not even aware of. It's clear how Hitler came to power.

What does midrange mean to you? The 460, 560, 560 Ti, 570 and 580 have "midrange" covered... unless of course you mean non-mid midrange, or middle lower midrange, or a range of ranges unranged, of which there are none....

WHAT are you people expecting? What cards exactly is this mythical purportedly missing midrange supposed to fall in between for you?

I'm serious, it's been many, many months, but by logic alone, there isn't a card spot you so desire, and by absolute omission for just as long....

What the freak do you people expect? The only thing I can possibly imagine is a "midrange card" that falls above the 580, above the 7870, above the 6970, below a stock 670... and costs perhaps "$150-$200" for your "midrange budget" - right?

I don't get it. Won't one of you midrange wannabes explain it - sometime before it appears, or is the fantasy supposed to be an absolute mystery forever?

What do we expect?? Mid-range prices with last-gen top-of-the-line performance but new-gen power consumption and temperatures... Seems pretty clear to me. OK, you need an example... the GTX 560 Ti: between GTX 470 and 480 performance, but less power and lower temperatures, so I don't have to change my power supply if I go SLI, nor change my case - in the end, I save some money, game just as well, and overclock better.

Everything that came out from nvidia in THIS very generation is overkill for gaming at 1080p, and that's the most used resolution in the whole freaking world, end of discussion... who do you believe you are, criticizing everyone's desires/needs? GOD?

Remember when the 4850 first came out as a $250 card... yet eventually ended up as a $100 card. Even today's modern $100 video cards are not much faster than the 4850... and that card is over 4 years old!

If we go by the usual scale of GPU performance increases at targeted price points...

For today, we should be able to get the performance of a 5870 card at a $100 price.

What do we have? The 6870 is slightly slower than the 5870 (great model naming there AMD - idiots), it costs about $155. (okay, the 5870 was a $500 card).

The smaller and cheaper-to-make 7850 is slightly faster than the 5870, but it costs about $225?! The 6870 is a better deal since it's $75 cheaper yet only about 7% slower.

So realistically, the $130 7770 is over-priced, as it's two-thirds the performance for a $20-30 savings over the older 6870.

Of course, the 5770/6770 and 7770 are all pretty much the same card... not impressive.
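To line those claims up, here's a quick perf-per-dollar sketch using the rough prices and relative-performance figures from the comment above (the commenter's approximations normalized to an HD 5870 = 1.00, not measured data):

```python
# Rough perf-per-dollar comparison using the figures quoted above.
# "perf" is relative to an HD 5870 = 1.00 (the commenter's approximations).
cards = {
    "HD 6870": {"price": 155, "perf": 0.93},  # "about 7% slower" than the 5870
    "HD 7850": {"price": 225, "perf": 1.05},  # "slightly faster" than the 5870
    "HD 7770": {"price": 130, "perf": 0.62},  # roughly 2/3 of the 6870's performance
}

for name, c in cards.items():
    perf_per_dollar = c["perf"] / c["price"]
    print(f"{name}: {perf_per_dollar * 100:.2f} relative perf per $100 spent")
```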

I agree with you that most of the performance increase in GPUs has happened in the $250+ level. Although HD7850 OCed = GTX580. The 7850 can be found for $200-230 no problem and GTX580 cost $500 just 1.5 years ago. So it is progress, just not as fast as in the past.

It's too expensive to make fast GPUs at the ~$100 level. If you can only afford $100-130 GPUs, I think you are better off just getting a PS4 or the next Xbox. The allure of the PC is the games you can't play on consoles, the controls, mods, better graphics and much cheaper game prices. $300 for a GPU isn't expensive when you consider the prices of games on the PC.

But ya, I agree with you that the HD 7750/7770 are a joke. The latter is just 25-30% faster than a 2.5-year-old HD 5770. NV has nothing worth buying for less than $400 (GTX 670). I guess that's what happens when wafer prices rise and the market for <$100 GPUs disappears.

I paid $190 for the GF 7600GT with the extra-large cooler to reduce noise (exhausts heat out the back)... and I laughed when the reviewers complained about the dual-slot design being a "problem"... WTF?! Blowing heat out is better than blowing heat off the GPU and having it stay inside the case.

After that, I spent $85 on the ATI 4670 with the HIS blower... With the way PC gaming is, I don't see the value of spending a dime over $200. And considering it's been 3 years since the ATI 5000 series, the 7850 should be a $150 card at the most.

Yes, I'm planning on the PS4 to replace my PC gaming and to rid me of Windows. NO PC games = Why use Windows?

Console gaming has its appeals. Sitting on a nice couch in front of a 50-60 inch LED/Plasma after a long day's work is often more comfortable than gaming on a chair at a desk. However, that PS4 won't be $150, probably more like $400-500.

Eh, what does your screen have to do with the rest of the hardware? I've been playing my PC games on a couch 2m away from a 47" TV for the last 5 years, a lot of them with a wireless Xbox 360 controller as well, at least when I feel the extra precision of a mouse is not needed, and always at a resolution and detail settings much better than the console alternative. I only play exclusives on the actual Xbox 360. There is no way in hell I will ever consider console gaming a serious option.

So the short answer is that the memory requirements on Ultra are so high that we wouldn't be able to test most of our previous-generation 1GB cards at 1920 if we used it. I did want to have Ultra in there somewhere so that was the compromise I had to make to balance that with the need for a useful test at 1920.

At the same time, it would be pretty useful to see if the GTX 570/580 run out of VRAM in Shogun at Ultra settings at 1080p. What if the GTX 660 Ti only has 1.5GB of VRAM? We'd want to know if 1.5GB is already starting to become a bare minimum in games :)

The 570 and 580 don't run out, but the 5750, 5870, 6950 1GB and 6970 1GB do. A lot of amd fans have those 1GB cards because, as usual, the amd fan is all about scrimping pennies and claiming they have the best anyway. Sad, isn't it.

Sadder is the 1920x1200 rez they use here, which allows crap amd cards to lose by less, when most people have 1920x1080, where nVidia stomps on amd even harder - because, as usual, amd fanboys are hacking away over pennies and buy the much cheaper and far more common 1920x1080 monitors instead of 1920x1200, saving $50 minimum and more like $100+.

While EVGA's cooler is an improvement over stock, I wonder how a capable card like this would perform if paired with a high-performance cooler like the Arctic Accelero Xtreme III. Kepler-based cards drop their boost clocks above 70°C to compensate for increased leakage, so it would be interesting to see how fast this card could get while staying below that mark. Even at maximum RPM the fans would probably be quieter than this one.
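For anyone curious what that 70°C behavior looks like, here's a minimal sketch assuming a simplified model of GPU Boost's thermal response; the bin size and step rate are illustrative guesses, not NVIDIA's actual algorithm:

```python
# Simplified illustration of why keeping a Kepler card under ~70C helps boost clocks.
# GPU Boost steps the clock down in small bins as temperature climbs; the 70C
# threshold matches the behavior described above, but the 13MHz bin size and
# "one bin per 3C" rate are assumptions made up for this sketch.
def boosted_clock_mhz(max_boost: float, temp_c: float,
                      throttle_start_c: float = 70.0, bin_mhz: float = 13.0) -> float:
    if temp_c <= throttle_start_c:
        return max_boost
    bins_dropped = int((temp_c - throttle_start_c) // 3) + 1
    return max_boost - bins_dropped * bin_mhz

for temp in (65, 72, 78, 84):
    print(f"{temp}C -> {boosted_clock_mhz(1150, temp):.0f}MHz")
```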

Amd cards never give the same performance as they lack so many features. You can perhaps, if you're lucky, get an fps-only equivalent in a few old games, or a hacked equivalent with crappy IQ that I'm sure you cannot see anyway, and in that case your power/performance is a big fat loser too - we cannot suddenly forget that for just this latest round when it was the most important point ever made for several years just prior, now can we... pffffft!~

Not with this card. When you buy reference for liquid cooling, you can't go wrong with EVGA. Best cards around. When you buy an EVGA Hydro Copper - you can't go wrong. But the EVGA Classified cards are usually just highly overpriced reference designs. Yes, there are tweaks here and there, but for max performance [on air] out of the GTX family, most people [including my humble person] go to the MSI TwinFrozr III Lightning/EX or the Asus 3-slot bricks (name escapes me).

Lately EVGA has been sliding with their top offerings. The SR-X motherboard is a cruel joke when compared to ASUS's dual-CPU creation, and now this. Another misfire.

But I think EVGA doesn't care too much. They have devoted customers who buy everything EVGA without thinking...

I remember being sick to my stomach seeing the same old red red red red red PCB on them all. Finally one amd fan promoter claimed he had a blue-PCB amd card and linked a pic, but it had the same old sad red square cover with the black lines.

I do realize when the amd double D breast design recently hit many fanboys went into some sort of sexually perverse mental mode, but that shouldn't wipe out the endless years of amd standard fare we were all tortured with.

In the case of this card, there's a lot of white on the outside I haven't seen anywhere else; the white "top" with printing will be staring at you out of the case, something so many cards have been oblivious to for far too long... then we also have the black carbon look - another unusual feature, although with the fanboyism over anything and everything black, that is understandable, as I'm sure their PR boys figured that part a clear win, sadly enough.

With 4GB of RAM it seems like it's almost intended to be the ultimate Second Life card; powerful enough to handle that app's mediocre but insanely demanding graphics, with the RAM to hold all the hundreds of overly high-resolution textures plastered onto every visible surface.

But once you put it in the case - usually within a few minutes of an insane "unboxing" session, much like a religious pilgrimage, with a few absolutely boring YouTube minutes somehow considered a "treat" by the disturbed (of which there are many) - you put on the side cover... never to really see it again in its full glory, until its death.

What you will see is the big fat WHITE label and red Classified printing jamming at your face if you have a side window... clearly the most important aspect - even though 98% don't have a window to look through... but if you do - you're set.

Don't mind me - I'm still amazed how "the feel" of some look makes it or breaks it for 99% of the retarded humans that surround me - especially when "the looking" is done like .000001% of the time as in the case of these video cards.

It must have to do with their estrogen levels I tell myself, or maybe they don't have a girlfriend and that's why...

The instant I saw the original 680 I said that the 256-bit memory bus was going to limit it severely. Even before I saw any other stats for the thing I knew I'd never buy one. Nvidia was cheap when they released the 680: they saw what the 7970 was putting out and said "we'll call our 660 midrange part our 680 high end and we can make more money" (also love the fact that you guys test the handful of games where amd's 7 series beats the nvidia 6 series... not cherry-picking your benchmarks at all, nooo).

This card does push the 680 to its limit, which is cool and all, but it just proves that a) the 256-bit memory bus is still a midrange-card designator, no matter how much they claim GDDR5 is fast enough to not need more than that... it does need more. And b) Nvidia could have pushed the 680's base clock up much higher and, while it would still be badly bottlenecked, it would have been more attractive.
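For context, the bandwidth math behind that 256-bit complaint is straightforward (a quick sketch using the reference GTX 680 memory spec of roughly 6 GT/s GDDR5, with the 7970's 384-bit / 5.5 GT/s configuration as the comparison point):

```python
# Memory bandwidth arithmetic behind the bus-width argument above.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    # bytes per transfer (bus width / 8) times transfers per second
    return (bus_width_bits / 8) * data_rate_gt_s

print(bandwidth_gb_s(256, 6.0))   # GTX 680: 256-bit @ ~6.0 GT/s -> 192 GB/s
print(bandwidth_gb_s(384, 5.5))   # HD 7970: 384-bit @ 5.5 GT/s -> 264 GB/s
```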

To be fair, AMD started the gouging with the 7970 series, its moderate boost over the 580 series, and its modest mark-up over that line.

When nVidia saw what AMD had launched, they must have laughed and rubbed their hands together with glee. Because their mainstream part was beating it and it cost them a LOT less to make. So they COULD have passed those savings onto the customer and launched at nominal pricing, pressuring AMD with extremely low prices that AMD could probably not afford to match...

...or they could join with the gouging. They joined with the gouging. They knocked the price down by $50 and AMD's pricing (besides the 78xx series) has been in a freefall ever since.

You people are using way too much tin foil; it's already impinged bloodflow to the brain from its weight crimping that toothpick neck... at least the amd housefire heatrays won't further cook the egg under said foil hat.

Since nVidia has just barely gotten a few 680's and 670's in stock on the shelves, how, pray tell, would they produce a gigantic 7-billion-transistor chip that it appears no forge, even the one in Asgard, could possibly have produced on time for any launch, up to and perhaps past even today?

See that's what I find so interesting. Forget reality, the Charlie D semi accurate brain fart smell is a fine delicacy for so many, that they will never stop inhaling.

Wow.

I'll ask again - at what price exactly was the 680 "midrange" chip supposed to land? Recall the GTX 580 was still $499+ when amd launched - let's just say, since nVidia was holding back according to the 20lbs of tinfoil you guys have lofted, they could have released the GTX 680 as midrange when their GTX 580 was still $499+ - right when AMD launched... so what price exactly was the GTX 680 supposed to be, and where would that put the rest of the lineups on down the price alley?

Has one of you wanderers EVER contemplated that? Where are you going to put the card lineups with the GTX 680 at the $250-$299 midrange in January? Heck... even right now, you absolute geniuses?

One good reason not to have it is the fact that software overclocking can sometimes be rather wonky. I can see Nvidia erring on the cautious side to protect their customers from untidy programs.

EVGA is a company I want to love, but they are, in my opinion, one that "almost" goes the extra mile. This card is a good example, I think. Their customers expressed a desire for unlocked voltage and 4GB cards (or "more than 2GB"), and they made it for us.

But they leave the little things out. Where do you go to find out what those little letters mean on the EVBot display? I'll tell you where I went - to this article. I looked in the EVBot manual, looked up the manual online to see if it was updated - it wasn't; scoured the website and forums, and nowhere could I find a breakdown from EVGA of what the list of voltage settings meant!

I'm not regretting my purchase of this card; it is a very nice piece of hardware. It just doesn't have the 100% commitment behind it that a piece of hardware like this should.

But then, EVGA, in my opinion, does at least as well as anybody. MSI is an excellent company, but they released their Lightning that was supposed to be over-voltable without a way to do it. Asus makes some of the best stuff in the business - if their manufacturing doesn't bungle the job and leave film that needs to be removed between heatsinks and what they should be attached to.

Cards like this are necessarily problematic. To make them worth their money in a strict results sense, EVGA would have to guarantee they overclock to something like 1400MHz. If they bin to that strict of a standard, why don't they just factory overclock to 1400 to begin with?

And, what's going to be the cost of a chip guaranteed to overclock that high? I don't know; I don't know what EVGA's current standards are for a "binning for the Classified" pass, but my guess is it would drive the price up, so that cost value target will be missed again.

No, you can judge these cards strictly by value for yourself, that's quite a reasonable thing to do, but to be fair you must understand that some people are interested in getting value from something other than better frame rates in the games they are playing. For this card, that means the hours spent overclocking - not just the results, the results are almost beside the point, but the time spent itself. In the OC world that often means people will be disappointed in the final results, and it's too bad companies can't guarantee better success - but if they could, really what would be the point for the hard-core overclocker? They would be running a fixed race, and for people like that it would make the race not worth running.

These cards aren't meant for the general-population overclocker that wants a guaranteed more "bang for the buck" out of his purchase. Great OCing CPUs like Nehalem and Sandy Bridge bring a lot of people into the overclocking world that expect to get great results easily, that don't understand the game it is for those who are actually responsible for discovering those great overclocking items, and that kind of person talks down a card like this.

Bottom line - if you want a GTX 680 with a guaranteed value equivalent to a stock card, then don't buy this card! It's no more meant for you than a Mack truck is meant to be a family car. However, if you are a serious overclocker that likes to tinker and wants the best starting point, this may be exactly what you want.

Nvidia wasn't happy with the partners' designs, eh? Oh please. We all remember the GTX 480. That was Nvidia's doing, including the reference card and cooler. Their partners, the ones who didn't use the awful reference design, did Nvidia a favor by putting three fans on it and such.

Then there's the lack of mention of Big Kepler on the first page of this review, even though it's very important for framing since this card is being presented as "monstrous". It's not so impressive when compared to Big Kepler.

And there's the lack of mention that the regular 680's cooler doesn't use a vapor chamber like the previous generation card (580). That's not the 680 being a "jack of all trades and a master of none". That's Nvidia making an inferior cooler in comparison with the previous generation.

I, for one, find the 3rd to the last paragraph of the 1st review page a sad joke.

Let's take this sentence for instance, and keep in mind the nVidia reference cooler does everything better than the amd reference: "Even just replacing the cooler while maintaining the reference board – what we call a semi-custom card – can have a big impact on noise, temperatures, and can improve overclocking."

One wonders why amd's epic failure in comparison never gets that kind of treatment.

If nVidia doesn't find that sentence I mentioned a ridiculous insult, I'd be surprised, because just before that, they got treated to this one: "NVIDIA’s reference design is a jack of all trades but master of none".

I guess I wouldn't mind one bit if the statements were accompanied by flat out remarks that despite the attitude presented, amd's mock up is a freaking embarrassingly hot and loud disaster in every metric of comparison...

I do wonder where all these people store all their mind bending twisted hate for nVidia, I really do.

The 480 cooler was awesome because one could simply remove the GPU sink and still have a metal-covered front on the PCB, so a better GPU heatsink would solve the OC limits - and the card was already 10-15% faster than the 5870 at stock and gained more from OC than the 5870 did.

Speaking of that, we're supposed to still love the 5870; this site claimed the 5850 that lost to the 480 and 470 was the best card to buy, and to this day our amd fans proclaim the 5870 a king, compare it to their new best-bang 6870 and 6850 that were derided for lack of performance when they came out, and now 6870 CF is some wonderkind for the fanboys.

I'm pretty sick of it. nVidia spanked the 5000 series with their 400 series, then slammed the GTX 460 down their throats to boot - the card no amd fan ever mentions now - pretending it never existed and still doesn't exist... It's amazing to me. All the blabbing stupid praise about amd cards, and they either don't mention nVidia cards or just cut them down and attack - since amd always loses, that must be why.

Nvidia cheaped out and didn't use a vapor chamber for the 680 as it did with the 580. AMD is irrelevant to that fact.

The GF100 has far worse performance per watt, according to techpowerup's calculations, than anything AMD released at 40nm. The 480 was very hot and very loud, regardless of whether AMD even existed in the market.

AMD may have a history of using loud, inefficient cooling, but my beef with the article is that Nvidia developed a more efficient cooler (the 580's vapor chamber) and then didn't bother to use it for the 680, probably to save a little money.

The 680 is cooler, quieter, and higher performing, all at the same time, than anything we've seen in a long time, hence "your beef" is a big pile of STUPID dung, and you should know it, but of course, idiocy never ends here with people like you.

Let me put it another way for the faux educated OWS corporate "profit watcher" jack***: " It DOESN'T NEED A VAPOR CHAMBER YOU M*R*N ! "

Hopefully that penetrates the inbred "Oxford" stupidity.

Thanks so much for being such a doof. I really appreciate it. I love encountering complete stupidity and utter idiocy all the time.

Something the regular-joe moronic masses cannot seem to comprehend for the life of them is that when the core peters out before the RAM can be effectively used while maintaining a playable framerate, no amount of memory, no matter how much, "can help".

Let me put it another way:

The card makers need a more powerful core to use more than 2GB of memory. (Actually less than 2, but I won't go into that.)

The results are all over the web, and have been for months. No one should still be so utterly blind to the disclosed facts, still.
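A crude way to picture that argument, as a sketch with purely illustrative numbers (the point is just that extra VRAM only matters once the working set spills, and by then the core is usually the limit anyway):

```python
# Toy model of the "core runs out before the extra RAM matters" argument above.
def effective_fps(core_limited_fps: float, working_set_gb: float,
                  vram_gb: float, spill_penalty: float = 0.25) -> float:
    # If the frame's working set fits in VRAM, the core sets the frame rate.
    if working_set_gb <= vram_gb:
        return core_limited_fps
    # Otherwise textures stream over PCIe and performance collapses.
    return core_limited_fps * spill_penalty

# At settings heavy enough to need more than 2GB, this core is already under
# 30fps, so the 4GB card "wins" a comparison nobody would actually play at.
print(effective_fps(28, 2.6, vram_gb=2))   # 7.0  (2GB card, swapping)
print(effective_fps(28, 2.6, vram_gb=4))   # 28.0 (4GB card, still core-limited)
```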

I tried 2 MSI 680 Lightnings and the temps go up to 75-76°C on each card, while 2 680 reference cards reach 72°C. So you guys forget that for SLI users, you have to get cards with a blower - unless you have watercooling. I know the price is not ideal, but for people like me who are looking for a couple of 680s with an overclock and a proper watercooler-free solution, this is the card to go for. You could go for an MSI Lightning, but when you put in another card, temps will go up 15-20° on each card, and I don't like playing at 76-78°C, plus the noise.