GeForce GTX 670 Conclusion

IMPORTANT: Although the rating and final score mentioned in this conclusion are made to be as objective as possible, please be advised that every author perceives these factors differently at various points in time. While we each do our best to ensure that all aspects of the product are considered, there are often unforeseen market conditions and manufacturer changes which occur after publication that could render our rating obsolete. Please do not base any purchase solely on our conclusion, as it represents our product rating specifically for the product tested, which may differ from future versions of the same product. Benchmark Reviews begins our conclusion with a short summary for each of the areas that we rate.

GeForce GTX 670 replaces the GTX 570 in NVIDIA's product stack, and based on our test results the performance differences between them are night and day. On average the GeForce GTX 670 delivers an impressive 45% increase over the GTX 570, and occasionally reached as high as 70% over its predecessor. NVIDIA has designed the GTX 670 to operate faster, offer more features, deliver more functionality, use less energy, and generate less heat... all things we proved it succeeds in achieving. Based on its $399 price tag the GeForce GTX 670 competes with the AMD Radeon HD 7950; however, after running each video card through several different benchmark tests, our FPS results often favored NVIDIA's GeForce GTX 670 over the more expensive AMD Radeon HD 7970. Let's look at the break-down:

In the DirectX 10 game Crysis Warhead, the GeForce GTX 670 was ahead of its predecessor (the GTX 570) by a full 11 FPS at 1920x1080, while still leading the AMD Radeon HD 7970 by 4 FPS. DirectX 11 tests also had the GeForce GTX 670 ahead in most cases. The demanding DX11 graphics of Batman: Arkham City made use of Kepler's optimized architecture, delivering a staggering 18 FPS lead for the GeForce GTX 670 over the more expensive Radeon HD 7970. Battlefield 3 continued the run, pushing the stock GTX 670 more than 8 FPS beyond the Radeon HD 7970. Lost Planet 2 played well on all graphics cards when set to high quality with 4x AA, yet the GeForce GTX 670 still surpassed Radeon HD 7970 performance by 6 FPS. In one of the few exceptions, Aliens vs. Predator gave the lead back to AMD Radeon products over their NVIDIA counterparts. Metro 2033 is another demanding game that requires high-end graphics to enjoy quality settings, and like AvP this game benchmark favors Radeon products.
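To put raw FPS gaps like these into perspective, the relative uplift can be computed directly. A minimal sketch follows; the FPS figures below are hypothetical placeholders, not values from our charts:

```python
def pct_uplift(new_fps: float, old_fps: float) -> float:
    """Percentage improvement of new_fps over old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical example: 55 FPS on the newer card vs. 44 FPS on the older one
print(pct_uplift(55, 44))  # 25.0
```

An average uplift figure like the 45% quoted above would typically be this ratio computed per game and then averaged across the benchmark suite.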

Appearance is a much more subjective matter, especially since this particular rating doesn't have any quantitative benchmark scores to fall back on. NVIDIA's GeForce GTX series has traditionally used a recognizable design over the past two years, and with the exception of more angular corners, the GTX 670 looks very similar to the GTX 570 model. Some add-in card partners may offer their own unique cooling solution design, but this might not happen with the GeForce GTX 670 since it operates so efficiently and exhausts nearly all of the heated air outside of the computer case. Expect most partners to dress up the original reference design by placing exciting graphics over the fan shroud or using colored plastic components. While looks might mean a lot to some consumers, keep in mind that this product outperforms the competition while generating much less heat and producing very little noise.

Construction is one area where NVIDIA continually shines, and thanks in part to extremely quiet operation paired with more efficient cores that consume less energy and emit less heat, I'm confident that the GeForce GTX 670 will continue this tradition. Requiring two 6-pin PCI-E power connections helps keep this video card compatible with most power supply units, while tweaking heatsink and fan placement to optimize cooling performance proves there are still ways to improve on a commonplace technology. The GeForce GTX 670 has one of the shortest PCBs we've seen from a GTX-series model, which further reduces heat output and makes this a product suitable for more robust HTPC applications. Better yet, consumers now have a single-GPU solution capable of driving three monitors in 3D Vision Surround, thanks to the inclusion of two DL-DVI ports with supplementary HDMI and DisplayPort outputs.

Defining value at the premium-priced high-end segment isn't easy, because hardware enthusiasts know that they're going to pay top dollar to own the top product. Even still, rating value is like chasing a fast-moving target, so please believe me when I say that prices change by the minute in this industry. The GeForce GTX 670 "Kepler" graphics card demonstrates NVIDIA's ability to innovate the graphics segment while maintaining a firm lead in their market, but it comes at a cost. The NVIDIA GeForce GTX 670 shares the same $399 price segment with AMD's Radeon HD 7950, yet performs like the 7970. So with regard to value, the GeForce GTX 670 delivers more features and better performance than the less-powerful AMD Radeon HD 7950, yet occasionally meets or exceeds the performance of the $550 Radeon HD 7970. Even if we ignore the GTX 670's faster FPS results, its features and functionality are off the chart. Furthermore, only NVIDIA's video cards offer multi-display 3D gaming, Adaptive VSync, PhysX technology, GPU Boost, FXAA, and now TXAA.
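As a rough way to frame the value argument above, performance can be normalized by price. A quick sketch using the prices cited in this article ($399 for the GTX 670, $550 for the Radeon HD 7970); the FPS figures are hypothetical placeholders:

```python
def fps_per_dollar(fps: float, price_usd: float) -> float:
    """Simple value metric: frames per second per dollar spent."""
    return fps / price_usd

# Prices from the article; the FPS figures below are hypothetical.
gtx_670 = fps_per_dollar(60.0, 399.0)
hd_7970 = fps_per_dollar(62.0, 550.0)

# Even trailing slightly in raw FPS, the cheaper card wins on value
print(gtx_670 > hd_7970)  # True
```

A metric this simple ignores features like PhysX or GPU Boost, but it makes the price/performance gap between the two tiers easy to see.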

As of launch day 10 May 2012, GeForce GTX 670 is available at Newegg from several NVIDIA partners:

While the NVIDIA GeForce GTX 680 may be the reigning champion, the GTX 670 saves gamers $100 while delivering remarkably similar performance. NVIDIA's 28nm GK104 'Kepler' GPU has made a huge difference in power consumption and heat output, features that really have my attention. It won't surprise me if enthusiasts find themselves divided on their purchase: overclock the GeForce GTX 670 to perform like a GTX 680 (which we may test in a follow-up article), or combine two into an SLI set. Regardless, the performance is there and it reinforces value. Still, I think most people are waiting to see what the GeForce GTX 650/660 will offer.

So what do you think of the NVIDIA GeForce GTX 670 graphics card, and are you planning to buy one?

Hey Olin, great job! Would love to see some overclocks and SLI benchmarks at higher settings like Enthusiast Crysis Warhead, and why not turn on PhysX in the Batman benchmark? Also, I was wondering if Battlefield 3 on three monitors with 3D Vision will be as memory hungry, or if I should wait for the upcoming 4 GB cards?

Go to the toughest bench first -> Metro 2033: 3 frame difference
Go to the "will it play" nerfed title -> Crysis 2: 3 frame difference
Go to G-buffer tech to see a memory-intensive app -> Battlefield 3: 3 frame difference!
Finish off with a PhysX example/open city that opened with shaky driver support: Arkham City... 3 frame difference

Perhaps a Metro 2033 / Battlefield 3 3D Vision + NV Surround test with dual/tri (quad?) SLI is needed to show more advantage not yet exposed? Otherwise, I sure am glad there hasn't been any GTX 680 stock.

(I suppose a new wave of powerful games that will not be as nerfed by console economics, once the next gen of consoles is released, will validate the $100 difference as well... But I would suspect that Metro 2033 should have been enough of a burden for the GTX 680 to prove the advantage of its muscle??)

Was a GK100 chip or GTX 780 rumor ever accounted for?? It would seem reasonable, looking at the results of this bench, that perhaps some of the rumors have some truth? Perhaps the GTX 680 is a different flagship class than the 480 and 580??? (Otherwise the GTX 570 was virtually equal to a 480... compared to the KO "beatdown" the 670 gives the 580? Boom!)

Don't know why the article starts with a concentration on comparing against the 570 it replaces? The obvious story is: why is this card practically a GTX 680, with the same GK104 and the same memory?? Just turned down slightly? Better power requirement handling under load? While at lower temperature under load? (With a 3 frame difference at the max settings given, I would love to see if it can overclock as well?)

If it can... then at $300 cheaper, I can easily see a GTX 670 tri-SLI setup being the big story of the year (if it scales well?)

Just seems so weird at this price point to actually dive in just yet. But if nothing develops that explains or ruins the ridiculous value... after X-mas I will definitely tri-SLI the 670??? Unbelievable!

Considering I pulled the brakes on Kepler's launch cuz the air stank of fish... knowing that I might have saved $300 on my tri-SLI replacement makes me so happy I have spent the past round of benchmarks squealing in glee like a little girl riding a pretty pink pony bareback! (Some of which show even less of a divide, like Tom's showing less than the 3 frame difference in Metro 2033, even at the extreme rez of 2560x1600!!)

Sure is hard for me to consider the possible implications "pointless"? Though I suppose current developments might just be dumb luck. But let us assume you are exactly right? In which case, what happens to GTX 680 production? How can they possibly sell any more of them at $100 more now? What kind of idiot would you have to be? (I suppose some might just blindly trudge on despite the obvious value disparity?) When single-GPU flagship pride has been a point of prestige as much as anything else!

And in light of the recent Jon Peddie Research numbers showing the projected multi-billion dollar ($23 billion) PC gaming market explosion driven by performance gaming systems. Where according to their research: the enthusiast market of systems costing over $3,400 a build represents as much of the final dollar share in the end as the performance systems costing over $1,000 a build! And each of those alone is more than double the market's dollar share compared to the ghetto gamer (gaming builds costing about $800), who invariably QQ's the loudest in support of their needs while contributing the least, to the detriment of the gaming experience suffered by the performance-and-above market who actually created this explosion in the first place!

In other words, the low and mid-range users at the end of this console cycle are poised to become the new least common denominator that brings down the level of gaming for the rest of us! (Now that the blame soon cannot be leveled squarely at the consoles.) In light of the new PC gaming revolution, the last thing NVIDIA would want to do is lose credibility on their flagship model when the enthusiast market numbers show how important they actually are. (Unless they are going to play smoke and mirrors and just promote their dual-GPU flagship and the GTX 670, and ignore the GTX 680?)

That strategy of simply ignoring the GTX 680 actually works for me, come to think of it? Just as long as there is a price drop now on the GTX 690, so original GTX 680 hopefuls do not do something really dumb out of anger and haste like "go with AMD". That would be a tough choice between them. I am kind of excited now to see what results from the upcoming 660 benchmarks? At this point maybe we will see another deal, where the card is too powerful compared to the 670!

It could happen. I haven't gone cheap in a long time. It would be awesome actually? Considering Maxwell seems to be creeping up incredibly fast?

But then again, Metro 2033 was actually 4 frames not 3! So maybe the 680 did "start" to show its muscle by pulling away under the burden of Metro 2033? (So perhaps a stress test representing the next-gen 2.0 upcoming market, with the added burden of extreme resolution and 3D Vision?)

That Metro 2033's burden still only resulted in a 1 frame difference does not seem to inspire much confidence though? (Heck, 1 frame isn't enough concern for most users to give any credence to a PCI-E x8 speed difference.)

I had hoped that the GTX 660 Ti would be out (around May 10, 2012) as suggested in related news, but it appears that NVIDIA wants to wait until Q3 to release it. The price difference ($249 for the 660 versus $399 for the 670) makes the GTX 660 a more economical choice for me - I still can't justify spending $400 for the screaming GTX 670, even with these really good reviews. I suppose I should be glad that the Kepler cards aren't all available yet, and limp along with my gimpy old Radeon HD 5770 and 4650, but as I watch the GPU temperature on the 5770 hover around 90C, I start getting really nervous - I've already moved all my games over to the older 4650 and only use the 5770 for word processing and the like. Excellent review, Olin - it really makes me want one of these new Kepler cards.

I've already done that - took the card out, disassembled it, blew compressed air through the fan, cleaned every surface, applied fresh heat sink compound, cleaned the case, checked all the case fans, made sure there was a blank slot between it and the other card, checked the temperature in the case (never exceeded 30C), made sure the GPU fan turned when the card was on (it does). I contacted ATI support about it, nothing - they only sent me an auto-reply. I tried to set the fan speed manually using CCC, but it doesn't do anything - the fan never changes speed. I'm using the latest driver (12.4). The fan may be defective, but it is still turning, and there is no easy way to replace it. What does work is to leave the case open and place a floor fan up against it to blow cold air on it. I wear a sweater and turn up the A/C, but my wife hates having the house so cold, so I can only use that tactic when I know she is going to be out for a few hours.

Hmmm, sounds like the only option open to you then is a third-party HSF for it, as 90C is a ridiculously high temp - mine never goes over 57C when folding, 54C when gaming, if I leave the fan set to auto. Arctic have some nice and reasonably priced HSFs for graphics cards; you may want to try one of those.

You gain greater longevity with redundancy... hence it is better to have two kids than one. If you have one, there's a greater likelihood it will die - unless with two, one kills the other... But lonely people are prone to suicide...