NVIDIA GeForce GTX 660 Ti 2GB Kepler Graphics Card Review

Another GK104 Option for $299

If you missed our live stream with PC Perspective's Ryan Shrout and NVIDIA's Tom Petersen discussing the new GeForce GTX 660 Ti, you can find the replay at this link!

While NVIDIA doesn't like us to use codenames anymore, very few GPUs are as flexible and as stout as the GK104. Originally released with the GTX 680, then with the dual-GPU beast known as the GTX 690, and THEN with the more modestly priced GTX 670, this single chip has caused AMD quite a few headaches. Things will only get worse for AMD with today's release of the new GeForce GTX 660 Ti, once again powered by GK104 and the Kepler architecture, this time at the $299 price point.

While many PC gamers lament the lack of games that really push hardware today, NVIDIA has been promoting the GTX 660 Ti as the upgrade option of choice for gamers on a 2-4 year cycle. Back in 2008 the GTX 260 was the mid-range enthusiast option, while in 2010 it was the Fermi-based GTX 470. NVIDIA claims GTX 260 users will see more than a 3x performance increase with the 660 Ti, all while generating those pixels more efficiently.

I mentioned that the GeForce GTX 660 Ti is based on GK104, and what you might find interesting is that its specifications are nearly identical to those of the GTX 670. Both utilize 7 SMXs for a total of 1344 stream processors, or CUDA cores, and both run at reference clock speeds of 915 MHz base and 980 MHz Boost. Both include 112 texture units, though the GeForce GTX 660 Ti does see a drop in ROP count from 32 to 24, and L2 cache drops from 512KB to 384KB. Why?
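The shader count above follows directly from the SMX count: each Kepler SMX on GK104 contains 192 CUDA cores. A minimal sketch of that arithmetic (the function name is ours, purely illustrative):

```python
# Sanity check on the Kepler shader math: GK104 packs 192 CUDA cores per SMX.
CORES_PER_SMX = 192

def cuda_cores(smx_count: int) -> int:
    """Total CUDA cores for a GK104 part with the given number of active SMXs."""
    return smx_count * CORES_PER_SMX

print(cuda_cores(7))  # GTX 660 Ti / GTX 670 with 7 SMXs -> 1344
print(cuda_cores(8))  # fully enabled GK104 (GTX 680) -> 1536
```

This is why the 660 Ti and 670 share the same 1344-core figure: the differentiation comes from ROPs, L2, and the memory bus, not the SMX count.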

The only other major specification to change is the memory bus width: the GeForce GTX 660 Ti uses a 192-bit memory bus while the GTX 670 and GTX 680 run at 256-bit. While the memory still runs at 6.0 GHz, total available memory bandwidth drops from 192.2 GB/s down to 144.2 GB/s, a decrease of 25%. How that plays out in actual gaming will be shown on the benchmark pages, but I can tell you that it altered performance more than I expected it to.
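Those bandwidth figures fall out of the bus width and the effective per-pin data rate: bytes per transfer (bus width / 8) times the 6.0 Gbps effective GDDR5 rate. A quick sketch of the math (function name is ours, for illustration):

```python
# Peak memory bandwidth: bus width in bytes times effective per-pin data rate.
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gbps

gtx670 = bandwidth_gbs(256, 6.0)    # 256-bit bus -> 192.0 GB/s
gtx660ti = bandwidth_gbs(192, 6.0)  # 192-bit bus -> 144.0 GB/s
drop_pct = (gtx670 - gtx660ti) / gtx670 * 100
print(gtx670, gtx660ti, round(drop_pct))  # 192.0 144.0 25
```

Cutting one of the four 64-bit memory controllers removes exactly a quarter of the bandwidth, which is the 25% gap between the two cards.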

Here is the full breakdown of specifications as detailed above, with the other notable change being a TDP decrease of 20 watts.

For this launch NVIDIA didn't send us any reference cards and instead we got partner overclocked solutions from Galaxy, MSI, EVGA and Zotac. (We are going to follow up this week with a quick roundup.) Because of that, our pictures are going to vary from card to card and design to design, but we'll use the MSI option for our descriptions here.

The output configuration could vary but we expect all the GeForce GTX 660 Ti cards to include a pair of DVI connections, a full size HDMI and a full size DisplayPort.

Like the GTX 670, the GeForce GTX 660 Ti only requires a pair of 6-pin PCIe power connections.

3-Way SLI support continues here as well!

If you remember, the reference GTX 670 used a shorter PCB; while our MSI and Galaxy cards for the GeForce GTX 660 Ti launch do NOT, the EVGA and Zotac options do, as seen here.

For gamers looking to pick up a GeForce GTX 660 Ti right away, you might be stoked to find out that NVIDIA is including a copy of Borderlands 2 with MOST of the retail units. Considering how successful we expect this game to be, that $50-60 value really is great to see for a launch like this.

NVIDIA sent along this video demonstrating Borderlands 2, with a PhysX emphasis of course; why not include it here to get you excited for the game?

Hi, I play on a 25" monitor and I'm going to buy a 660 Ti when it gets cheaper here in Brazil (it's very expensive here :\). My question: can I run the 660 Ti on a 400W power supply, or do I risk damaging it? My system is a Core i3 2120 with 8GB of Kingston memory. Thank you, and sorry for the bad English, I'm practicing.

What brand PSU is it? A Seasonic, Thermaltake, or Corsair 400W PSU would likely be okay, but the 660 Ti would prefer 450W+. Other than that, your next upgrade would likely be a quad-core i5, but even then the difference between an i3 and an i5 would be marginal.

Question for Tom:
--------------------
Do you think the 660 has been nerfed enough so that overclocked versions won't exceed a more expensive 670? Because that was a real problem with the 670 vs 680!
--------------------

At lower resolutions like 1080p, I don't think it would take much of an OC to get it to around 670 speed, but that disabled memory controller is a ~25% memory bandwidth hit, which would take a pretty sizeable OC to win back. I would say it could get into the 670 ballpark. This card was aimed at single-monitor people.

Unless I missed it in my rather quick glance at this review (which I discarded because of this), you should have included the 560 Ti. Did you just assume that no one was interested in the performance increase across the generation gap?

Should have used the 560 Ti, 570, and a 670 as benchmark competitors - this review is useless without them.

Can this one card be used to drive an HDTV and 3 monitors, in or out of Eyefinity? I want to display a TV on one screen and have 3-monitor computing - not hardcore gaming, but along those lines. I don't play many games, but I'm looking for a card that can do this. I currently run 3 heads off a 6950/70 and run the TV off the Z68 Gigabyte iSSD mobo. The current setup works, but I'm looking to upgrade...

I am planning a build which will use three 660 Tis in 3-way SLI, so I should find out just how much of an improvement that makes in the next year. Your next question is why? Well, I am not as interested in the performance gain as I am in getting cheap GPUs. I am building the computer mainly as a GPU compute platform; its ability to serve as a gaming platform is a perk. I am looking to take advantage of GPU processing capabilities for GIS work.

It's my understanding that a 660 Ti is a crippled GTX 670 on a 192-bit bus. So why can't these cards SLI together? Surely the 670 would reduce its bus width down to 192-bit. Before you ask what's the point (I picked up a 660 Ti for £80 on Black Friday on Amazon), two 660 Tis beat a GTX 670 in most YouTube benchmarks.