It’s hard not to notice that NVIDIA has a bit of a problem right now. In the months since the launch of their first Kepler product, the GeForce GTX 680, the company has introduced several other Kepler products into the desktop 600 series. With the exception of the GeForce GT 640 – their only budget part – all of those 600 series parts have been targeted at the high end, where they became popular, well-received products that significantly tilted the market in NVIDIA’s favor.

The problem with this is almost paradoxical: these products are too popular. Between the GK104-heavy desktop GeForce lineup, the GK104 based Tesla K10, and the GK107-heavy mobile GeForce lineup, NVIDIA is selling every 28nm chip they can make. For a business prone to boom and bust cycles this is not a bad problem to have, but it means NVIDIA has been unable to expand their market presence as quickly as customers would like. For the desktop in particular this means NVIDIA has a very large, very noticeable hole in their product lineup between $100 and $400, which covers the mainstream and performance market segments. These market segments aren’t quite the high margin markets NVIDIA is currently servicing, but they are important to fill because they’re where product volumes increase and where most of their regular customers reside.

Long-term NVIDIA needs more production capacity and a wider selection of GPUs to fill this hole, but in the meantime they can at least begin to fill it with what they have to work with. This brings us to today’s product launch: the GeForce GTX 660 Ti. With nothing between GK104 and GK107 at the moment, NVIDIA is pushing out one more desktop product based on GK104 in order to bring Kepler to the performance market. Serving as an outlet for further binned GK104 GPUs, the GTX 660 Ti will be launching today as NVIDIA’s $300 performance part.

                       GTX 680          GTX 670          GTX 660 Ti       GTX 570
Stream Processors      1536             1344             1344             480
Texture Units          128              112              112              60
ROPs                   32               32               24               40
Core Clock             1006MHz          915MHz           915MHz           732MHz
Shader Clock           N/A              N/A              N/A              1464MHz
Boost Clock            1058MHz          980MHz           980MHz           N/A
Memory Clock           6.008GHz GDDR5   6.008GHz GDDR5   6.008GHz GDDR5   3.8GHz GDDR5
Memory Bus Width       256-bit          256-bit          192-bit          320-bit
VRAM                   2GB              2GB              2GB              1.25GB
FP64                   1/24 FP32        1/24 FP32        1/24 FP32        1/8 FP32
TDP                    195W             170W             150W             219W
Transistor Count       3.5B             3.5B             3.5B             3B
Manufacturing Process  TSMC 28nm        TSMC 28nm        TSMC 28nm        TSMC 40nm
Launch Price           $499             $399             $299             $349

In the Fermi generation, NVIDIA filled the performance market with GF104 and GF114, the backbone of the very successful GTX 460 and GTX 560 series of video cards. Given Fermi’s four-chip product stack – specifically the existence of the GF100/GF110 powerhouse – this is a move that made perfect sense. However it’s not a move that works quite as well for NVIDIA’s (so far) two-chip product stack. In a move very reminiscent of the GeForce GTX 200 series, GK104 – already serving the GTX 690, GTX 680, and GTX 670 – is also being called upon to fill out the GTX 660 Ti.

All things considered the GTX 660 Ti is extremely similar to the GTX 670. The base clock is the same, the boost clock is the same, the memory clock is the same, and even the number of shaders is the same. In fact there’s only a single significant difference between the GTX 670 and GTX 660 Ti: the GTX 660 Ti surrenders one of GK104’s four ROP/L2/Memory clusters, reducing it from a 32 ROP, 512KB L2, 4 memory channel part to a 24 ROP, 384KB L2, 3 memory channel part. With NVIDIA already binning chips for assignment to GTX 680 and GTX 670, this allows NVIDIA to further bin those GTX 670 parts without much additional effort. Though given the relatively small size of a ROP/L2/Memory cluster, it’s a bit surprising they have all that many chips that don’t meet GTX 670 standards.

In any case, as a result of these design choices the GTX 660 Ti is a fairly straightforward part. The 915MHz base clock and 980MHz boost clock, along with the 7 SMXes, mean that the GTX 660 Ti has the same theoretical compute, geometry, and texturing performance as the GTX 670. The real difference between the two is on the render operation and memory bandwidth side of things, where the loss of the ROP/L2/Memory cluster means that the GTX 660 Ti surrenders a full 25% of its render performance and its memory bandwidth. Interestingly, NVIDIA has kept the memory clock at 6GHz – in previous generations they would lower it to enable the use of cheaper memory – which is significant for performance since it keeps the memory bandwidth loss at just 25%.
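The 25% figures fall straight out of the spec table arithmetic. As a quick sanity check, here is a sketch using the standard theoretical-peak formulas; the clocks and unit counts come from the table above, while the helper names are my own:

```python
# Back-of-the-envelope arithmetic for the GTX 670 -> GTX 660 Ti cuts.
# Clocks and unit counts are taken from the spec table; the formulas
# are the usual theoretical-peak approximations.

def gddr5_bandwidth_gbs(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective data rate x bus width in bytes."""
    return effective_clock_ghz * bus_width_bits / 8

def pixel_fillrate_gpix(core_clock_mhz: float, rops: int) -> float:
    """Peak pixel fill rate in Gpixels/s, assuming one pixel per ROP per clock."""
    return core_clock_mhz * rops / 1000

gtx670_bw   = gddr5_bandwidth_gbs(6.008, 256)  # ~192.3 GB/s
gtx660ti_bw = gddr5_bandwidth_gbs(6.008, 192)  # ~144.2 GB/s
print(f"bandwidth loss: {1 - gtx660ti_bw / gtx670_bw:.0%}")    # bandwidth loss: 25%

gtx670_fill   = pixel_fillrate_gpix(915, 32)   # ~29.3 Gpix/s
gtx660ti_fill = pixel_fillrate_gpix(915, 24)   # ~22.0 Gpix/s
print(f"fillrate loss: {1 - gtx660ti_fill / gtx670_fill:.0%}")  # fillrate loss: 25%
```

Because the memory clock stays at 6GHz, the bandwidth cut exactly tracks the bus-width cut; had NVIDIA also dropped to, say, 5GHz memory, the combined loss would have been closer to 38%.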

How this loss of render performance and memory bandwidth will play out is going to depend heavily on the task at hand. We’ve already seen GK104 struggle with a lack of memory bandwidth in games like Crysis, so coming from the GTX 670 this is only going to exacerbate that problem; a full 25% drop in performance is not out of the question here. However in games that are shader heavy (but not necessarily memory bandwidth heavy) like Portal 2, the GTX 660 Ti can hang very close to its more powerful sibling. There’s also the question of how NVIDIA’s nebulous asymmetrical memory bank design will impact performance, since 2GB of RAM doesn’t divide cleanly across 3 memory channels. All of these are questions we’ll have to turn to benchmarking to answer.
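NVIDIA has not documented how 2GB is arranged across three 64-bit memory channels, so the following is purely an illustrative sketch of one plausible layout: two channels carry 512MB each and one carries 1GB, so the first 1.5GB interleaves across all three channels at full speed while the final 512MB is served by a single channel at roughly one-third the bandwidth. The per-channel capacities and the mapping here are assumptions, not a disclosed design:

```python
# Hypothetical asymmetric layout for 2GB on a 192-bit (3 x 64-bit) bus.
# Per-channel capacities are assumed, not confirmed by NVIDIA.
CHANNELS_MB = [512, 512, 1024]

FULL_SPEED_GBS = 144.2  # all three channels interleaved (6.008GHz x 192-bit)

def channels_serving(offset_mb: int) -> int:
    """How many channels an access at this offset could stripe across."""
    interleaved_region_mb = 3 * min(CHANNELS_MB)  # 1536MB in this sketch
    return 3 if offset_mb < interleaved_region_mb else 1

for offset in (1024, 1600):
    n = channels_serving(offset)
    print(f"{offset}MB -> {n} channel(s), ~{FULL_SPEED_GBS * n / 3:.0f} GB/s")
```

If something like this is what NVIDIA is doing, games would only touch the slow region once their working set spills past 1.5GB, which is one more reason benchmarking is the only way to settle the question.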

The impact on power consumption on the other hand is relatively straightforward. With clocks identical to the GTX 670, power consumption has only been reduced marginally due to the disabling of the ROP cluster. NVIDIA’s official TDP is 150W, with a power target of 134W. This compares to a TDP of 170W and a power target of 141W for the GTX 670. Given the mechanisms at work in NVIDIA’s GPU Boost technology, it’s the power target that is a far better reflection of what to expect relative to the GTX 670. On paper this means that GK104 could probably be stuffed into a sub-150W card with some further functional units disabled, but in practice desktop GK104 GPUs are probably a bit too power hungry for that.
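To illustrate why the power target matters more than the TDP, here is a deliberately simplified sketch of a power-target control loop. The real GPU Boost algorithm is NVIDIA’s and unpublished; the 13MHz step is the commonly reported Kepler clock bin size, and the control logic itself is invented for illustration:

```python
# Toy power-target control loop in the spirit of GPU Boost.
# Illustrative only; the real algorithm, sampling, and binning are NVIDIA's.
BASE_CLOCK_MHZ = 915    # GTX 660 Ti base clock
BOOST_CLOCK_MHZ = 980   # rated boost clock
POWER_TARGET_W = 134    # GTX 660 Ti power target (GTX 670: 141W)
BIN_MHZ = 13            # assumed clock step per adjustment

def next_clock(measured_power_w: float, clock_mhz: int) -> int:
    """Step the clock up while under the power target, down when over it."""
    if measured_power_w < POWER_TARGET_W:
        return min(clock_mhz + BIN_MHZ, BOOST_CLOCK_MHZ)
    return max(clock_mhz - BIN_MHZ, BASE_CLOCK_MHZ)
```

Under this model a GTX 660 Ti and a GTX 670 running the same workload converge on similar clocks until the workload pushes past 134W, at which point the GTX 660 Ti starts shedding bins first; that is why the 7W power-target gap is a better predictor of real-world behavior than the 20W TDP gap.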

Moving on, this launch will be what NVIDIA calls a “virtual” launch, which is to say that there aren’t any reference cards being shipped to partners to sell or to press to sample. Instead all of NVIDIA’s partners will be launching with semi-custom and fully-custom cards right away. This means we’re going to see a wide variety of cards right off the bat, however it also means that there will be less consistency between partners since no two cards are going to be quite alike. For that reason we’ll be looking at a slightly wider selection of partner designs today, with cards from EVGA, Zotac, and Gigabyte occupying our charts.

As for the launch supply, with NVIDIA having licked their GK104 supply problems a couple of months ago the supply of GTX 660 Ti cards looks like it should be plentiful. Some cards are going to be more popular than others and for that reason we expect we’ll see some cards sell out, but at the end of the day there shouldn’t be any problem grabbing a GTX 660 Ti on today’s launch day.

Pricing for GTX 660 Ti cards will start at $299, continuing NVIDIA’s tidy hierarchy of a GeForce 600 card at every $100 price point. With the launch of the GTX 660 Ti NVIDIA will finally be able to start clearing out the GTX 570, a not-unwelcome development as the GTX 660 Ti brings with it the Kepler family features (NVENC, TXAA, GPU Boost, and D3D 11.1) along with nearly twice as much RAM and much lower power consumption. However this also means that despite the name, the GTX 660 Ti is a de facto replacement for the GTX 570 rather than the GTX 560 Ti. The sub-$250 market the GTX 560 Ti launched into will continue to be served by Fermi parts for the time being. NVIDIA will no doubt see quite a bit of success even at $300, but the GTX 660 Ti probably won’t be quite the hot item that the GTX 560 Ti was.

Meanwhile, for a limited time NVIDIA will be sweetening the deal by throwing in a copy of Borderlands 2 with all GTX 600 series cards as a GTX 660 Ti launch promotion. Borderlands 2 is the sequel to Gearbox’s 2009 FPS/RPG hybrid, and is a TWIMTBP game that will have PhysX support along with planned support for TXAA. Like their prior promotions this is being done through retailers in North America, so you will need to check and ensure your retailer is throwing in Borderlands 2 vouchers with any GTX 600 card you purchase.

On the marketing front, as a performance part the GTX 660 Ti is being positioned not only as an upgrade for 400/500 series owners, but also as an enticement for existing GTX 200 series owners. The GTX 660 Ti will be quite a bit faster than any GTX 200 series part (and cooler/quieter than all of them), the question being whether that’s enough to spur those owners to upgrade. NVIDIA did see a lot of success last year with the GTX 560 driving the retirement of the 8800 GT/9800 GT, so we’ll see how that goes.

Anyhow, as with the GTX 670 launch, virtually every partner is also launching one or more factory overclocked models, so the entire lineup of launch cards will fall between $299 and $339 or so. This price range puts NVIDIA and its partners smack-dab between AMD’s existing 7000 series cards, which have already been shifting in price somewhat due to the GTX 670 and the impending launch of the GTX 660 Ti. Reference-clocked cards will sit right between the $279 Radeon HD 7870 and the $329 Radeon HD 7950, which means that factory overclocked cards will be going head-to-head with the 7950.

On that note, with the launch of the GTX 660 Ti we can finally shed some further light on this week’s unexpected announcement of a new Radeon HD 7950 revision from AMD. As you’ll see in our benchmarks the existing 7950 maintains an uncomfortably slight lead over the GTX 660 Ti, which has spurred AMD to bump up the 7950’s clockspeeds at the cost of power consumption in order to keep it from ending up as a sub-$300 product. The new 7950B is still scheduled to show up at the end of this week, with AMD’s already-battered product launch credibility hanging in the balance.

For this review we’re going to include both the 7950 and 7950B in our results. We’re not at all happy with how AMD is handling this – it’s the kind of slimy thing that has already gotten NVIDIA in trouble in the past – and while we don’t want to reward such actions it would be remiss of us not to include it since it is a new reference part. And if AMD’s credibility is worth anything it will be on the shelves tomorrow anyhow.

Well, it depends on the samples. I bought a 660 Ti for my wife and tested it in my PC: at over a 1290MHz core clock (with boost), after 10-15 minutes of gaming in a game that doesn't even tax the GPU past 70%, the video card crashes and Windows tells me "the adapter has stopped responding".

Crysis 2 stutters on some levels but it's mainly stable 95% of the time, whereas my overclocked 7950 doesn't do this.

It would artifact in MSI Kombustor with a slight increase in voltage and a core clock above 1260MHz. Good thing it's for my wife and not me; she won't overclock, as it's more than enough for her mere 1080p resolution. The memory overclocks to 6.6GHz easily.

TXAA - AWESOME - THE JAGGIES ARE GONE. Thank you nVidia for having real technology development, unlike loser AMD. Thank you nVidia for being able to mix RAM chip sizes and distribute RAM chips across your memory controllers with proprietary technology that you keep secret despite AMD fanboys wanting to know how you do it so they can help AMD implement it for free. Thanks also for doing it so well: even with reviewers putting it down and claiming it can result in 48GB/s of bandwidth instead of 144GB/s, all the games and tests they have ever thrown at it in a desperate AMD-fanboy desire to find a chink in its armor have yielded ABSOLUTELY NOTHING - as in, YOU'VE DONE IT PERFECTLY AGAIN, nVidia. I just love the massive bias at this site. It must be their darn memory failing. Every time they make a crazy speculative attack on nVidia here and all their rabid research to find some fault produces a big fat goose egg, they try again anyway, and they talk like they'll eventually find something even though they never do. By the time they give up, they're off on some other notional, never-proven put-down of nVidia. 192-bit bus / 2GB RAM / unequal distribution / PERFECT PERFORMANCE IMPLEMENTATION. Get used to it.

ROFL... I should have just read more posts... might have saved me a crapload of typing, Cerise... LOL. Nah, it needs to be said by more than ONE person :) Call a spade a spade, people.

I tried to leave out the words BIAS and RYAN/Anandtech in the same sentence :)

But hold on a minute while I fire up my compute crap (or a 2008 game rendered moot by its own 2011+2012 hi-res patch equivalent) so I can run up my electric bill to prove the AMD card wins at something I never intend to use a gaming card for, or run at a res that these things aren't being used for by 98% of people. Folding? You must be kidding. Bitcoin hunting? LOL, that party was over ages ago - you won't pay for your card mining bitcoins today - it was over before Anandtech did their article on bitcoins - but I bet it helped sell some AMD cards. Quadro/FireGL cards are for this crap (computational, non-game stuff I mean). Recommending cards based on computational crap is pointless when they're for gaming.

I'm an AMD fanboy but ONLY at heart. My wallet wins all arguments regardless of my love for AMD (or my NV stock... LOL). I'm trying to hold out for AMD's next CPUs but I'm heavily leaning Ivy K for Black Friday, AMD fanboy love or not. They ruined their company by paying 3x the price for ATI, which in turn crapped on their stock and degraded the company to near junk-bond status (damn them, I used to be able to ride the rollercoaster and make money on AMD!). I'm still hoping for a trick up their sleeve nobody knows about, but I think they're just holding back CPUs to clear shelves - nothing special in the new ones coming. Basically a Sandy-to-Ivy upgrade but on AMD's side for bullsnozer. The problem is it's still going to be behind Ivy by 25-50% (in some cases far worse). Unless it's an EXCEPTIONAL price I can't help but pick Ivy, as I do a lot of rar/par stuff and of course gaming. I'd get hurt way too much by following my heart this round (I had to take a Xeon E3110 on s775 last time for the same reason).

My planned Black Friday upgrade looks like: X motherboard (too early for a pick or for homework, not knowing AMD yet), an Ivy 3770K (likely), and a 660 Ti with the highest default clock I can get at a Black Friday price :) (meaning $299 or under for Zotac AMP speeds or better). I already have 16GB of DDR3 waiting here... LOL. I ordered it ages ago, figuring it's going to go through the roof at some point (Win8? Crappy as it is, IMHO). I'm only down $10 so far after purchasing the memory, I think in Jan or so... LOL. In the end I think I'll be up $30-80 at some point (I only paid $75 for 16GB). Got my dad taken care of too; we're both just waiting on Black Friday and all this 28nm vid card stuff to sort itself out. End of Nov should have some better TSMC cards available (or another fab's chips?). I'm guessing a ton at high clocks by then for under $299.

Anyway, THANKS for the good laugh :) I needed that after reading my 4th asinine review. Guru3d is looking up for the 5th though... LOL. He doesn't seem to care who wins, and caters more to the wallet it seems (great OC stuff there too). He usually doesn't have a ton of cards or chips in each review though, so you have to read more than one product review there to get the picture, but they're good reviews. Hilbert Hagedoorn (sp?) does pretty dang well. By the end of it, I'll have hit everyone I think (worth mentioning: techreport, hardocp, ixbtlabs, hexus, etc. - sorry if I left a good one out, guys). I seem to read 10+ these days before parting with cash. :( I like hardocp for a different idea of benchmarking: he benches and states the HIGHEST PLAYABLE SETTINGS per card. It's a good change IMHO, though I still require all the other reviews for more games etc. I'm just sure to hit him for vid card reviews, just for the settings I can expect to get away with in a few games. I wish guru3d had thrown an OC'd 660 Ti into the 7950 Boost review since they're so easily had clocked high at $299/309. But one more read gets that picture, or it can be drawn from all the asinine reviews plus his 7950 Boost review... LOL. I have to get through the rest of guru3d, then off to hardocp for the different angle :) Ahh, weekend geek reading galore with two new GPU cards out this week ;)

(II) Those same after-market 7950s hit 1100-1200MHz on 1.175V or less in our forum. At those speeds, the HD7950 > GTX680/HD7970 GHz Edition. How is that for value at $320-330?

The review didn't take into account that you can get much better 7950 cards that overclock 30-50%, and yet the same review took after-market 660 Tis and used their coolers in the noise testing and overclocking sections against a reference-based 7950.