Well, it depends on the samples. The 660 Ti I bought for my wife, I tested in my PC, and at over 1290 core clock (with boost), after 10-15 minutes of gaming in a game that doesn't even tax the GPU past 70%, the video card crashes and Windows tells me ''the adapter has stopped responding''.

Crysis 2 stutters on some levels but it's mostly stable, 95% of the time, whereas my overclocked 7950 doesn't do this.

It would artifact in MSI Kombustor with a slight increase in voltage and a core clock above 1260. Good thing it's for my wife and not me; she won't overclock, as it's more than enough for her mere 1080p resolution. The memory overclocks to 6.6GHz easily.

TXAA - AWESOME - THE JAGGIES ARE GONE. Thank you nVidia for having real technology development, unlike loser AMD.

Thank you nVidia for being able to mix RAM chip sizes and distribute RAM chips across your memory controllers with proprietary technology that you keep secret, despite AMD fanboys desiring to know how you do it so they can help AMD implement it for free. Thanks also for doing it so well: even with reviewers putting it down and claiming it can result in 48GB/s bandwidth instead of 144GB/s, all the games and tests they have ever thrown at it in a desperate AMD-fanboy desire to find a chink in its armor have yielded ABSOLUTELY NOTHING. As in, YOU'VE DONE IT PERFECTLY AGAIN, nVidia.

I just love the massive bias at this site. It must be their darn memory failing. Every time they make a crazy speculative attack here on nVidia, and all their rabid research to find some fault produces a big fat goose egg, they try again anyway, and they talk like they'll eventually find something even though they never do. By the time they give up, they're off on some other notional, never-proven put-down of nVidia.

192-bit bus / 2GB RAM / unequal distribution / PERFECT PERFORMANCE IMPLEMENTATION. Get used to it.

ROFL... I should have just read more posts... might have saved me a crapload of typing, Cerise... LOL. Nah, it needs to be said by more than ONE person :) Call a spade a spade, people.

I tried to leave out the word BIAS and RYAN/Anandtech in the same sentence :)

But hold on a minute while I fire up my compute crap (or a 2008 game rendered moot by its own 2011/2012 hi-res patch equivalent) so I can run up my electric bill proving the AMD card wins in something I never intend to use a gaming card for, or run at a res that 98% of people don't use these things at. Folding? You must be kidding. Bitcoin hunting? LOL, that party was over ages ago - you won't pay for your card mining bitcoins today - it was over before AnandTech even did their article on bitcoins, but I bet it helped sell some AMD cards. Quadro and FireGL cards are for this crap (computational NON-game stuff, I mean). Recommending cards based on computational workloads is pointless when they're for gaming.

I'm an AMD fanboy but ONLY at heart. My wallet wins all arguments regardless of my love for AMD (or my NV stock... LOL). I'm trying to hold out for AMD's next CPUs but I'm heavily leaning toward an Ivy K for Black Friday, AMD fanboy love or not. They ruined their company by paying 3x the price for ATI, which in turn crapped on their stock and degraded the company to near junk-bond status (damn them, I used to be able to ride the rollercoaster and make money on AMD!). I'm still hoping for a trick up their sleeve nobody knows about, but I think they're just holding back CPUs to clear shelves; nothing special in the new ones coming. Basically a Sandy-to-Ivy upgrade but on AMD's side for Bullsnozer. The problem is it's still going to be behind Ivy by 25-50% (in some cases far worse). Unless it's an EXCEPTIONAL price I can't help but pick Ivy, as I do a lot of rar/par stuff and of course gaming. I'd get hurt way too much by following my heart this round (I had to take a Xeon E3110 on s775 last time for the same reason).

My planned Black Friday upgrade looks like: X motherboard (too early for a pick, haven't done the homework not knowing AMD yet), Ivy 3770K (likely), and a 660 Ti with the highest default clock I can get at a Black Friday price :) (meaning $299 or under for Zotac AMP speeds or better). I already have 16GB of DDR3 waiting here... LOL. I ordered it ages ago, figuring it's going to go through the roof at some point (Win8? Crappy as it is, IMHO). I'm only down $10 so far after purchasing the memory, I think back in January or so... LOL. In the end I think I'll be up $30-80 at some point (I only paid $75 for 16GB). Got my dad taken care of too; we're both just waiting on Black Friday and all this 28nm vid card stuff to sort out. End of November should have some better TSMC cards available (or another fab's chips?). I'm guessing a ton at high clocks by then for under $299.

Anyway, THANKS for the good laugh :) I needed that after reading my 4th asinine review. Guru3D is looking up for the 5th though... LOL. He doesn't seem to care who wins, and caters more to the wallet it seems (great OC stuff there too). He usually doesn't have a ton of cards or chips in each review, so you have to read more than one product review there to get the picture, but they're good reviews. Hilbert Hagedoorn (sp?) does pretty dang good work.

By the end of it I'll have hit everyone worth mentioning, I think (TechReport, HardOCP, iXBT Labs, Hexus, etc. - sorry if I left a good one out, guys). I seem to read 10+ these days before parting with cash. :( I like HardOCP for a different take on benchmarking: they bench and state the HIGHEST PLAYABLE SETTINGS per card. It's a good change, IMHO, though I still need all the other reviews for more games etc. I just make sure to hit them for video card reviews to see the settings I can expect to get away with in a few games.

I wish Guru3D had thrown an OC'd 660 Ti into the 7950 Boost review, since they're so easily had clocked high at $299/309. But one more read gets that picture, or it can be drawn from all the asinine reviews plus his 7950 Boost review... LOL. I have to get through the rest of Guru3D, then off to HardOCP for the different angle :) Ahh, weekend geek reading galore with two new GPU cards out this week ;)

(II) Those same after-market 7950s hit 1100-1200MHz at 1.175V or less in our forum. At those speeds, the HD 7950 > GTX 680/HD 7970 GHz Edition. How is that for value at $320-330?

The review didn't take into account that you can get much better 7950 cards that overclock 30-50%, yet the same review took after-market 660 Tis and used their coolers for the noise testing and overclocking sections against a reference-based 7950.

LOL... I can't read your language, and I'm unsure of any of the card speeds etc. in that link.

You're comparing something you can do on your own, NOT out of the box. And I already proved you can easily hit ridiculous speeds with the 660 Ti.

So how do I know I'll get a specially binned chip before buying, like your forum claims (again, most of us can't read your language)? And again, I'd say it's not out of the box at those speeds. There is a ref card in GREEN for the 660 Ti, or did you miss that?

1150MHz? http://www.guru3d.com/article/radeon-hd-7950-overc... I know, exactly what this article runs at... LOL. Only using 79 watts more to get it done, and he NEEDED 1.25V to do it. There is a reason AMD has this as the default on the BOOST (not all chips can easily do it... they're not purposefully running hot and overvolted, ya know... they have to in order to get more chips to make the cut!). No amount of cooling will save you money on your electric bill. Your magical 1150MHz is examined in great detail in that article, with caveats about how long your card's life may be... LOL. OC at your own risk. Firmware on the 600 series cards makes that kind of dangerous overvolting impossible. Roll your own dice, thanks.

That's a Feb 2012 article; there aren't some magical binned versions of these chips YOU can guarantee I'll get. Not all chips are created equal, and no manufacturer is guaranteeing your speeds or anywhere near them. Point me to your magically binned, advertised chips? I can't see them on Newegg. You must have some magical website I didn't know I could buy from. Enlighten me, please.
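To put the "79 watts more" figure from the quoted overclocking article in electric-bill terms, here is a rough sketch. The 79 W delta comes from the article cited above; the $0.12/kWh rate and 4 hours/day of gaming are hypothetical assumptions for illustration only.

```python
# Rough yearly electricity cost of a ~79 W overclock power overhead.
# 79 W is the delta quoted above; the rate and hours are ASSUMED values.
extra_watts = 79
hours_per_day = 4        # assumed gaming time per day
rate_per_kwh = 0.12      # assumed electricity rate, $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * rate_per_kwh

print(f"{extra_kwh_per_year:.1f} kWh/yr, ${extra_cost_per_year:.2f}/yr")
```

Under those assumptions the overhead works out to roughly 115 kWh and about $14 per year; heavier use or pricier electricity (like an AZ summer) scales it up proportionally.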

Crysis 1 and Warhead? I already debunked their relevance, but here it is if you missed it: games based on CryEngine 2? 7 total. CryEngine 3? Check out the list, including Crysis 2 and the coming Crysis 3: https://en.wikipedia.org/wiki/CryEngine

Crysis and Warhead (March 2008 for Warhead, earlier for Crysis) are NOT relevant; there are only 5 other games made on that engine. Point me to some benchmarks showing something I can read (#1) and where I can actually see the test setup (#2). Until then all your benchmarks are meaningless. Also, don't bother showing me anything over 1920x1200 and claiming victory, as I already debunked that as less than 2% market share according to the steampowered.com hardware survey, AND more importantly, no 24in-or-below monitor sold on Newegg has a native res above 1920x1200! 68 monitors, without even ONE above 1920x1200 recommended.

I already showed the Crysis 2 7950 Boost review vs. the ref 660 Ti being a wash even at 2560x1920 (a useless res anyway). If you have to force the 660 Ti into something we'd never run to show a victory, your results are pointless.

I'm sorry, does MSI sell your TwinFrozr at 1150 core out of the box? I must have missed that version on Newegg. Value at $320? Not out of the box, and I can do the same thing for $299 on a 660 Ti if I'm going to be overclocking myself, and they're guaranteed out of the box at 100MHz over on both core and boost, as shown before. Also I can't damage it (built into the 600 series; the firmware won't let me do a dangerous overclock that shortens its life). The only two 7950 cards at $319 on Newegg are (#1) AFTER rebate and (#2) only clocked at 800MHz. They're not going to spend on quality components to HELP your overclocking and chip life at that price.

Quite a few of the overclocked 660s are SILENT in use: "For the card in a fully stressed status (in-game) 39 dBA, now that simply is silent. So if you do not tweak the card or something, during gaming you can expect a silent card." http://www.guru3d.com/article/evga-geforce-gtx-660... And that one kind of sucks, Russian: http://www.guru3d.com/article/evga-geforce-gtx-660... All of the 660 Tis out of the box are on the heels of the 7970 in Anno 2070.

But I know, if you can get a magically binned chip, you might be able to hit a speed that makes the 7970 look like crap for bang/buck, at a speed not warrantied out of the box. But I can get almost-7970 performance doing nothing, with no worries, for less $$. So what's your point? :) Note 3 of the 660s beat the GTX 670 out of the box... LOL. You got some version where the 7950 beats a 7970 out of the box for $299? I know I "CAN" get lucky, but there's no guarantee at 1.175V like you say. Or do you think AMD is just stupid and clocks all the Boost versions (that aren't out yet) at 1.25V for nothing?
A hell of a lot of them WON'T hit boost reliably without 1.25V, or AMD wouldn't be doing it and purposely making their cards look like shite in reviews. How dumb do you think AMD is? It's a reference spec for a reason.

Ryan's review of the 7970 GHz Edition notes that NV cards' shipped clocks may not be the highest you'll get even at default out of the box; they're only the guaranteed minimum (based on the TDP, they can perform better): "Every 7970GE can hit 1050MHz and every 7970GE tops out at 1050MHz. This is as opposed to NVIDIA's GPU Boost, where every card can hit at least the boost clock but there will be some variation in the top clock." http://www.anandtech.com/show/6025/radeon-hd-7970-...

So out of the box all Radeon cards perform the same (only the watts used vary), but NV cards can go higher than the rated boost speeds even at stock... LOL. Same article: "With that said there's no such thing as a free lunch, and in practice the 7970GE's power consumption has still increased relative to the 7970, as we'll see in our benchmarks." These chips aren't special, or the wattage wouldn't climb at all. Your magical 7950 isn't special either.

Skyrim at 1920x1200, same article: the GTX 580/670/680 are CPU-limited and BEATING the 7970 GHz Edition 98fps to 86fps, with 4xMSAA/16xAF! Note no improvement from the 7970 to the 7970 GHz Edition (neither shows a difference at the useless 2560x1600 res either... so overclocking to GHz Edition speeds didn't improve the scores over the regular 7970 at either res... LOL).

I can hear you say, that's not the 660 Ti. Got me: http://www.anandtech.com/show/6159/the-geforce-gtx... What's that? At 1920x1200 (and below), the res of 98% of users, at 4xMSAA/16xAF, all 660 Tis beat the 7970? But the 660 Ti CRUMBLES at MSAA, you said... LOL. Whatever, dude. I can keep going... The 7950/7950B/7970/7970 GHz all score the same at 1920x1200. You'll have to check both articles to get the 7970 GHz Edition numbers, as Ryan conveniently left it out of the benchmarks in the 660 Ti review... LOL. Gee, why? Because it got nothing here too? Including Warhead instead of Crysis 2 with the HD texture and enhancement pack? A 2008 game vs. a 2012 engine that has a crapload more games based on CryEngine 3?
Only 7 games are on CryEngine 2 (and 2 of them are Crysis 1 and Warhead... LOL). Check all the 1920x1200 scores (sorry, I already proved 2% or less run above this, and most of those run OVER 2560x1600, usually with more than one card); Anand's games (as everywhere else) are maxed out at every res. You can't turn anything else on to help your cards. :)

Shogun, 660s beating the 7970: "Overall this has become a fairly NVIDIA-friendly benchmark, with the GTX 660 Ti challenging even the 7970 at 1920." Challenging, Ryan? Every 660 Ti but the reference beats the 7970 (which arguably NOBODY on Newegg even SELLS; most are clocked much faster, e.g. the MSI N660 at 1019/1097 boost, far higher than the 915 ref, $299 since launch)... But again he draws his conclusion based on 2560x1600, when by his own words just below the first benchmark (worthless Warhead from 2008 instead of Crysis 2 maxed out) these cards are designed for 1920x1200/1080!

Dirt 3: tied with the 7950 at 1920x1200, but again, biased Ryan (?): "while the GTX 660 Ti falls behind at 2560 as it runs out of memory bandwidth." WHO FREAKING CARES what happens where 2% or less of us run, and at a res beyond, by your own words, "For a $300 performance card the most important resolution is typically going to be 1920x1080/1200". TYPICALLY?... ROFL. It should say 98% of users run this or BELOW (actually only 29% even use these two). He goes further in Dirt too... LOL: "Looking at the minimum framerates that 660/7950 standoff finally breaks in NVIDIA's favor. Though a lack of memory bandwidth continues to pose a problem at 2560." Yeah, I know, because you've beaten it like a dead horse as much as you can: it runs out again where NONE OF US RUN. Damn, as I read the review there is nothing left to say except that Ryan is getting some cash from AMD :) Jeez, twice on the same freaking page about the 2560 stuff.

Sorry Ryan, AMD lost this round at 1920x1200 (or below), where 98% of us run, and nothing you say about 2560 changes the world for that 98%, where it's either a WASH or a dominating victory for the 660 (heck, for all 600-series cards; you can argue none are for above these resolutions in single-card setups - 98% of users, no matter what they have, GTX 690/680/670/660 etc., run 1920x1200 or below, and no 24in monitor on Newegg is ABOVE THIS). Who are you advertising for?

Portal 2 - LANDSLIDE (even at the useless 2560 it beats the 7970 by >25%, never mind the far slower 7950 here... LOL), 45% faster than the 7950 at 1920x1200... ROFL. Guess you'd better have a magical card OC'd 35-45% just to catch the out-of-the-box 660 here... ROFL. "If NVIDIA could do this on more games then they'd be in an excellent position." Umm... they did, Ryan; just quit looking at the res only 2% of us use. He points out the 660 can handle SSAA here (more taxing than MSAA, Russian!!), so they concentrated on it.

Google SSAA vs. MSAA and you'll find stuff like this: "SSAA theoretically AA's the whole screen and would give a much more consistent AA. MSAA simply is limited to edges." and "Of course, there's a reason why people don't use SSAA: it costs a fortune". Tougher... yet smoking on the 660... Things like this are about the GPU/shaders etc., NOT the memory bandwidth Ryan beats like a dead horse at 2560x1600.

Battlefield 3, same AnandTech article, 1920x1200 4xMSAA? LANDSLIDE, ALL 660s WINNING vs. the 7970! http://www.anandtech.com/show/6159/the-geforce-gtx... Pay attention to what Ryan is doing here, people. 17% faster than the 7950B! FXAA is worse, same page: "At 1920 with FXAA that means the GTX 660 Ti has a huge 30% performance lead over the 7950, and even the 7970 falls behind the GTX 660 Ti." That's for the GREEN REF bar that NOBODY sells, as already noted; AMP speeds are almost had for $299. So really it's more like >34% faster than the 7950B, not just vs. the regular 7950. OH, and it IS playable at MSAA then, despite Ryan calling it disappointing for the REF version... LOL.
I know, and nobody will buy the ref (by accident, maybe?), so for the rest of us, MSAA is OK, with Battlefield dropping to minimums of ~30fps. Just keep getting those digs in where you can... :)

Sorry Russian, I'd like to destroy more of your data, but the rest aren't benchmarked here, and I can't be bothered to do more than I already have to prove you wrong on the "crushed" comments... :) I think I proved my point.

Sniper Elite V2? What's that? Sell much? http://www.metacritic.com/game/pc/sniper-elite-v2 Nope, Metacritic score 65... I wouldn't even pirate a yellow game (under a 74 score it's pretty unanimously not so good), let alone pay for it or care about its benchmark. Too many shooters at 80+ scores. Dirt Showdown is based on the same engine as the Dirt 3 tested here; if it performs worse, I'm thinking it's a driver issue... but never mind: http://www.metacritic.com/game/pc/dirt-showdown Score 72, user score 4.8 out of 10... ouch. GameSpy quote: "DiRT: Showdown delivers bargain-basement entertainment value for the high, high price of $50. With its neutered physics, limited driving venues, clunky multiplayer, and diminished off-road racing options, discerning arcade racing fans should just write this one off as an unanticipated pothole in Codemaster's trailblazing DiRT series." Keep quoting useless games if you'd like, though. :) Sorry, their review is front-page at Metacritic :)

I'd like to see some Bulletstorm, Alan Wake, and Serious Sam 3 benches, so please, LINKS? Hardware sites only, please; I'd prefer review sites rather than a forum. People like Ryan will slowly become useless with too many reviews like this one. Forum users have no such worries and can easily post anything they want. A forum is a good addition when I have questions after the regular reviews, but I wouldn't want to base my purchase on forum posters' benchmarks. Note I'm not posting my OWN benchmarks here after doing who-knows-what to my 660 Ti (which I don't even own yet... LOL); I'm pointing to results from review sites using stuff we actually BUY.

Personally, I'm buying this card (and my 22nm quad soon) to run it below default, to soundly beat my Radeon 5850 but without driving me out of my AZ computer room. The quad should give me a great boost too at 3GHz (a 3770K downclocked) vs. my current heater of an E3110 3GHz dual core. Sounds crazy, I know... but this week it hit 114F outside! I have a great CPU that can easily clock to 3.6GHz, Prime95-stable below the 1.25-1.35V default of regular E8400s! The default for my chip is 1.08V and it boots well below that, stable at 3GHz... so that gives you an idea of the heat in AZ and its effect on even what I'd call one of the best E8400 3GHz Wolfdales in the world. I can't beat the heat even with my bada$$ cool E3110 (a Xeon on s775 purposely bought for better thermals). The electric bill is already $250 here, so reducing temps is kind of up to my PC itself :) You can bet I went through a bunch of places to pick my specific week/lot/country of origin on that baby :) I'm no stranger to OCing, but I don't think people will all rush home with their shiny new $330 7950 (not Boost) and OC the heck out of it to beat a Ti that runs cooler by default and OCs out of the box at 0.987V. ONE more dig at Ryan... LOL:

" As you’ll see in our benchmarks the existing 7950 maintains an uncomfortably slight lead over the GTX 660 Ti, which has spurred on AMD to bump up the 7950’s clockspeeds at the cost of power consumption in order to avoid having it end up as a sub-$300 product. The new 7950B is still scheduled to show up at the end of this week, with AMD’s already-battered product launch credibility hanging in the balance."

Umm... must be looking at those 2560x1600 2%-of-users benchmarks again, eh? At the res we all play at, and that all 68 24in monitors on Newegg are at (1920x1200/1080), WHAT LEAD? In that 2008 Warhead game? Debunked already. The rest: at best the race is a wash (and not often; pretty much just Metro 2033, ~4% faster for the 7950 Boost), and the rest are landslides, at times landslides vs. even the 7970 GHz Edition. You got the AMD part right though... only two makers actually announced the Boost. None seem to even care, as they already sell tons of 900MHz cards, which is faster than the 850 boost - and you should have added that here, since you wouldn't buy a Boost when you can get a 900 for likely less... LOL. Just like NV cards, which basically come OC'd no matter what you buy at Newegg... Review what we buy; who cares what AMD/NV want?

Still confused about the conclusion you SHOULD have stated, Ryan? The 660 Ti rocks and is a no-brainer for everyone using 24in and below (98%). For the other 2%... LOL. Whatever. Go ahead and remain confused about that... WE DON'T CARE. You can't be this dumb (I hope not), so I'll give you the benefit of the doubt and run with it being bias.

Jeez, I just had to check real quick at 27in... ROFL. http://www.newegg.com/Product/ProductList.aspx?Sub... Check the recommended resolutions on the left side, people. 41 at 1920x1080! Only 11 others, and they are not even 2560x1600 - they are 2560x1440! My god man, you aren't even right at 27in! OK, now I think you're just a freaking moron. Still confused, Ryan? Anand, you there? Still care about your site? Let me know if you need a new reviewer :) It's borderline ridiculous that the conclusion doesn't hold even at 27 inches: only 20% of the 27-inchers on Newegg recommend a res above 1920x1200; the other 80% recommend LOWER than the tested 1920x1200... ROFLMAO. NOT a single 27in has a recommended resolution of 2560x1600 (it's only 1440... less stressful). I digress... for now... ROFL.

Good calculation there, logical, but still we shouldn't forget nVidia's advantage. Not everyone overclocks like enthusiasts do, and for anyone not fiddling with clocks and voltages, Nvidia is the clear choice. An overclocked 7950 might be better, but even with an aftermarket cooler it will need a well-ventilated case, and Crossfire will mean lowering your overclock unless you watercool them...

And most people don't own a superclocked CPU to get rid of the bottleneck it might cause. So Nvidia, scaling better with lower-frequency CPUs, performs better at 1080p, where lots of games are simply CPU-limited unless you've got a beast at 5GHz.

It's not a good calculation by the AMD fanboy - I went to his forum link, then to the review he linked, and saw the 660 Ti SMASHING his $350+ 7950 Black Edition to bits. He lists one game, then a bunch of power and heat charts, and goes on a PR selling spree... boy, it's amazing... talk about obsessed fanboyism...

Okay, so: an expensive 7950, or an aftermarket heatsink, or water cooling, an expensive positive-airflow case with lots of fans, then a healthy PSU for the extra voltage, then endless twiddling and driver hacks for stability. So +$50 on the cooler or card, $50 or $100 extra on the case and fans, then add $100 for the CPU to be able to take advantage... After all that dickering around and all those dollars, just AMD-fanboy out and buy a rear-exhaust 7970 and be cheaper and somewhat stable at stock. Right? I mean, WTH.

Then we have the less-smooth gameplay problem on the 78xx/79xx series vs. nVidia - plus you don't have adaptive v-sync, another SMOOTH OPERATOR addition. These are just a few reasons why the AMD prices have plummeted. I suppose if you go AMD now you shouldn't worry much about losing a lot of value quickly, but for 8 months we took a giant hit in the wallet for buying AMD; now our cards are worth CRAP compared to what we paid for them a short time ago, with 6+ months of crap driver support. It's great - yeah, just great - AMD did such a great job.

Jesus Christ, TheJian, you wrote a goddam Russian novel when you could have just come out and merely said that you want to have Jen Hsun's baby, you silly nvidiot.

Nvidia has simply pushed the default clocks on their cards much harder than AMD. So what? So AMD leaves more o/c'ing performance on the table. Big deal; that's hardly a decisive knock-out blow for nvidia. As a matter of fact, I'm selling my Gigabyte Windforce GTX670 2GB tomorrow (gorgeous cooler setup by the way, and utterly silent) because, for $400, it only beats a 7870 by about 3-5FPS on my 3-monitor setup (5040x1050 resolution) in most games, and the thought that I could buy a 7870 for $240, or a Gigabyte 7950 with 3GB of memory for $300, made me ill. Long and short of it: if you're playing at 1920x1080, the GTX660 Ti looks pretty good (except in those AMD-optimized games), but if you're running 2560x1080 or higher, AMD's 3GB-equipped 7950 is going to have the extra memory and muscle to keep your minimum frame rates playable, while the 2GB GTX660 Ti is going to choke.

Besides, I'm sick of nvidia's shitty 3-monitor driver support. Every time I update the video driver, I have to perform brain surgery to get 2 of the monitors to come back up again. On the other hand, the Asus DirectCU II (another outstanding cooler) 7850 I had temporarily for a few days about 2 months ago drove my 3-monitor setup instantly, and setup took about 2 minutes. The AMD driver even 'guessed' the bezel compensation accurately the first time, and played Diablo III at a solid 60FPS at 5040x1050 on one card with all the quality settings at maximum. That card now costs $189.99 after factory rebate here in Canada:

Most people, if you'd read all that, don't run over 1920x1200. The amount who do is <2%, and you have to spend a lot to do it reliably over 30fps, as HardOCP showed. You made my point: it's great where 98% of us use it, which is pretty much what the walls of text have been saying :) I won't apologize for being complete ;) But feel free to call me wordy; I'll accept it. Out of the 5 games HardOCP tested, they found 2 (Batman/Witcher 2) where the 7950 hit 10-15fps (for a while) and 16fps. You'd have to double that to have a good time in those games, which was my point. These cards are for lower res, specifically 1920x1200 or less, at which both do a great job. No disputing that.

People usually resort to calling you an nvidiot, or saying you want to have the CEO's baby (really?... I'm a dude), when they lack an effective opposing argument. Thanks for both. Look in the mirror. ;)

The GTX 660 Ti seems like a good "bang for your buck" card. NVidia should count itself lucky to have trouble keeping up with demand. My worry is they'll lose focus with the number of markets they're trying to fill - something I'm sure AMD will be watching for.

They're not losing focus; it's a new strategy, and it seems to be working wonders. Instead of releasing new products as quickly as possible and filling the market with parts from low- to high-end, they get the new higher-end parts out and rely on their last-gen cards to fill the holes.

That clears out the shelves so dealers don't get stuck with older technology that isn't selling, and at the same time it doesn't tax the new fabrication process (28nm in this case) by needing a lot more chips to fill demand at every tier.

If they had released this at $249 they would never have been able to supply the demand... why not just go for AMD's jugular? Oh yeah, balance and perceived value in the market - which only hurts us, really.

If they can't supply it, it can't lower competitor prices and can't be bought, so they make little or no money, and everyone else buys the available competitor's product. Why doesn't AMD release a card that drives down the 680's price by $170 per card and makes nVidia give away 3 free games with it too? That would make too much sense for AMD and us consumers, plus some competition that crushes evil corporate profiteering nVidia, so AMD should do it. (roll eyes)

To answer your question: nVidia is being nice by not draining all the red blood from AMD's jugular, since AMD is bleeding out so badly already that if nVidia took them out, a million raging 3D fanboys would scream for billions in payola in a giant lawsuit they'd protest for in front of the UN, the IMF, the International Court, and the 25,000 traveling unelected EURO power bureaucrats. So instead of all that terribleness and making AMD fans cry, nVidia is nice about it.

First and foremost, it's 7 AM. Newegg doesn't always post stock updates this early, so GTX 660 Ti cards may not show up until a bit later in the day, though EVGA already has their cards up on their own site.

As for our concerns about launch availability, it's the difference between what is being claimed and what is being delivered. NVIDIA told us right from the start that the supply of the GTX 680 would be tight, and that's exactly what happened. AMD told us that the 7970GE would be available in late June, and that did not happen.

Mind you, AMD isn't having supply issues either. The 7970GE wasn't late because AMD was having any kind of trouble supplying partners with suitable GPUs.

I got badly bitten this generation by the classic "new products are just around the corner" conundrum. I wanted to upgrade exactly this time last year but didn't because I wanted the new and more efficient 28nm AMD cards that were supposedly just around the corner. Instead, they came late and were a huge letdown: too expensive and not the kind of performance boost I was expecting. Then NVIDIA countered but only focused on the high-end for all these months. However it did seem that NVIDIA had the better, more efficient product. So I was waiting for their card in the $200-300 range and finally after a whole year it's here.

The price is a bit high, but I'm tired of waiting and will probably jump on this. I like the cool and silent operation of the Gigabyte but Ryan speaks highly of the ZOTAC as well, not sure which to get.

I have to disagree... they were not a letdown at all; price seemed to be the only major complaint people had. And in a way AMD has been on the ball: they're a step ahead of Nvidia in getting next-gen products out and a half step behind in performance. To me that seems pretty good.

Yeah, sure there, rarson, it wasn't AMD, in a fit of corporate piggery and immense greed, scalping the crap out of us and abandoning the gaming community for as long as they could possibly keep selling their junk at hugely inflated prices..... Oh wait.... it was.

And would you stop exaggerating with your conspiracy theories and your overpriced-SHIT language all over the place. If we're talking pricing stupidity, what hurt the most in the past was probably buying an i7-980X CPU for $1,100 and then, one week later, Sandy Bridge comes out and tramples it at the $220-300 price points with the i5-2500K and i7-2600K... talk about inflated prices for pieces of hardware that can't cost even $30 to manufacture.

Nvidia, like any other company, isn't being nicer; it's still all about maximum profit. They probably sell video cards that cost $20 to manufacture, and believe me, the profit doesn't go into funding environmental projects to get rid of all the electronic waste we create every year.

They pay developers, like any other company, to get games optimized for their hardware, and what's happening now reduces our choices depending on the games we play. They still push reviewers to make sure their products show the best performance in reviews. That's, to me, lying to the consumer (US) so they sell more stuff. Even if their products are good (which they are), that's still manipulating opinion; they don't work for the people, they work for the profit.

So... please, stop speaking about Nvidia as if everything they do had been decided by god beforehand and were perfect in every way... it doesn't work that way. You won't be a better or happier human if you play more games thanks to the performance of your video card; you'll just end up more addicted, and that's it.

Attacking and attacking again; your level of respect is almost admirable. Respect is the most important thing in the world, and if you can't show some even to people you don't know, I'm sorry, but you're missing out on something here.

Every review shows the 660 Ti under EVEN the 7870, and your review shows the 660 Ti performing at the level of a 7970: flawed bullscrap. Your website has a problem, the same one you have: a chosen side, aka fanboyism.

I have both right now: my wife uses the 660 Ti in her PC for Guild Wars 2 at 1080p, and I bought the 7950. I overclocked both in my PC to test, and the 7950 hands down tramples the GTX 660 Ti, even with both fully overclocked. I tested with Skyrim on 3 monitors at 5760x1080, and that's the only game I play.

Now don't get MAD, I never said the GTX 660 Ti is a bad card; it works wonders. But it gets trampled at 5760x1080 in Skyrim, end of the line...

Actually I think they need to raise the clocks and charge more, accepting that the cards will run hotter and use more watts. At least they could get more for the product, rather than having people say you can OC them to 1100. Clock the normal cards at 900/1000 and the 7970 at 1050/1100 or so, then charge more. Of course NV is putting pricing pressure on them at the same time, but this move would make the cards worth more out of the box, so the prices wouldn't seem as unreasonable. As it is, out of the box right now, they can't charge more because the cards perform so poorly against what is being sold (and benchmarked) in the stores.

With NV/Intel chewing them from both ends, AMD isn't making money. But I think that's their own fault, with the MHz/pricing they're doing to themselves. They haven't ripped us off since the Athlon won for 3 years straight, and even then they weren't getting real rich, just making the profits they deserved. Check their 10-year profit summary and you'll see they have lost $6 billion. So I'd have to say they are NOT pricing/clocking their chips correctly, at least for this generation. These guys need to start making more money or they're going to be in bankruptcy by Christmas 2014. Last 12 months: sales $6.38 billion, profits NEGATIVE $629 million! They aren't gouging us... they are losing their collective a$$es :( http://investing.money.msn.com/investments/stock-p... That's a LOSS of $629 million. Go back 10 years and it's about a $6.x billion loss.

While I hate the way Ryan did his review, AMD needs all the help they can get, I guess... :) But Ryan needs to redo his recommendation (or lack of one), because he just looks like a buffoon when almost no monitors sell at 2560x1600 (30-inchers? only 11 models, and fewer at that res), and steampowered.com shows less than 2% of users run that res as well. He looks foolish at best not basing his recommendation on the 1920x1200 results, which 98% of us use. He also needs to admit that Warhead is from 2008 and that he should have used Crysis 2, which runs on an engine with 27 games based on it, instead of CryEngine 2 from 2007 with only 7 games based on it. It's useless.

You talk as if overcoming Intel's and Nvidia's performance were easy and it's all their fault because they do bad work. AMD has a wonderful team; you talk as if you had worked there and they don't do shit, they just sit in their chairs and that's the result of their work.

Well, it isn't. If you wanna talk about AMD like that, do it if you work there. No one is better placed to say whether a company is really good or bad than the employees themselves. So just stop talking as if designing these 3-billion-plus-transistor things were as easy as saying "hello, my name is Nvidia fanboy and AMD is crap."

Too late, Cerise: you lost all credibility by not being able to hold an objective opinion (that means one undistorted by emotions), and you proved instead that you're way too emotional to talk about video card manufacturers.

You too talk as if you had worked at AMD, and surely that's not the case; just visiting their headquarters would make your eyes bleed, because in your world that place is related to hell, with an ambient temperature averaging 200 degrees Celsius, surrounded by walls of flesh, where torture is a common thing. And in the end, the demons poop out video cards and force you to buy one or they kill your family.

It seems that MSI has added some secret sauce, which no other board partner has, to their card's BIOS. One indicator of this is that they raised the card's default power limit from 130 W to 175 W, which will certainly help in many situations. The card essentially uses the same power as other cards, but is faster, leading to improved performance per watt. Overclocking works great as well and reaches the highest real-life performance, despite not reaching the highest GPU clock. This is certainly an interesting development. We will, hopefully, see more board partners pick up this change.

ROFL HAHAHAAHAAAAAAAAAA. So this is the one you want now, Galidou. "Pros: This thing is pretty amazing. Tried running Skyrim on Ultra, 2k textures, and 14 other visual mods. With this card, I ran it all with no lag at all, with a temp under 67. Love it." http://www.newegg.com/Product/Product.aspx?Item=N8...

Gigabyte did the same; the board power goes up to 180 watts if you tweak it, and still, with both cards overclocked (my wife's Gigabyte 660 Ti OC and my Sapphire 7950 OC), the 7950 wins hands down at 3-monitor resolution.

How can you keep trying to explain things when the only side of the coin you can speak of is Nvidia's? Sorry, I see the good in both, while you can't say one good thing about AMD. Both of my computers use overclocked Intel Sandy Bridge/Ivy Bridge K CPUs; I'm no AMD fan, but I can say I did the right thing: I did my research, had BOTH cards in hand, and tested them side by side with my 3570K at 4.6 GHz.

My 7950 wins at 3 monitors in Skyrim EASILY; you can't say anything to that because you don't have both cards in hand. Geez, will you understand some day. And no, I don't have any problem with my drivers... And I paid the same price for the 7950 as for the GTX 660 Ti. The EXACT same price: $319 before taxes.

Geez, it's complicated arguing with you, because you aren't open to any opinions or facts other than: AMD IS CRAP, NVIDIA WINS EVERYTHING, AMD IS CRAP, NVIDIA WINS EVERYTHING, HERE'S MY LINK TO A WEBSITE THAT SHOWS THE 660 TI BEATING A 7970 AT EVERYTHING, EVEN 6 MONITORS, LOOK LOOK LOOK.

I was speaking to their finances. If you look at one of my other posts, I believed they deserved $20 billion from Intel, but the courts screwed them. That is part of what I meant. They deserved their profits and more. It's tough to make profits when Intel is basically stealing them by blocking your products at every turn.

No comment was directed at "dumb" employees. I said it was hard to overcome, not easy; also that they had the crown for 3 years and weren't allowed their just deserts. I'm sorry you didn't get that from the posts. I like AMD; I just fear they're on their last financial leg. I've owned their stock 4 times over the last 10 years, and it doesn't look like there will be a 5th, is all I'm saying. I sometimes speak from a stock/company-finances position, since I've bought both and follow their income statements. I'm sure they're all great people who work there, no comment on them (besides management's mishandling of Dirk Meyer and the ATI overpurchase).

Exactly this. Anyone who follows these respective matchups, 7950 vs. 670, 7970 vs. 680, etc., knows that the AMD alternatives have excellent overclocking potential. All these reviews are comparing highly clocked GTX cards vs. stock or very conservatively boosted AMD cards. I can get my 7950 to 1000 MHz on stock voltage, and that will destroy this toy they call a Ti. Sorry, but the results seem a bit biased.

Just so we're clear, are you talking about our article, or articles on other sites?

If it's the former, in case you've missed it, we are explicitly testing a reference-clocked GTX 660 Ti in the form of Zotac's card at reference clocks (this is hardware identical to their official reference-clocked model).

Those are the sales numbers referred to there, rarson. Maybe you should drop the problematic amnesia (I know you can't, since having no clue isn't amnesia), but as a reminder, AMD's crap card was $579 bucks and beyond, and Nvidia dropped in the 680 at $499 across the board... AMD was losing sales in rapid fashion, and the 680 was piling up so many backorders and pre-purchases that resellers were begging for relief, and a few reviewers were hoping something could be done to stem the immense backorders for the 680. So: "the GTX 680 marginalized the Radeon HD 7970 virtually overnight." That's the real world, RECENT HISTORY, that bizarro world you don't live in, don't notice, and most certainly will have a very difficult time admitting exists. Have a nice day.

Go look up "bias" in a dictionary instead of flinging around insults like a child. When the adults converse amongst themselves, they like to ADD things to the actual conversation, not unnecessarily degrade people. Thanks! @$$-O

The point I was making was that Nvidia has seeded overclocked cards to the majority of the tech press, while you had a go at AMD for their 7950 boost.

After all the arguments and nonsense over the 7950 Boost, hardly anyone benchmarked it, but plenty still went ahead and benched the overclocked cards sent by Nvidia. Two AMD partners have shown they are releasing the 7950 Boost edition ASAP, prompting a withdrawal of the criticism from another Nvidia fansite, hardwarecanucks.com.

So again I ask: AMD's credibility? The only credibility at stake is that of the reviewers who continually bend over to suit Nvidia. Nvidia has no credibility left to lose.

I'm afraid I have to back you up on this one. NVIDIA released not one, not two, but THREE GT 640s, and I think people have forgotten about that. AMD have replaced the 7950's BIOS and in doing so overclocked it to the performance level where it probably should have been to start with (the gap between the 7950 and 7970 was always far larger than the one between the 7870 and 7950).

Yes, AMD should've given it a new name (7950 XT, as I said somewhere recently), but it's not even two-thirds as bad as the GT 640 fiasco. At least this time we're talking about two models separated only by a BIOS change and the consequently higher power usage, not two separate GPU generations with vastly different clocks, shader counts, memory types, and so on.

If I'm wrong, I'm wrong; however, I don't understand how AMD's GPU division's credibility could be damaged by any of this. Feel free to educate me. :)

For your education and edification: AMD failed in their card release by clocking it too low, because they had lost the power-usage war (and they knew it), and by charging way too much at release. They suck, and their cred is ZERO because of this. It not only harmed AMD, it harmed all of us and all their vendor partners; we all got screwed and all lost money because of AMD's greed and incompetence. Then AMD, in a desperate panic, far too long after the immense and debilitating blunder that also left all their shareholders angry (ahem), after dropping prices in two or three steps and adding 3 games to try to quell the massive kicking their falling sales to Nvidia were taking... FINALLY pulled their head halfway out of its straitjacket and issued permission for a GE version. Now, maybe all you AMD fans have been lying excessively about 78xx/79xx OC capabilities, or AMD is just dumb as rocks and quite literally dangerous to themselves, the markets, their partners, and all of us. I think it's a large heaping of BOTH. So there we have it: AMD cred is where AMD fanboy cred is, at the bottom of the barrel of slime.

Anyway, with you, AMD fails, has always failed, and will continue to fail at everything... I don't know if you think people will read your posts like religious madmen and believe them 100%; you're making it so exaggerated that it's barely readable.

The word "nazi" and the like come back so often when you go down the madman route that it's a wonder anyone gives you any credibility. It's a little sad, because you have nice arguments; you just display them surrounded by so much hate that it's hard to give you any credit for them.

We do exaggerate AMD's performance just for the sake of being fanboys, but not to the point of saying debilitating stuff like you're so good at, not to the point of totally destroying Nvidia and saying it's worth NOTHING, like you do for AMD. I may lean a little to AMD's side because, for my money, they gave me more performance from the Radeon 4xxx to the 6xxx series. I won't forget my 8800 GT either; that was a delight for the price too. But I can reckon when a video card wins at EVERYTHING and is doing WONDERS, and none of that is happening now; it's a mixed bag of feelings between overclockability, optimization in certain games, etc.

When the 8800 GT and Radeon 4870 came out, there was nothing people could say, just nothing; for the price, they were wonders, trampling over anything before and after. But at the same time, you said they were mistakes because they were not greedy enough moves.

Wanna talk about greed? Why is Nvidia so rich? You defend the richest video card maker in history, but you accuse the others of being greedy; society is built on greed, go blame the others. Couldn't they sell their GPUs at lower prices to kill AMD and be less greedy? No. If AMD dies, you'll see greed and $800 GPUs; talk about greed.

Didn't read your loon spiel; again, not even a glance, just part of the first sentence. I won't tell you to shut up or change what you say, because I'm not a crybaby like you. AMD sucks, they need help, and they only seem to fire more people.

To date, your best argument, which keeps repeating itself, is "AMD sucks," which is something you learn to say when you're a kid. You're not a crybaby? Oh, that's new; you keep crying more than anyone else I've seen. TheJian might be a fanboy, but you're closer to the fanatic side of things.

Still, they are the richest video card maker in history, but they still try to manipulate opinions like every company does. Why? If their product is so good and perfect, why do they have to manipulate? I hear you already saying something like: it's because AMD sucks, they suck so much that Nvidia has to make them suck even more by manipulating the stoopid reviewers, because the world is against Nvidia and I'm their crusader... good job.

This is only the tip of the iceberg when we talk about credibility. Anandtech was nice enough to include a stock-clocked part; we can't say that for most of the reviews on the internet.

I even landed on a website (not gonna say which, it would be too much shame for them) that was comparing a non-reference, overclocked 660 Ti with... suspense... a 7850. And then at points the review offered an "alternative analysis" against a 6870. Who's dirty now?

I won't name any, but of all the review sites I usually read, they were all testing overclocked cards (plus the included Nvidia boost) against stock-clocked AMD cards, ALL of them... Only one included minimum frame rates for all of the games tested, which was interesting for seeing the limited bandwidth kick in at certain points. One can only wonder whether future games will have a problem with that.

I first came here to Anand and almost pulled the trigger on buying one RIGHT after finishing the read. Then I visited my other sites and it all got messed up. Anand didn't have minimum frames everywhere, others had different results, and the games I play switch from one brand to the other for the "best bang for my bucks."

With all that mixed-up mess, one can only wonder where the "real" truth is. I'll probably just end up buying a 7950, overclocking it 40-50% higher, and not worrying about future games. At least I waited long enough to see the 660 Ti. Anyway, the other reviewers had quite good results with the 7950, and that was STOCK; omg, a 40-50% overclock can't give bad performance...

*OC 660 Tis on Newegg, and only 3 stock. The author pointed out there is no default version, and partners have somewhat free rein on released clocks. Now be a good person and go look for yourself; you'll have a hard time finding a stock card vs. an OC out-of-the-box card. I'd also like to see that 40-50% 7950 OC... (methinks you really spewed overboard there). Reviews are noting a 17%-22% max performance gain on a maximum 7950 OC, and that doesn't mean it's stable, except on a sole-rider, non-internet-serving, spanking clean, just-defragged, built-for-benching, top-of-the-line-components, reviewer super massive rig. So, can we get that 50% OC bench set from you? NO, of course we can't.

My friend bought the Twin Frozr III while it was on special on Newegg ($300 a week ago), overclocked it to 1150/1700 stable (that's a 44% overclock), and he could go higher, with the stock cooler. We measured gains of around 30 to 36% in games.

On Newegg, there are plenty of people reporting 1150 to 1200 core overclocks, because it is in fact a 7970 board at a very cheap price. If you really can't accept one good thing about AMD, that's where I differ from you.

The thing is, Nvidia won this round for the average user; most of us don't overclock and aren't fiddling with voltages and such. Including a nice boost is good for those average users. The fact is, whatever you might say, overclockers know it: AMD is very overclocker-friendly this gen, end of the line. Cry about it some more, it doesn't change the fact that they already know it, sorry. If you're trying to misinform people, you're too late; it's already circulating on the internet, my friend.

Now you'll say, and I've heard it: "People have been able to get their GTX 680 overclocked to 1300 core in some cases, so they are..." I know the drill. The 680 has, for the most part, a boost clock of around 1100-1150. Let me translate that: a 200 MHz overclock on an 1100 boost clock is an 18% overclock on the cherry-picked 680, and I'm comparing it with a 7950, a chip that didn't pass the 7970 binning requirements.
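For anyone who wants to check the percentage math instead of taking either side's word for it, here's a quick sketch. The inputs are just the figures from the posts above (the 7950's 800 MHz reference core, the ~1100 MHz 680 boost clock), nothing official:

```python
def oc_percent(base_mhz, oc_mhz):
    """Overclock expressed as a percentage of the base clock."""
    return (oc_mhz - base_mhz) / base_mhz * 100

# 7950: 800 MHz reference core pushed to 1150 MHz
print(round(oc_percent(800, 1150)))   # -> 44

# GTX 680: ~1100 MHz boost clock pushed to 1300 MHz
print(round(oc_percent(1100, 1300)))  # -> 18
```

So both numbers quoted in this thread (the 44% on the 7950 and the 18% on the 680) do check out against their respective baselines.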

And you think AMD is much different? Of the 7950s on Newegg, 4 out of 18 models are reference models, and no overclocker will ever buy them because we're already informed of that. Seems like you're not checking both sides' stock; that's what it is to be an Nvidia fanboy, you speak of things without having a clue what you're saying.

Blah blah blah they only have 3 reference cards....

I go on newegg.ca and I see 4 reference designs, 2 of them overclocked; I see 4 stock-clocked cards, 2 of them with non-reference coolers. It isn't different from AMD with its 4 out of 18. Poor newbie, speaking like he knows something while he's in total darkness.

Hey dummy, the initial statement was a crybaby whine that the 660 Ti was reviewed in OC models. Try to DEAL with what I said in response to some idiot not knowing why the 660 Ti had a lot of OC-model reviews. The author here pointed out why, but I'm sure you crybabies didn't read, or just had a rage3d brain fart instead of any comprehension, otherwise you wouldn't have whined. PS, dummy: AMD cards have been out for 9 months for OC models, so checking the Egg now is pretty dang stupid... amazingly stupid, but that's what fanboys do: make stupid idiotic complaints, get corrected, then make more stupid idiotic replies. Get your head screwed on straight.

You're right on that one; still, AMD AT LAUNCH had almost as many overclocked parts available to the public as they have now. How would you know? Like you ever check the stock of AMD cards; you lack information anyway.

But still, on almost 80% of the websites they were reviewed stock-clocked only, as opposed to Nvidia, which pushed for more overclocked parts to be reviewed; 80% of the 660 Tis were OCed. Why be so unfair if your products are FAR superior? Well, because they simply are not THAT much superior.

LIAR: "You're right on that one, still, AMD AT LAUNCH had almost as many overclocked parts available to the public as they have now."

WRONG. THEY HAD EXACTLY 1.

"But as it's turning out, the Radeon HD 7970 isn't going to be a traditional launch. In a rare move AMD has loosened the leash on their partners just a bit, and as a result we're seeing semi-custom cards planned for launch earlier than usual. XFX looks to be the first partner to take advantage of this more liberal policy, as alongside the reference cards being launched today they're launching their first semi-custom 7970s. Fully custom cards will come farther down the line. Of these 4 cards, 2 of them will be launching today: XFX's Core Edition pure reference card, and their customized Black Edition Double Dissipation model, which features both a factory overclock and XFX's custom cooler. It's the Black Edition Double Dissipation..." (ANANDTECH)

There's no reason for "hey dummy," "crybaby whine," or "PS dummy" in our discussion; the lack of respect you show there just proves even more that you're not able to maintain any level of objectivity.

''Get your head screwed on straight''

You're always trying to attack... always. Sad. You're throwing your credibility out the window yourself; nobody else is doing it. You alone are throwing away what little dignity you could have on a forum about video cards... c'mon, we're better humans than that.

If you live in Alaska, maybe this is fine. If you live in AZ like I do, you're having mental issues if you leave the store without a 660 Ti in this battle at $299+.

Power at load: 315 W for the Zotac AMP vs. 353 W for the 7950 (vs. 373 W for the 7950B)!

Never mind what happens when you OC your beloved 7950.

NOISE? The worst 660 Ti (the Zotac AMP) is 49.2 dB. The 7950 is already at 54.9, and the 7950B is even worse at 58 dB (the highest card in the list!). So do you want to hear your speakers and game sounds, or that fan driving you out of the room? We are talking about a 9 dB difference (and dB is a logarithmic scale, so the noise rises exponentially), not even overclocked, for the 7950B in this article's results. The 7970 is less noisy than the 7950B here.
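Since "dB is exponential" keeps coming up: decibels are logarithmic, so a fixed dB gap is a fixed multiplier in sound power. A rough sketch of the gap quoted above; the 49.2/58 dB figures are the review numbers cited in this thread, and the loudness comment relies only on the common rule of thumb that ~10 dB reads as roughly twice as loud:

```python
def power_ratio(delta_db):
    """Sound-power multiplier corresponding to a dB difference."""
    return 10 ** (delta_db / 10)

delta = 58.0 - 49.2  # 7950B vs. worst 660 Ti (Zotac AMP): ~8.8 dB
print(round(power_ratio(delta), 1))  # -> 7.6 (times the sound power)
```

So the 7950B is emitting roughly 7-8x the sound power of the loudest 660 Ti tested, which subjectively lands close to twice as loud.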

I'm looking at the Sapphire 7950; no overclocker will buy a reference design and overclock it unless they want a jet engine... The Sapphire OC is really quiet even under load, and it gets to 1150 core on low voltage at a low enough temperature. If it runs too hot, I'll even give up 50 MHz; it will still translate into a 40% overclock.

The thing is, I don't play Battlefield 3 and never will, same for Portal 2. I'm a Skyrim player looking for more memory than the 1 GB of my 6850 Crossfire ATM. I'll play other games, but the real strength of Nvidia is mainly in those 2 games.

You got a 30-inch monitor? Otherwise the 660 Ti is better in almost all games, usually by more than a 20% margin, and in some cases, as I already proved, even vs. the 7970 GHz Edition.

As already stated, the 660 Ti is less noisy, even the factory-OC'd ones tested here and anywhere around the web. These are not HIGHLY clocked (none of the 79xx series in the benchmarks are, except for the GHz Edition), and all are noisy compared to the 660 Ti.

The power of NV in only two games? Already debunked, unless you have a 30-incher or a multi-monitor spanned res (way over 2560x1600), and in both cases you'd be running something like a GTX 690 or more than one card. See my other posts in this review's comments section. I've done the comparison work for you (and tore Ryan a new one while at it). All you have to do is read it and verify it all. Easy. You're wrong. Or prove it at 1920x1200 or below.

This card comes with 2GB :) You're welcome ;)

Show me a REVIEW site showing a 1150 core clock on a 7950 that isn't pumping out an extra 80 watts to do it and I'll accept that opinion. http://www.guru3d.com/article/radeon-hd-7950-overc... 1.25 V, 79 watts more than the 7950 default.

http://www.bls.gov/ro5/aepchi.htm Bureau of Labor Statistics: average electricity cost in the USA is 13.5 cents per kWh. http://www.citytrf.net/costs_calculator.htm At 3 hours per day (how much do you game?), 13.5 cents, and roughly 80 extra watts, running at 1150 costs you about $12 per year... Game heavily on weekends, or more during the week too? I know people who easily put in 10-20 hours on a weekend with a hot game... LOL, I wish I had that much time. Factor that over 3 years, or live in a higher-cost area (Miami FL, anywhere in CA, Chicago IL), and it goes up. So with your modified version it will cost you at least ~$36 or so, not to mention the heat it generates for 3 years, which may make you turn up your AC if you're in AZ/TX etc... :)

Assuming you own the card for a good 3 years before upgrading again and are a regular gamer (you have dual cards already; safe to say SLI users game more than most?), this is going to cost you some extra cash. It's NOT just the purchase price that gets you. Also, I'm only talking vs. the 7950 at regular speeds, not vs. an already lower-watt (0.987 V) 660 Ti at default, which is already cooler too no matter where you look. So the savings are even more pronounced (say $40 instead of $36 over 3 years at 3 hours).
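The electricity math above is easy to redo yourself. This little sketch just plugs in the numbers quoted in the post (~80 W of extra draw, 3 hours of gaming a day, 13.5 cents/kWh); your own wattage, hours, and local rate will obviously vary:

```python
def yearly_cost_usd(extra_watts, hours_per_day, cents_per_kwh):
    """Yearly cost in dollars of an extra sustained power draw."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * cents_per_kwh / 100

cost = yearly_cost_usd(80, 3, 13.5)
print(round(cost, 2))   # -> 11.83 dollars per year
print(round(cost * 3))  # -> 35 dollars over 3 years, the ~$36 ballpark above
```

Bump the hours to a heavy gamer's 6 per day and the 3-year figure roughly doubles, which is where the larger estimates in this thread come from.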

What can you do with $40 in savings over 3 years? Add another 8GB DDR3 module (or 2x4GB) to your machine... Buy a K chip instead of a regular Ivy/Sandy (really, any point in Sandy now, for a $10-20 savings? Don't buy Sandy, people), and get a 50-pack of blank Taiyo Yuden DVD-Rs or some DLs, or a 25-pack of Blu-rays with that K chip. You get the point. :) It's FREE money if you choose correctly.

Not to say I'm telling you to buy a 660 Ti... IF you have a 30-incher and run at 2560x1440, I'd hint at a 680 and OC it instead. :) I am completely against dual cards because of heat, noise and watts... but if you couldn't care less about those things, the sky is the limit for you. I will NEVER buy 2 cards when I can get ONE with a dual-chip config for anywhere near the same price. It will always win on watts/not splitting memory/heat etc... It's just hard to beat a card like the GTX 690 in all these areas vs. SLI/Crossfire. I want my PC silent and COLD. If you have no speakers, or like my nephew game 90% of the time with headphones, then again maybe 2 cheaper cards would rock for you vs. a 690 etc. Personally it's against my religion... :)

Model #: 100352OCSR
Model #: 100352FLEX
Model #: 100352SR

All Newegg... Make sure you get the right one if you really want a 7950 Sapphire OC. Note the connector differences; I'd take the top one if I were you. Not sure why the bottom one exists, as it seems a dupe of the same-priced top... but you'll need to look closer, I don't want one :) Sorry. http://www.newegg.com/Product/ProductList.aspx?Sub...

Nice card though, and I saw a review of the 7970 version at HardOCP (one of the best they've seen, they said); it was BUILT very nicely, and they commented on the component-level parts being better and more suited to OC. Not sure if it's the same here. Just an observation of the big brother :)

I play on either my 1080p TV or my 3 1920x1200 monitors (mainly on the monitors now). I only had the TV before, so it was OK with my 6850 Crossfire, but now I'll need more memory, and the main game I play, almost the only one, is Skyrim. I can't use the texture pack and some high details, because in some caves and some places the memory limits me badly. I plan to replace the TV with one of those new high-resolution ones when they come out. In Crossfire you don't add the memory together; you have the video memory of only one card, so no, it's not 2 GB, it's 1 GB, thanks.

I want the 7950 OC with the 950 MHz core; I already did my research prior to the 660 Ti review. They've been out for many months; I just work too much during the summer, so I wasn't in a hurry and I wanted to see the competition.

There you go: they even have thermal pictures of the whole system/card, which is something I was looking for. It's not a performance review against the competition; it's only about overclocking and power usage. I don't care too much about the extra 80 watts, because I was ready to buy a GTX 580 at a low enough price, and its stock-clock power usage is about the same as an overclocked 7950's.

Page 11 is the one I'm looking at; I just want to get to 1100 MHz, when it seems everyone gets 1150 with this card.

Before the patch in AMD's drivers, Nvidia had a clear advantage, so the 660 Ti seemed like the obvious choice when it came out (I thought so). Then these drivers and optimized games are mixing everything up... If only they cared about us, and not only their products' performance and thus their profit; but I know it will never happen. Money makes wars everywhere, and it will NEVER stop.

In SLI/Crossfire this card will give you 2GB (double your 1GB now; in either single or SLI/Crossfire you'll get 2GB), that's what I meant. I was talking about the 660 Ti, or whatever upgrade you do: you get the size of 2GB (one card's worth; SLI is the same thing). 2GBx2GB should be fine for Skyrim at 1920x1200 (no card here was punished at 2GB, 1920x1200). I already pointed to an article that shows NO difference from 2GB to 4GB in ANY game they tested: http://www.guru3d.com/article/palit-geforce-gtx-68... "But 2GB really covers 98% of the games in the highest resolutions." It even worked at 5760x1200 below. :) Look at HardOCP, who tested on 3 monitors at 5760x1200: 30 fps min, 58.6 avg, 104 max, on a single GTX 680. Two 660s should smoke that single 680.

Not quite sure I understand your gaming-on-3-monitors comment... You mean 5760x1200? Or are you only gaming on ONE of them at 1920x1200? Or something else? So you are planning on buying 2 video cards again? I thought you were just replacing with one card at 2GB, but if you're gaming at 5760x1200, that's another story. I'd just buy a GTX 680 and OC. http://www.hardocp.com/article/2012/03/22/nvidia_k... "For the apples-to-apples test we lowered the AA setting down to 4X AA, just in case there were some hidden bottlenecks. Lowering the AA setting to 4X AA only made things worse for the Radeon HD 7970. The GeForce GTX 680 increased its advantage to a 29% performance advantage over the HD 7970!"

Granted, as shown below, with Anand's 12.7 drivers they got better, but still lost, so don't expect NV to fall behind in the above test even with changed drivers; it didn't help below. NV won both tests anyway in Anandtech's testing... :) Anand, 12.7 drivers, Skyrim, 7970 GHz Edition, slaughtered at 1920x1200: http://www.anandtech.com/show/6025/radeon-hd-7970-... All NV cards are CPU-limited at that res (GTX 580/670/680), but they all win by 10 fps with 4xMSAA/16xAF. Even at the useless 2560x1600 res, the 680 still wins... And that's a REFERENCE model they're comparing it to. The 680 does much better than this with what you actually BUY, but this is vs. the GHz Edition, so don't expect much more from your overclock, and heat/noise will get worse. You'll only get another 15% at 1150 over what they are already benching here (if that; scaling isn't perfect).

From your own quoted article, voltages vary, and as I pointed out to another guy, it needed 1.25 V to hit 1150. "Secondly, PowerTune doesn't register increases in the GPU voltage and the big energy consumption increases that come with it. The technology is therefore incapable of fully protecting the GPU and the card. AMD says that OVP (Over Voltage Protection) and OCP (Over Current Protection) are still in place but, as we were able to observe, these technologies cut all power to the card when limits are exceeded."

YOU can damage your card, as I stated before; this can't happen on the 600s. http://www.guru3d.com/article/radeon-hd-7950-overc... 1.25 V for 1150. Not all cards are the same. Look at the chart in your own link. Scroll down to where they show the chart, with 1.20+ V being REQUIRED, to reach this statement: "The maximum clock on the Radeon HD 7900s generally seems to be between 1125 and 1200 MHz when the GPU voltage is adjusted." Look at the chart above that line... 1.20 V is REQUIRED in most cases to hit anything over 1100. That is the FIRST voltage where they all reach 1100. At 1.174 V, two couldn't even hit 1100 (the HIS and the reference card). ONLY 3 cards hit 1125 at 1.25 V; only ONE went above. RUSSIAN, are you listening?... LOL. There are 11 cards in the list; be careful assuming all Sapphire OCs are the same. They are not. From page 19: "When the GPU voltage is changed this goes up to an increase of between 21 and 78%, which is enough to put the power stages of these cards under stress."

That 78% more power draw is a lot of watts, and it'll produce a lot of heat/noise. Thanks for the link, it's a good article; I look forward to the 660 Ti article there to see the comparison (hopefully he'll make one, there were only Radeons in this article). Also note his page 14 comments and the charts showing heat stuck inside the PC frying other components, since these cards don't expel the heat out the rear well: "The reference HIS, MSI Twin Frozr III and Sapphire OverClock Radeon HD 7950s and Radeon HD 7970 Lightning however tend to direct more hot air towards the hard drives." Your card is in there... Only two cards didn't do that. Another downer for the 7950 cards in my mind. The CPU in your case (with your OC card) would be 5C higher, as shown in the chart vs. the reference model. That's a lot of C added to your CPU. Paying attention, russian? All of this affects your glorious 1150 speeds.
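Why a voltage bump costs so disproportionately much: dynamic GPU power scales roughly with frequency times voltage squared. A minimal sketch of that rule of thumb; the base voltage and clocks below are illustrative assumptions, not measured values for any specific card.

```python
# Rough first-order model of dynamic power: P ~ f * V^2.
# All numbers are hypothetical, picked only to show the shape of the curve.

def scaled_power(p_base, f_base, f_oc, v_base, v_oc):
    """Estimate overclocked power from base power using P ~ f * V^2."""
    return p_base * (f_oc / f_base) * (v_oc / v_base) ** 2

# Assumed example: a card drawing 138 W at 800 MHz / 1.09 V,
# pushed to 1150 MHz / 1.25 V.
p_oc = scaled_power(138, 800, 1150, 1.09, 1.25)
print(round(p_oc))  # well over 200 W from a ~44% clock bump
```

The point of the sketch: the ~44% clock increase alone would add ~44% power, but the squared voltage term is what pushes the total way past that, which matches why reviewers see such large jumps once they touch voltage.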

Galidou has a better article than I used, as it is more complete regarding the "OTHER" effects a 1150MHz overclock on your Radeon will get you (CPU temps rising, HDs, mem sticks etc.), not to mention these cards don't expel the heat. I'm wondering if his 660 Ti article will show case temps suffering too... ROFL. Thanks again Galidou. GOOD INFO. Right now my decision is the same, but after he does a 660 article I hope the issue doesn't become confused :)

Lol, you're so fun. People overclocked the GTX 580 and it gave out more heat and ate more current than anything this gen can even imagine, and no system has died of it yet... I gave you an example that it can easily hit 1100-1150 without drawing even 80w more, and still you have to speak and speak, always saying the same thing, trying to make me believe that if I ever buy a Radeon card I will be disappointed, I will eventually catch cancer and die of it... Thanks, your fanaticism is appreciated. The more I speak with fanboys of both sides, the more I think I'll have to stop playing on the computer for fear of becoming like them.

In Skyrim on 3 monitors it's clearly ahead. I really thought you'd see sense after I told you the games I play, but it seems you're as stubborn as someone can be. My wallet is speaking: my Radeon 6850 crossfire made more heat in my system than this 7950 overclocked alone will, and yet I'll gain in performance.

Skyrim with mods isn't shown anywhere; it fills over 2GB of RAM as soon as you ramp up the mods in there. I'm playing with 30 and my 1GB is crying at me to stop. If you don't know the game you're speaking of, just don't comment, please.

You can't damage your 660 or 670? Lol, fun stuff. I'm a fan of reading 1-egg reviews on newegg and there are plenty on both sides (AMD and Nvidia) claiming that their card died; for ATI they died in the first week. The 670 is newer than the 7950 and still numerous cards have 10 to 30% 1-egg/2-egg ratings because of fried cards, with and without overclock, ranging from 1 day of ownership to 3 months.

I guess 28nm isn't at its peak yet and that is reflected. One guy claimed he overclocked his 670 and when the boost went past 1290 it would black screen and shut down in Battlefield 3 (verified owner in the forums). Why would someone who owns the card say stuff like that?

The articles you've shown me for Skyrim are from before the big patch in AMD's drivers. Don't get what I say? Look at the anandtech review of the GTX 680 (4 months ago I think) and the review of the GTX 660 Ti and watch the difference. You obviously didn't do the research I did. The driver is about a month old and it gave VERY big improvements in Skyrim; just watch any new review dated end of July to now and you will understand.

Seeing you had to speak again so much, and seeing the lack of information you have, this is ending now. Thanks anyway, but your stubbornness dragged me out of myself and I'm tired of speaking with someone who already has a chosen side that blinds him to the point he cannot bring arguments that are valid on every front. Been nice tho.

I meant over 2GB of VRAM with the texture pack. I'm an overclocker, I've been overclocking things for the past 15 years, and still you try to explain things about heat to me while you can't even find articles reflecting the real performance of AMD with the newer drivers.

I'm the kind of enthusiast you can find on the overclock.net forums; you've argued the wrong way with the wrong person. Sorry if my english wasn't perfect all along, but I'm a French Canadian.

Still, temperature rises on components like you said have been experienced for years. My Radeon 4870 is super hot and still working in my wife's system, and nothing has died. It leaves more heat in my whole system than the Radeon 7950 alone will, and I overclocked the darn thing.

You're speaking like you're trying to put on a show for everyone reading you, but no one is reading because it's too long for them to bear and it isn't even current. So cut the show and the "Galidou has a better article than I used" bit, as if you were speaking to someone else; we're arguing together, and you got lost in your information. Not even knowing of the BIG perf improvements in the 12.8 Catalyst drivers shows your inexperience. GG

Oh, and I forgot, one last thing, then you'll NEVER ever hear from me again on this forum, so feel free to speak into the emptiness; the more we add, the fewer people will get to read it.

2560 x 1600 = ~4.1 million pixels

1920 x 1200 x 3 monitors = ~6.9 million pixels

2560x1600 is CPU limited on anandtech, meaning all the video cards in the review that are stuck at 84-86fps will go higher, so the 7950 will be ahead of the 7970 and ahead of the GTX 670, which is $100 over the price I will pay, because I'll wait for the special to come back, and there's never any special on Nvidia cards because they are "too good".
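The pixel-count comparison above is easy to sanity-check. A tiny sketch:

```python
# Compare total pixels pushed per frame for the two setups being argued about.

def megapixels(width, height, monitors=1):
    """Total pixels per frame, in millions."""
    return width * height * monitors / 1_000_000

single = megapixels(2560, 1600)       # one 2560x1600 monitor
triple = megapixels(1920, 1200, 3)    # three 1920x1200 monitors
print(single, triple, round(triple / single, 2))
```

So triple 1920x1200 is roughly 1.7x the pixel load of a single 2560x1600 panel, which is the core of the "3 monitors is a heavier workload" argument.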

And stop showing overclocked results and damage stories about the lousy reference board; my system is watercooled and the video card surely will be too in the end. And stop showing me GTX 680 results, it's above what I want to pay. I'm replacing my crossfire with one 7950, and watercooled they EASILY get to 1300 core clock, which crushes even a stock-clocked GTX 680 ANYWHERE.

Now look at 5760x1080 (a lower resolution than I'll use) and look at where the 7950 (not overclocked, lousy reference board) sits; just get the picture with the new drivers now. I leave it up to you to find any 5760x1080 results from before the driver release. In the future, learn to find results by yourself.

I know you don't want any AMD cards, it's pretty obvious, while I don't care about the brand. It's only the fanboys on each side who seem convinced the gap is SO IMMENSE, while I don't see much of a gap when you consider price/performance, and as always it depends on the game.

Considering I'll play Skyrim on 3 monitors, unless you're a freaking blind fanboy it would be hard to recommend the 660 Ti... The Sapphire OC at 950MHz isn't even on this site; it's simply a reference 7970 board with a 6+8 pin PEG connector, which supports the overclock with the reference voltage at 1000MHz core easily.

If I didn't have the knowledge of and desire for overclocking I have now, the choice would be freaking obvious: 1080p gaming without overclocking, welcome 660 Ti, the card would be on its way. But I want to overclock, and everyone in the forums at overclock.net already knows it, and however big your doubts are, 90% of them report super overclocks.

Now don't bring me some of your comparisons with the 7950b and reference coolers or I just won't answer back; AMD has as many non-reference boards as Nvidia has. WTF, wake up... guru3d overclocking a reference board with the lousy, worthless, non-selling reference fan when 75% of the cards have way better coolers. Good way of representing reality... C'MON.

BTW, I live in Canada, province of Quebec, and where I live the electricity is quite cheap, like really cheap.

And it's $350. The only BOOST edition on newegg 2 days after this review.

A full six 660 Tis at $299 (one after rebate). So, unfair not to include a card that carries what looks like a $50 premium over the Ti? I beg to differ. Also, there are 11 different 660 Ti cards available to BUY. Nuff said?

It was rightly picked on. Google "7950 boost": you get $349 cheapest and availability is next to none. Google "7950b": you don't even get a shopping result. The cheapest Radeon 7950 at newegg is already $319.99 (most after rebate). If you're looking at 1920x1200 and below, the 660 Ti is a no-brainer. It is close in the games it loses, and dominates in a few it wins. Not sure why the reference nvidia 660 Ti is even in the list; you don't buy that. Zotac's $299 card is basically the bottom of what you'd buy and is faster than the ref design at 928MHz/1006 boost (not 915/980 boost), so consider the Ti GREEN bar slower than what you'll actually buy for $299. Heck, the 6th card I mentioned at $299 after rebate runs its base at 1019 and boost at 1097! So they are clocking regular cards a full 100MHz faster than REF for $299. Another at $309 is similar (1006/1084 boost). Knowing this, you should be comparing the Zotac AMP (barely faster than the two I mention at $299 and $309) vs. the 7950, which is $320 at minimum!

Zotac AMP (only 14MHz faster base than the $299/$309 cards) vs. 7950 (again $20 more expensive) @ 1920x1200:
Civ5: <5% slower
Skyrim: >7% faster
Battlefield 3: >25% faster (above 40% or so in FXAA High)
Portal 2: >54% faster (same at 2560x... even though it's useless IMHO)
Batman Arkham: >6% faster
Shogun 2: >25% faster
Dirt3: >6% faster
Metro 2033: a WASH (Zotac 51.5 vs. 7950 51... margin of error, LOL)
Crysis Warhead: >19% loss
Power @ load: 315w Zotac AMP vs. 353w 7950 (vs. 373w for the 7950B)!
Not only is the 660 Ti usually faster by a whopping amount, it's also going to cost you less at the register, and far less on the electric bill (all year, for the 2-4 years you'll probably have it, assuming you spend $300-350 on a gaming card to GAME on it).
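The electric-bill claim is easy to put a rough number on. The 38w load gap (353w vs. 315w) comes from the figures quoted above; the hours of gaming per day and the price per kWh are made-up assumptions, so treat the result as an order-of-magnitude sketch only.

```python
# Back-of-envelope yearly cost of a wattage gap between two cards at load.
# delta_watts is from the review figures; hours/day and $/kWh are guesses.

def yearly_cost(delta_watts, hours_per_day, dollars_per_kwh):
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

# 38 W gap, assumed 3 hours of gaming a day at an assumed $0.12/kWh:
print(round(yearly_cost(38, 3, 0.12), 2))  # a few dollars per year
```

So at typical usage the load-power gap alone is small money per year; the argument really hinges on heat and noise more than the bill, especially somewhere with cheap electricity.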

For $299 or $309 I'll RUN home with the 660 Ti over a 7950 at $319. In the games where it loses, you won't notice the difference at those frame rates. At today's BOOST prices ($350) there really isn't a comparison to be made. I believe it will be a while before the 7950B hits $320, let alone the $299 of the 660 Ti.

NVIDIA did an awesome job here for gamers. I'll wait for Black Friday in a few months, but unless something changes, perf/watt wise I know what I'm upgrading to. I don't play Crysis much :) (ok, not at all). Seeding higher-clocked cards or not, you can BUY them for $299; you can't buy a BOOST for under $350. By your own account, only two makers do a 7950 BOOST. Feel free to retract your comment ;)

They use Crysis 2 almost everywhere on the internet for one reason: it's heavy. No one plays 3DMark because it's not a game, yet it's always included in reviews because it's relevant to performance.

Read it again...He said NOBODY plays CRYSIS. He's confirming what I said.

The complaint wasn't about Crysis 1... It was about benchmarking a game from 2008 that isn't played, based on CryEngine 2, which a total of 7 games have used since 2007: Crysis 1, Warhead, Blue Mars (what? not one metacritic review), Vigilance (what? no PC version), Merchants of Brooklyn (no reviews), The Day (?) and Entropia (?). Who cares?

The complaint is that AnandTech should use CRYSIS 2, with the hi-res texture patch and DX11 patch, with everything turned on. The CryEngine 3 engine is used in 23 games, including the coming Crysis 3! Though after a little more homework I still think this will be a victory for AMD; it's far more relevant and not a landslide by any means. But it IS relevant, NV loser or not. Crysis 2 is still being played and I'm sure Crysis 3 will be for at least a while. 3x the games are made on this engine... Warhead should be tossed and Crysis 2 used, but not without loading the 3 patches that get you all this goodness.

Yes, we all play 3DMark and upload our scores and compare. Not sure about you; you only play one game that now conveniently got an AMD driver boost. Good for AMD, they actually did something for once, although I'll be glad to hear how many times it crashes for you each night at 1300 watercooled. It will be a LOT. Believe me. 30 mods, not as many as myself, but you'll be going down with CCC often.

Of all the video cards I've had, and I had a LOT, from the GeForce 2 GTS up to my just-retired 6850 crossfire (I just received my Sapphire 7950 OC), I had close to 0 problems. How could you know anything about CCC when it's obvious you haven't had an AMD video card in years?

I have 30 mods because they were already straining my limited video memory, and I already had a problem with one of them (realistic sounds of thunder) which, as I found out lately, was related to my hi-fi sound card driver (Asus Xonar STX).

I had no problem with CCC at all, other than using it to scale my LCD TV so it fills the whole screen and using my game profiles. I didn't touch it much in the last year. It played Dirt 2 and 3, Skyrim, GTA 4, Fallout 3, Fallout NV, Oblivion!!, and so on without a problem. And yet you try to tell me I'll have problems with a program you don't know a thing about.

But just so you might appreciate my efforts: my wife decided to replace the 4870 for the forthcoming Guild Wars 2, for energy and temperature reasons. So I got her a 660 Ti, as my 6850s were already sold to a friend. She games at 1080p only and I didn't want to overclock her stuff, so it was obvious. At the same time I'll be able to compare both, but I already know I like Nvidia's UI more than AMD's CCC, though they look quite alike now.

You're another idiot who gets everything wrong, attacks others for what they HAVE NOT SAID, gets corrected again and again, makes another crap offshoot lie, and then, OF COURSE, HAS A PERFECT DUAL AMD SETUP THAT HAS NEVER HAD A PROBLEM, EVA! That means you have very little experience, a freaking teensy tiny bit. Look in the mirror, dummy.

The 7950b is crap; I don't even want to hear about a reference design with a little boost. On newegg there are 4 cards out of 18 that are reference, and the others are mainly overclocked models with coolers a LOT better, which will overclock terribly well.

It's easy for the average user to see the win for Nvidia considering 20% of the overclock has already been done and there's not much headroom left... Once both are overclocked, the only game where the 660 Ti remains faster is Portal 2.

The Zotac might only have 14MHz more on base clock, but the core clock is not the thing here; the Zotac is the best of the pack because it comes with memory overclocked to 6.6GHz, which addresses the only weakness of the 660 Ti: memory bandwidth. There's a weird thing in here though: I found the minimum fps in another review, but on anandtech the minimums appeared only in the games where they were less noticeable. Good job again, Nvidia.

The 660 can go to 1100/1200 as easily as the 7950 gets to 1150 (so another 10% faster). Check the Asus card I linked to before. You'll have a hard time catching the 660 no matter what, and it costs you, as noted by anandtech and by my comments on watts/cost/heat etc.

Memory bandwidth isn't the issue here, and all of it overclocks fairly close. We don't run at 2560x1600. It's not the weakness. That is a misnomer perpetuated by Ryan beating it like a dead horse when only 2% of users run any res above 1920x1200. I debunked that idea further by showing that even monitors at newegg, including 27-inchers, don't use that res. IE, no, bandwidth isn't the problem. A bad review on Ryan's part, with no conclusion, is the problem. The CORE clock/boost is the thing when it's not a bandwidth issue, and it's already been shown not to be one. LOL, yep, an nvidia conspiracy, the minimums were rigged here too... ROFL. Good luck digging for things wrong with the 660 Ti. Minimums are shown at hardocp, guru3d, anandtech and more. Strange that you even brought this up with no proof.

The NV cards have only been upped 100MHz, which is about ~10%, not 20 like you say. A 100MHz bump over 915 isn't 20%. You CAN get there, but not in the out-of-box experience. I'd guess nearly all of the memory will hit 6.6GHz; it's common for a 7970 OC / GTX 680 to hit 7+GHz.

I said 20% because most of their cards are way above reference clocks; I was just representing reality, not the reference things. When you can buy factory overclocked cards at almost the same price, let's say a $10 premium, mentioning the reference clocks is almost useless. Plus, over the internet, 80% of the reviews used factory overclocked cards, so the performance we see everywhere, the one in everyone's head, already includes close to a 20% overclock.

So in fact there's maybe 10-15% of the juice left for fellow overclockers. I'm estimating; it could be more with better chips. Meanwhile the 7950 as we know it has been reviewed everywhere at its reference clocks/fan, and if you take one with an aftermarket cooler and get, let's be honest, say 40%, it's far ahead compared to the reference reviews we have.
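The headroom arithmetic being argued over is simple enough to sketch. The clock numbers below are the ones thrown around in this thread (915MHz reference, ~1019MHz factory base, ~1122MHz manual OC); treat them as approximate.

```python
# How much of a card's overclock is already "used up" by factory clocks,
# versus what's left for the buyer. Clocks are approximate figures from
# this discussion, not official specs.

def percent_over(base, clock):
    """Percentage increase of `clock` over `base`."""
    return (clock - base) / base * 100

ref_660ti = 915       # reference base clock
factory_660ti = 1019  # a typical factory-OC base clock mentioned above
manual_660ti = 1122   # the manual OC cited from benchmarkreviews

print(round(percent_over(ref_660ti, factory_660ti)))   # ~11% already done
print(round(percent_over(factory_660ti, manual_660ti)))  # ~10% left to gain
```

Which is the crux of the disagreement: measured against reference the factory cards are ~11% up (closer to 10% than 20%), and the remaining manual headroom on top of a factory card is roughly another 10%.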

When I look at things again and again, memory bandwidth doesn't seem to be much of a problem. The only place I can guess it could hurt is in new games coming out with heavy DirectX 11 graphics, something that taxes the card in every aspect. Other than that, for now, the card doesn't seem to have any weakness at all.

I never thought it was a real weakness for the moment; the future will tell, but even there, 90% of gamers play at 1080p or less and 80% of that 90% pay less than $150 for their video cards. For those paying more, it all depends on the chosen side, the games they play, overclocking or not, and the money they want to spend.

Take overclocking out of the equation and Nvidia wins almost everything by a good margin. Anyone playing at 1080p won't be disappointed by any $200+ card unless they insist on playing everything on ultra with 8x MSAA.

You're going to have a CRAP experience and stuttering junk on your Eyefinity in between crashes. Come back and apologize to me, and then thejian can hang his head and tell you he tried to warn you.

In Shogun 2 and Batman AC at 1080p almost none of the new cards are being stressed. I think you should increase the quality to Ultra for the new 2-3GB generation of cards, even if the <1.5GB VRAM cards suffer, and bump AA to 8x in Batman. Otherwise all the cards pass these benchmarks without a problem. Same with Skyrim: maybe think about adding heavy mods, OR testing that game with SSAA or at least 8xAA. Even the 6970 is getting >83 fps. Maybe you can start thinking about replacing some of these games; they aren't very demanding anymore for the new generation of cards.

Russian, it's unlikely that we'll ever bump AA up to 8x. I hate jaggies, but the only thing 8x AA does is needlessly slow things down; the quality improvement over 4x is negligible. If 4x MSAA doesn't get rid of jaggies in a game, then the problem isn't MSAA.

Consequently this is why we use SSAA on Portal 2. High-end cards are fast enough to use SSAA at a reasonable speed. Ultimately many of these games will get replaced in the next benchmark refresh, but if we need to throw up extra roadblocks in the future it will be in the form of TrSSAA/AAA or SSAA, just like we did with Portal 2.

I was speaking a bit on both. The article insinuates that the 660ti is on the same performance level as the 7950. The obvious caveat to your results is that it is ridiculously easy to overclock the 7950 by 35-45%, and GCN performance scales pretty well with clock increases. It should be noted in the article that the perf of 7950 OC'd is beyond what the 660ti can attain. Unless you guys can OC a 660ti sample by 30% or more.

Is this the exact same way we treated the GTX 460 reviews? With some supermassive OC in the reviews, so we could really see what the great GTX 460 could do? NO >>>>>>> The EXACT OPPOSITE occurred here, from all of your type of people. Did we demand the 560 Ti be OC'ed to show how it surpasses the AMD series? NOPE. Did we go on and on about how massive the GTX 580's gains were with OC even though it was already far, far ahead of all the AMD cards with its very low core clocks? NOPE, there we heard power whines. Did we just complain that the GTX 680 is not even in the review while the 7970 is? Nope. How about the GTX 470 or 480? Very low cores; where were all of you then, demanding they be OC'ed because they gained massively? Huh, where were you?

Performance scales pretty well on both designs, but AMD is just a little better at overclocking because the base clock seems terribly underclocked. It just feels like that, but that must be for power constraints and noise on reference designs.

http://www.guru3d.com/article/radeon-hd-7950-overc... "We do need to warn you, increasing GPU voltages remains dangerous." From page 2. Also note he says you can't go much over 1GHz without raising volts (he hit 1020 at default). Raise your hand if you like spending $300 only to blow it up a few days or months later. Never mind the heat, noise and watts this condition will FORCE you into. His card ran a full 10C hotter and 6dB noisier than at defaults. He hit 1150/1250 boost max, only 50MHz more than the 660 Ti. Nice try. From page 11 of the 7950 BOOST review at guru3d: http://www.guru3d.com/article/radeon-hd-7950-with-...

"In AMD's briefing we notice that the R7950 BOOST cards will be available from August 16 and onwards, what an incredibly coincidental date that is. It's now one day later August 17, just one AIC partner has 'announced' this product and there is NIL availability. Well, at least you now have an idea of where the competition will be in terms of performance. But that's all we can say about that really."

Note you can BUY a 660 Ti for $299 or $309 that is clocked by default only 14MHz less than the Zotac AMP in this review. If Ryan is to note what you want in the article, he should also note it will possibly light on fire, or drive you out of the room with heat or noise while doing it. AMD isn't willing to BACK your speeds. Heck, the DEFAULT noise/heat alone would drive me out of the room, never mind the extra cost of running it at your amazing numbers... LOL. A quick look at the 7950B already tells the story above its ref speeds/volts. RIDICULOUS NOISE/HEAT/COST.

From the guru3d article above: measured power consumption of the default card = 138 watts; measured power consumption OC'd at 1150MHz (1.25 volts) = 217w!! Note the one in Ryan's review is clocked at 850MHz and is already 6.5dB louder than the 660 Ti AMP. I want a silent (or as close as possible) PC that won't heat the room or every component inside it. http://benchmarkreviews.com/index.php?option=com_c... 1122MHz CORE on a 660 Ti, with the GPU boost hitting 1200! That's 31% above stock (and I only googled one OC review), and I don't think it runs as HOT as your 1150 would, nor as noisy. Scratch that, I KNOW yours will be worse in BOTH cases. Just look at this review at anandtech with the Zotac at 1033MHz already. The Zotac AMP is also 5C cooler already, and you haven't got over 850MHz on the 7950 boost here. Try as you might, you can't make AMD better than they are here. Sorry. Even Anand's 7950 boost review 4 days ago says it's hard to argue around the heat/noise problem added to the already-worse regular 7950 vs. the 660 Ti. Not to mention both 7950s are more expensive than the 660 Ti. It's all a loss, whether or not I like AMD. Heat/temp/watts are WORSE this time around on AMD, and raising clocks/volts only makes it worse.

I already pointed out in another post that Ryan should have posted 900MHz scores, not to help AMD, but because that's what I'd buy if I were looking for an AMD card among the going market cards on newegg. You just wouldn't purchase an 800MHz version (or even 850MHz), but AMD would have paid the price in the heat/watts/noise scores if Ryan did it. I would still rather have had it in there. Anandtech reviews seem to always reflect "suggested retail prices and speeds" rather than reality for buyers. That still doesn't help your case though.

It's not ridiculously easy to OC a 7950 boost 35-45% higher... and it loses a lot at 1920x1200, by huge margins, and Warhead is useless as shown in my other posts; the boost is now a loser in Crysis 2 vs. 660 Tis of any flavor above ref. http://www.guru3d.com/article/radeon-hd-7950-with-... Crysis 2 ultra, uber DX11 patches, everything on high: a WASH for the 7950 boost vs. a REF 660 Ti! Why did Anandtech choose the 2008 version? And that's a ref 660 Ti, which nobody would buy given the $299/$309 pricing of high-clocked versions, default, no fiddling necessary, no voided warranty or early hardware deaths. You seem to ignore what happens when you OC things past reasonable specs (already done by AMD, with heat/noise/watts above the Zotac AMP here). I suspect AMD didn't want their chip to look even worse in reviews.

Argument over. I win... :) So does your wallet, if you put your fanboyism away for a bit. Note I provided a google search to my RADEON 5850 XFX purchase complaints (regarding backorder) at amazon in another post here. I love AMD but, c'mon... they lost this round, get over it. You might have had an argument for a 7950 boost at $299 that was actually COOLER than the regular 7950 and less noisy. But with both being worse, and the price being higher... it's over this round. Note the cool features of the 600 series cards in the above OC article. It's safe at 1122/1200! It's safe no matter the card; though they vary, you can't hurt them (per-card settings are different... the ultimate OC without damage). Nice feature.

I'd argue blow by blow over 2560x1600 (as you can prove NV victories depending on the games), but I think it's pointless, as I already proved in other posts that only 2% actually use that res or above. Meaning 98% are using a res where the 660 Ti pretty much TRASHES the 7950 in all but a few games I could find (1920x1200 and below). (Hit post but it didn't post... sorry if I'm about to double post this.)

It's fun to see that Nvidia has reached a very good power consumption and heat level compared to the generation before, and how they mention it now. But when AMD fanboys were mentioning it, comparing the 6xxx against the GTX 5xx, they were just denied and told it wasn't important.

''Measured power consumption OC at 1150 MHz (1.25 Volts)=217w''

Wow, it's amazing: 217 watts, almost as much as a stock GTX 580...

Comparing the 7950b's noise and temperature with the very bad reference cooler against a very quiet aftermarket cooler on the 660 Ti, very nice apples-to-apples comparison. The 7950b is for the average user; we all know the overclocked 7950 models already have VERY nice coolers, thanks for the refresher.

Well, the reference design works wonders on the 660 Ti because it has a lot better power consumption and temperatures; the 7xxx reference coolers are just plain crap. Good thing there aren't many around, or else the opinion of the 7xxx series would be uber bad.

Overclockers tend to love the Radeons and I'm an overclocker, not an AMD fanboy. I just can't support all the hate when there's no reason for it.

The 580 was referred to as a housefire and worse, and nVidia was attacked for abandoning gamers w/compute. ROFL, abandoning the gamers...

green earth became more important than gaming

losing frame rates was a-okay because you saved power and money

Compare that to now - nVidia is faster, quieter, smoother, and uses less power

amd loses frames, sucks down the juice, and is choppier

The 580 had a HUGE lead at the top of the charts....

So, that's the same how ?

It would be the same if amd hadn't completely failed on frame rates and instead had a giant lead stretching out in disbelief at the top of the charts - then one could say "the power doesn't matter" because you get something for it

It's really simple. So simple, simpletons should be able to understand it. I don't think fanboys will though.

Well, in fairness, AnandTech did test reference-clocked 660 Ti cards, which is a fair approach. They also could have included factory pre-overclocked 660 Ti cards and just commented on the price difference (i.e., up to $339). This was also mentioned in the review.

But what I find the most amusing is that after all the talk around the amazing overclocking capabilities of the GTX 460, NV users want to ignore that the HD 7950 can overclock to 1.1-1.15GHz and match a $500 GTX 680. Can a GTX 660 Ti do that? At the end of the day an overclocked 7950 will beat an overclocked 660 Ti with AA. Overclockers will go for the 7950, and people who want a quiet and efficient card will pick the 660 Ti.

No, the 7950 does not; it takes a 1200-1250 core 7970 to "match up". Even then, it can only match up in raw fps. It still doesn't have: PhysX, adaptive v-sync, automatic OC, target frame rate, TXAA, good 3D, Surround center taskbar by default without driver addons, STABILITY, the smoothest gaming. I could go on. Hey, here's a theory worthy of what we hear here against nVidia, but we'll make it against the real loser, amd. It appears amd has problems with smooth gameplay because they added a strange and unusable extra GB of ram on their card. Their mem controller has to try to manage access to more ram chips and more ram, and winds up stuttering and jittering in game, because even though the extra ram is there it can't make use of it, and winds up glitching trying to manage it. There we go! A great theory worthy of the kind this reviewer so often lofts solely toward nVidia.

Look at all the big words: "PhysX, adaptive v-sync, automatic OC, target frame rate, TXAA, good 3D, Surround center". Stability is my preferred one. I've owned so many video cards and had so few problems with them, Nvidia, ATI or AMD, but still Nvidia fanboys have to make us feel that every time you buy a video card from AMD you have to face the "inevitable" hangups and driver problems, the Hulk is gonna come to your home and destroy everything you OWN!!!! Beware if you buy an AMD video card, you might even catch... "CANCER". Ohhh, cancer, beware...

I had none of that, still have none of that, and ALL my games played very well. Memory is the problem now, not the lack of adaptive crap-sync, physixx and such. You just reminded me why I don't watch TV anymore: the ads always try to make you feel like everything you own should be replaced with the new stuff, but then you change it and you feel almost nothing has been gained.

You forgot to mention the extra heat, noise and watts it takes to do that 1250MHz. Catching NV's cards isn't free in any of these regards, as the 7950 BOOST edition already shows even vs. the Zotac AMP version of the 660 Ti.

The 7950 boost is a reference board; as we said before, 4 out of 18 boards on newegg are reference. It's unlikely people will buy the reference cooler from AMD because it's plain bad if you overclock. Can we put the reference design out of the way when speaking about overclocking?

Noise and temperature when overclocked on, for example, the Twin Frozr 3 from MSI or the Sapphire OC are a LOT better; they're quiet and cool even fully overclocked...

The GTX 580 was already way ahead, so it didn't have to catch up to the 6970. When OC'ed it was so far ahead it was ridiculous. That's why all the amd fans had a hissy fit when the 680 came out; it won on power, frames, features, price, and all they could whine about was that it wasn't "greatly faster" than last gen (finally admitting, of course, that the 580 STOMPED the pedal to the floor, held it there, and blew the doors off everything before it).

See, these are the types of bias that always rear their ugly little hate-filled heads.

The sentence speaks for itself about who's writing it; you're one of them, sorry. If it wasn't so filled with hate maybe I could let it pass, but... no.

It won for the first time on power/performance; let's not act like it has always been that way... and even before, when they didn't win on that front, you speak like they won on everything forever. There's nothing good about AMD, we already know your opinion; you don't have to spread more hatred on the subject, we know it.

Just do yourself a favor: stop disrespecting others over something everyone already knows about you. You hate AMD, end of the line.

You were wrong again, made a completely false comparison, and got called on it. Should I just call you a blabbering idiot instead of a hate-filled, biased AMD fanboy? How about NEITHER, and you keep the facts straight? HEY! That would actually be nice, but you're not nice.

You didn't get the point they made here about the memory controllers and memory quantity. Poor newbie, let me explain things for you. On GCN there are six 64-bit memory buses; divide 3GB of memory by 6 and bingo, 512MB per controller. Now take 2GB of memory and divide it across the three 64-bit buses of the 660 Ti (192-bit total). Oh, you can't do it evenly, which means two of the 64-bit buses each handle a 512MB chip while the third bus handles 2x512MB (1GB).
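The arithmetic above is easy to check. A minimal sketch (the 512MB/512MB/1GB split is the commonly reported 660 Ti layout, assumed here rather than confirmed in this thread):

```python
# Sketch: how VRAM divides across 64-bit memory controllers (illustrative only).

def per_controller(total_mb, controllers):
    """Return the even per-controller share if the split is exact, else None."""
    return total_mb // controllers if total_mb % controllers == 0 else None

# GCN Tahiti: 384-bit bus = six 64-bit controllers, 3 GB (3072 MB) total.
print(per_controller(3072, 6))   # 512 -> fully symmetric, 512 MB each

# GTX 660 Ti: 192-bit bus = three 64-bit controllers, 2 GB (2048 MB) total.
print(per_controller(2048, 3))   # None -> 2048 is not divisible by 3

# So the extra memory hangs off one controller asymmetrically:
split = [512, 512, 1024]
assert sum(split) == 2048        # the layout that makes 2 GB work on 192-bit
```

The even GCN split is what lets every controller address the same amount of memory at full combined bandwidth; the asymmetric split is what the reviewers' "that darn memory" discussion is about.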

Asymmetric memory may not be that bad, and it isn't some weirdo theory either; my 10-year-old daughter would understand it if I explained that simple bit of math to her. Mr. Cogburn, the expert with less logic than a 10-year-old girl... what are you doing here? You're so good, maybe you should join us at overclock.net, discuss with some pro overclockers, and show us your CPU/GPU results on nitrogen if you're so pro about hardware.

I didn't use a single word that disrespected you in any of my previous posts, unlike you calling me dummy, stupid, and so on in some previous post. You used words like nazi, evil, and so on against AMD, but I'm the one who spreads the HATE? LOL, COME ON.

Don't try me; you are the most disrespectful, but still knowledgeable, person I've ever seen in all my time. You expect me to stay totally cold all the time in the face of you diminishing and attacking people, calling them names because they are "fanboys"... COME ON.

I spoke once against your logic, but every post of yours is full of hatred and attacks on people, yet when I say something ONCE, it's BAD. COME ON... Well, I know it wasn't my best one and I offer you my dearest excuses; I truly am sorry. I was at the end of my rope; you're hard to follow, using everything you can to make us feel like AMD is related to cancer, hell, nazis and such, while I have a 4870 and a 6850 CrossFire setup that served me well.

Imagine someone attacking and diminishing the user of a product because he uses said product, and you're an owner of that product. Someone disrespecting YOU in every way because you use that thing; you actually like it, it served you well, you had no problem with it, but still he can't stop and almost tries to make you believe that if you bought it, it's because you are plain stupid. You'd be mad; well, that's how you make most AMD users feel by the way you speak of them.

I'm sorry for being such an ass, comparing your logic to my 10-year-old daughter's, while I know you're more logical than her. But you should really be sorry to any AMD video card owner that reads you, because you really make them believe AMD is the devil and their products are worthless when they're not.

Don't start out with a big lie, and you won't hear from me in the way you don't like to hear or see. You've got yourself convinced, you already did your research, you've said so, you've told yourself a pile of lies. WELL KEEP YOUR LIES TO YOURSELF THEN, in your own head, swirling around, instead of laying them on here and then making up every excuse when you get called on them! Pretty simple, dude.

Here, let me help you; this is you talking: "I've been brainwashed at overclockers and all the amd fanboys there have convinced me to 'get into OC' and told me a dozen lies, half of which I blindly and unthinkingly repeated here as I attacked the guy who knew what he was talking about and proved me wrong, again and again. I hate him, and want him to do what I say, not what he does. I want to remain wrong and immensely biased for all the wrong reasons, because being an amd fanboy is my fun, no matter how many falsehoods I spew that are very easily smashed by someone whom, I claim, doesn't know a thing after they prove me wrong, again and again."

Hey dude, be an amd fanboy, just don't spew out those completely incorrect falsehoods, that's all. Not that hard, is it? LOL. Yeah, it's really, really hard not to; otherwise you'd have a heckuva time having a single point. I get it. I expect it from you people. You've got no other recourse. Honestly it would be better to just say "I like AMD and I'm getting it because I like AMD and I don't care about anything but that."

For one, there are 6 superclocked 660 Ti cards on Newegg today available at $319 or less, DEFAULT. They are fully warranted at those speeds:
http://www.newegg.com/Product/Product.aspx?Item=N8... (1032 core / 1111MHz boost)
http://www.newegg.com/Product/Product.aspx?Item=N8... (1019/1097 for $299)
Can you do that with a 7950? How hot and noisy is yours? I can see what AMP speeds do here at AnandTech. How many watts will yours use doing what you said? Just look at the Boost edition here and scores around the web at 1920x1200 and you realize it's getting whipped.

GTX 680? Let's just get who can go faster totally out of the way at ridiculous overclocks:
http://www.hardocp.com/article/2012/07/30/msi_gefo...
GTX 680 MSI Lightning ($580) review at HardOCP vs. Sapphire 7970 OC ($460) at 1280 GPU / 1860 memory at 1.300v, all @ 2560x1600, min/avg, 680 first, 7970 second:

Max Payne 3: 41/86.2 vs. 42/79.7

Battlefield 3: 29/52.8 vs. 32/50.6

Batman Arkham: 42/68 vs. 29/57

Witcher 2 Enhanced: 22/51.5 vs. 21/50.3

Battlefield 3 multiplayer at 1920x1200 (sorry, multiplayer isn't run higher): 59/78.7 vs. 50/64.2

So based on avg framerate:
Batman: >19% faster for the GTX
Witcher 2: wash, based on min/max either way
Battlefield multiplayer: >22% faster for the GTX
Battlefield 3 singleplayer: wash, I guess, based on min/max
Max Payne 3: >8% faster

Bottom line from the HardOCP conclusion: "The video card also one of the fastest out-of-box operating speeds. It even went head to head with one of our fastest overclocked AMD Radeon HD 7970's and swept the floor with it."
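For what it's worth, the "% faster" figures can be recomputed from the average-fps pairs quoted above (a quick sketch using the numbers exactly as posted, GTX 680 average first, 7970 average second):

```python
# Recompute "% faster on average" from the quoted avg-fps pairs.
results = {
    "Max Payne 3":     (86.2, 79.7),
    "Battlefield 3":   (52.8, 50.6),
    "Batman Arkham":   (68.0, 57.0),
    "Witcher 2":       (51.5, 50.3),
    "BF3 multiplayer": (78.7, 64.2),
}

for game, (gtx, radeon) in results.items():
    pct = (gtx / radeon - 1) * 100
    print(f"{game}: GTX 680 is {pct:.1f}% faster on average")
```

Batman works out to about 19.3% and BF3 multiplayer to about 22.6%, so the ">19%" and ">22%" claims match the quoted averages.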

If you can find a better review of these two GPUs clocked faster, let's see it. I mean any GTX 680 vs. any 7970, where both are ridiculously OC'd like these here. You mentioned 1.15 for the 7970; well, they got it to 1280! And it got the snot knocked out of it anyway. Sorry, Russian. Note they got the memory to 7.44GHz (I'd say a bit of a lucky draw) vs. the GTX 680 memory hitting a wall at 7.05GHz. I'm guessing there will be a few cards that do quite a bit better in GTX 680 land here. It's just the luck of the draw either way, but the Sapphire came with a good memory draw for their particular samples, so consider the Sapphire a great score, and it still got swept. Max scores (check the article) were worse. I tend to think avg is a better rating; you live there mostly. Min/max are rarely hit. Just4u already said it: the 680 wins. It's either a wash or a landslide depending on games, and with cash no obstacle I'd go GTX 680.

More regarding all 600 series:
http://www.hardocp.com/article/2012/05/29/galaxy_g...
"Since the introduction of the GeForce GTX 680 we have seen the launch of the GeForce GTX 690 and GeForce GTX 670 all providing the best performance and value in their class."
Same article, bottom line on $535 cards in SLI:
"These cards are a beast in SLI, providing us the best performance possible at 5760x1200. There is no question these also beat Radeon HD 7970 CrossFireX to the punch at every turn."

Smack... As an AMD fanboy I hold little hope for AMD. They are fighting with billions in debt (junk-bond-status credit; think of them as Spain/Greece, hard to borrow and it gets worse and worse), little to invest, and lots of interest on the debt, vs. NV with $3 billion in the bank in CASH, no DEBT, and no interest on no debt. I believe the age of AMD catching Intel or Nvidia is over. Bummer. Our only hope is NV buying AMD once they plummet enough (after losing billions more, no doubt), and getting back some CPU competition, as NV could invest in CPU design to get this game going again. I could see IBM or Samsung pulling this off too (maybe even better, as I think both have far more cash, and both have fabs). IBM/Samsung could really put the hurt on Intel with an AMD buyout; it would be a fair fight on CPUs for sure. NV may be able to pull off both GPU and CPU as they have no fabs to keep up (which can kill you quickly if you screw up). Interesting thoughts about all that roll around in my head... LOL. For now though, NV is on a path to have $10 billion in the bank by Christmas 2014, I'd say, or by 2015 at the latest. Like NV or not, their CEO is smarter with money and never loses it unless he hurts someone else doing it. He never prices their products at a loss like AMD. He makes smarter moves and thinks further ahead with a bigger picture in mind.

Motleyfool.com thinks they're the next $100 billion company :) The Gardner brothers are NOT STUPID. I will be piling my money into this stock until it goes over $20. They're getting close to returning to the profits of old, when the stock was $35 in 2007, with no dilution since then and another $1.5 billion of buybacks scheduled last I checked. Same cash as 2007, and a much stronger company with the acquisitions made since. AMD is going the other way and investors are scared sh1tless :( Bankruptcy or bought by Christmas 2014. You heard it here :) Unfortunately. I cringe as I say it, but at this point it may help our future cheap CPU prices to just get this over with and get them bought by someone who can help AMD before there's nothing left, while their CPUs get even worse and their GPUs are starting to show desperation too. The 7950 Boost is just that: a company with money in the bank would have developed a lower-wattage, cooler, less noisy version for less cost, rather than pushing all three up to try to spoil a launch. OUCH. As much as Ryan etc. try to help them, there's no getting around the facts (despite page titles like "that darn memory"... LOL... yet better performance anyway; why even title pages like that?). Despite attempting to make this a 2560 discussion when only 2% of the world uses that resolution, according to the steampowered.com hardware survey. Even then, if you look at updated games you could argue it's still a no-brainer. Toss out Warhead (from 2008) and replace it with Crysis 2 and you get a 660 victory instead of a loss. Hmmm...

2GB a hindrance? Does 4GB do anything?
"The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference."
http://www.guru3d.com/article/palit-geforce-gtx-68...
Funny, I thought the 4GB version sucked after reading their review, but whatever :) I'd rather have a FASTER 2GB card than a same-speed 4GB card costing a lot more. I'd call the measurable difference the MONEY for nothing.

Including the useless Warhead isn't enough? Screaming the entire time about it having bad bandwidth wasn't enough? Skyrim is in there too... just forgot to mention it.

Again, 680 SLI vs. 7970 SLI: 78fps to 62fps (avg to avg).

Heck, even the single card beat it with 72fps: "GALAXY GTX 680 GC SLI was 26% faster than AMD Radeon HD 7970 CFX at 8X MSAA + FXAA."

Skyrim not good enough either? So what game would you like me to point to? I'm sorry it's difficult to point to a winner for AMD. :)

So let's see, it's biased in your mind on Skyrim, Batman AC, Witcher 2, Battlefield 3, Battlefield 3 Multiplayer, Portal 2, Max Payne 3... That rules out HardOCP, I guess. Anand added a few more: Shogun 2 (another landslide for the 660 Ti, even against the 7970); Dirt 3 used here at Anand, a wash (though minimums do show Nvidia ahead, as Ryan points out); Civ5, a landslide again at 1920x1200 here at AnandTech; Metro 2033 here at AnandTech, a <5% win for Nvidia at 1920x1200 (I call it a wash, I guess)...

So which game can I point to that will be OK with you? I'm running out of games in which to find a victory for AMD, so just tell me what you want to see... It's kind of hard, as you can see, to give you the viewpoint you want, which is apparently "nothing will make me happy until AMD wins"... Am I wrong, or at what point do you accept reality?

Yet another result from before the big driver improvements. Poor fanboys, they lack information and they're totally uninformed about driver enhancements. A while ago AMD said they were changing their tactics for driver development; that was like 3-4 months ago, I think. Since then, we've seen really big improvements from the drivers.

The latest Catalyst 12.8 brings:
• Up to 25% in Elder Scrolls: Skyrim
• Up to 3% in Battlefield 3
• Up to 6% in Batman: Arkham City
• Up to 3% in Deus Ex: Human Revolution
• Up to 6% in Crysis 2
• Up to 15% in Total War: Shogun
• Up to 8% in Crysis Warhead
• Up to 5% in Just Cause 2
• Up to 10% in Dirt 3

All in one driver released in August, so any review prior to the GTX 660 Ti is then flawed. And there's probably much more to come, considering Nvidia fanboys have been whining about AMD's drivers for years. Their team is now focused on the right way to improve drivers; how much more will they be able to improve them? They were SO bad at making drivers that anyone buying an AMD card couldn't even play the slightest game without it crashing, overheating the card and making you cry to your mother to buy an Nvidia card... Imagine!!

A lot of reviewers have very recently commented on how crappy AMD's drivers are, and this just-past release came out with NO DRIVERS for AMD...

Then five top review sites had half the games crash on AMD, and had to toss some out of the reviews.

Do you have AMNESIA? Are you sick? A little under the weather? Or just a fanboy liar who plays only one game, Skyrim (ROFL), except for the other games you said you play on the prior page... and you're just an overclocker...

And then again with the attacks and hatred on my choices, attacking my personal life once again.

''do you have AMNESIA ? are you sick ?''
''your not a very smart Ocer.''

I did not buy them for OCing; I got them because I had an opportunity and I paid, like, dirt cheap for them. But still you attack me, disrespecting me without even knowing the reasons why I did it. And again with the crashing: I updated to the 12.8 driver on my 6850 and it did improve my Skyrim performance, and none of my other games crashed. Sorry for those reviewers.

Ok, so in other words, you're the noob that knows very little to nothing. BTW, knowing you're an amd fanboy means we all KNOW you scrinched and scrunched (at least in your own stupid head) every little tiny "penny" in your purchase of the AMD video card... LOL, THAT'S A GIVEN, DUMMY. Ok, so, in light of that STUPIDITY, you have that same crappy set in WATER COOLING, DUAL WC... And... "I don't even know why you got them." ROFL, dude, either you're lying an awful lot, or you actually needed my help back then, desperately. So you waggled up your big water-overclock OC manliness, and now we find out... LOL. This is not happening! (X-Files quote)

There are 8 OC'd 660 Tis on Newegg right now and only 3 released at stock. By chance alone the reviewers will be reviewing an OC'd 660 Ti. As was pointed out in the article you did not read, there is NO "standard design" pushed by nVidia, so the partners have free rein to come out of the gate with OVERCLOCKS a-rocking. They have done so. So now, OC is the standard and overwhelming production with the 660 Ti. Get used to it. Unfortunately AMD has been a severely restrictive control freak nazi master dom smacking down and hurting their partners and has not allowed freedom. Then, in the usual control-monster hold-the-gamers-back fashion, they finally okayed their GE crap to their broken and hurting partners so they could charge a lot more. Evil, greedy, tyranny control, amd.

4 out of 18 cards for AMD have reference coolers on Newegg, so nothing different on the other side... 11 out of 18 are overclocked, and the other 3 non-reference-cooler cards that are not overclocked are begging to be boosted.

So if OC is the standard, why not try to push it further? Factory overclocking versus aftermarket overclocking isn't much different if the video cards take it so easily. :)

''severely restrictive control freak nazi master dom smacking down and hurting'': you even had to use the word Nazi... come on, be less of a fanboy; it's just childish, that was ridiculous. I'm not diminishing either company. If you really want AMD to die, we'll all cry when no competition is left alive. We'll be back to the days of the GeForce 2 GTS at $800. Praise the war and be a little more respectful, please.

I don't want AMD to die, but I wouldn't mind seeing them bought, as they're already well on their way to death without our help. I can't justify buying CPUs that completely suck now (granted, our crappy court system took forever to make Intel pay AMD for Intel's crap), and I won't do it just to help them out.

If the courts had seen fit to pay them what they truly got screwed out of (I'm reminded of buying white-box ASUS boards because ASUS was afraid to even put their name on the box!) when they were on top for at least 3 years, we wouldn't be having this discussion. They should have been given 15-20 billion from the ill-gotten 60+ billion earned from that time forward (and I'm sure market share would have gone up with more money to produce more stuff, keeping the fabs etc.). It's not my wallet's job to help now. They need to claim bankruptcy and get bought. Management has blown their ability to compete due to the financial burdens now facing them.

http://investing.money.msn.com/investments/financi...

Take a look at the last 10 years: overall a loss of 6 billion or so. The previous 10 look no better. In fact I think if you go back to inception, they haven't made a buck overall. That's not good. Shares outstanding in those last 10 years? DOUBLED. DILUTED! 344 million to 698 million. You can't keep selling shares to cover old debts forever; eventually nobody will loan you money, and you can't operate at that point.

The stock has been cut in half in the last 5 months. Intel will continue to crush them, as AMD can't invest $4 billion like Intel is doing now over the next few years to stay ahead. You just can't win without R&D.

OC is the standard, but you're talking about doing it on your own, vs. sanctioned by the manufacturer and coming as default like that on almost every 660 Ti out there (which is what he pointed out); then you go off about fans? He's talking about SPEEDS already OC'd on the cards by default, regardless of the fan on them. The makers of the cards (MSI, XFX, Gigabyte etc.) are SELLING THEM OC'd. You don't have to do anything but buy one and stick it in. It's already overclocked, and it overclocks itself to the highest clock it can without damage (that's built into the 600 series; NV is a step ahead here). That changes on a per-GPU basis too... very nice.

Attack the man's data (if you can), not the nazi crap. Comparing the actions of one company to the actions of a well-known person or group (while I'd have chosen something other than nazis) when said company is acting like them is valid. It's not disrespectful. His point wasn't that they are killing Jews by the millions (or anyone else). His point was that the card makers are a bit peeved. I.e., only 2 have announced 7950 Boost editions that I'm aware of. First, because they are already selling 900MHz+ versions that AMD doesn't want to see in the market (which is why I said Ryan should have benched one of his cards at this speed; what fool would buy REF or Boost @ 850 when you can get a free 50-100MHz overclock from the video card makers already?), and second, they don't care what AMD wants after being shackled and wanting free rein, like you see on 660 Tis... all kinds of speeds from launch, with rarely a REF-clocked card in sight! Do you get it now? It's not about the fan, it's about the speed the maker is willing to BACK out of the box by default and still warranty, without complaining about what you did to void their warranty. Cerise isn't putting AMD out of business; AMD is.

I was speaking to CeriseCogburn; he sees the company as the prime evil. Just read any of his posts and you will see the hate, the knives in his eyes. If I were AMD I'd have him arrested by the police; he's a madman and almost goes so far as to throw menaces at the company, using terms like greed, nazis and so much more that I'll leave it up to you to read him from page 1 to 10 on this forum. If I were an AMD employee reading this, I'd be like, ''WTF, I'm just a human being working my best to feed my family, I'm not working for the devil...''

...used just in the text above mine. How are those words any use when defending an opinion? That's repression, lack of respect and total madness... We're not speaking of an army that tries to take control of the world by domination, coming into your home and killing your children, FFS.

I don't know if you realise it, but the way you display your arguments, accuse so easily, and attack and confront even the owners of the cards, the people using them right now, telling them everything crashes and is unplayable while they are using the cards with no problem, discredits any credibility you have.

You alone are making it worse. Continue, I have no problem with that; the more you add things like your last post, the worse it gets. I won't. I'm sorry for the way things are turning, but I'm not going down to your level because I'm just a simple enthusiast.

And yet posting only to say ''Keep lying and crying crybaby'' is just more proof that SOMETIMES you're really just answering so you can have the last word. That was the most useless post I've ever seen...

Oh, you're such a fool: it's nine months AFTER amd released their crap-clocked cards with the LOCK on their core speed. Dude, get a freaking clue. "oh dey released it juzz azz much wit oc,....i'm so stpooopid and such a liar.." How about manning up: "I was wrong in front of my daughter every time, and I'm mad about that. She's going to be just as foolish as me (I hope)." No, never mind, you KNOW IT.

Odd, my laptop has a GTX 560M, which is pretty much a power-optimized GTS 450, and I'm able to play Crysis in DX10 on high at 720p without AA. It runs between 23-33fps, which might not seem great but is enough for casual gaming. I wonder how an 8800GT couldn't run that game at least on medium at the same resolution. Regarding other games like Crysis 2 and Batman AC, they only run on DX9 or DX11; Metro 2033 is another story lol.

You just said it yourself: 720p, no AA, 23-33 fps, in a forum discussing the GTX 660 Ti, surrounded by people playing mostly at 1080p and above... For me, anything below 40 fps is not really playable, and it's still A LOT better when my fps is pegged at 60 with vsync.

Batman AC, Anno 2070, Hard Reset, and Skyrim are what I've been playing just fine, along with many other modern games. I don't think these games even have a DX10 option, so of course they'll use DX9.

I won't argue that newer cards are 5 times faster... but at 1680x1050 I might argue about how much of that he'll notice. :) My nephew only complained about Skyrim at this res on a less potent card. I'd also note that ionis would have to spend quite a bit to get 5x the card he already has. My own card was only bought because it was 2x faster than my old card (a duplicate of my dad's, which is a faster-clocked 8800GT 512MB; huge MSI copper pipes made it near silent, and the PSU fan was louder). He'd have to spend a few hundred at least. I'm currently waiting for another double of my 5850 at $300 (which I think just got released here :)). But I can wait a few more months for a great deal.

So I'm guessing about $300 to beat his old 8800GT by 5x. His GPU may be limited by the CPU at that res quite a bit, no matter what he buys at $300. Most people running an old 8800GT, I'd guess, are running older CPUs too. So seeing that 5x may require a new CPU in a lot of games (and I'm assuming that's his monitor's NATIVE res). But it certainly would allow him to set EVERYTHING GPU-wise at max at 1680x1050; of that I wholeheartedly agree. If he witnesses a slowdown at that point, it's most likely his CPU. :)

Nvidia/AMD have really kind of run out of excuses for us to buy new cards right now. Unless you have a 27in (in rare cases a 24in at 2560) or above, or multi-monitor, it's hard to argue for dual cards, or a great card at 2560x1600+. Newegg's 24s are all 1920x1200 (20 models) or 1920x1080 (48 models) native resolution, out of 68 total. :) Those are the RECOMMENDED resolutions for these 68 24in models at Newegg, Ryan.

Again, I wonder why Ryan couldn't make a recommendation with just a quick look at the resolutions of Newegg's 68 24in monitors, showing NONE with a native 2560x1600. Besides the fact that you have to jack all sorts of things around at that res on a 24in in the OS or everything is tiny; 2560x1600 is ideally for 27in+. Raise your hand if you have a 27 or 30in... LOL. The recommendation is EASY at 1920x1200 (the highest native res of ANY 24in on Newegg, RYAN!). Even the $289 Dell UltraSharp U2412M is only 1920x1200, and that's a quite expensive 24in (albeit gorgeous). $400 24s on there are still 1920x1080 or 1920x1200. Still can't figure out what to recommend, Ryan? I don't get it. I'm all for giving AMD help if I can, but get real. The 660 Ti appears to dominate almost all games at these resolutions.

Thanks for the review. I'm looking for a GPU for 1920x1080 to play Skyrim and upcoming mods. I'm looking at the GTX 660 and an HD 7870; both cards have 2GB of memory, which I think should be enough. My question is: which would you recommend? The GTX 660 looks good, but the slower memory bandwidth seems to hinder it in certain games that make use of high memory availability (I'm guessing games like Skyrim?).

I think you should be comparing a 660 Ti to an HD 7950. The 7870 can be had for $250 on Newegg. If you plan on overclocking, the 7950 is the better card for Skyrim, especially with mods and high AA. While not tested here, once you add mods and crank AA, the 7900 series is much faster than the GTX 600 series in SKYRIM:

The 7950 at 800MHz leads the GTX 660 Ti by 24% at 1080p with 8x AA with mods in Skyrim:

If you're going to be playing with memory-intensive mods then our Skyrim results at 2560 are going to be the most relevant. The 7870 would be appreciably faster here, but note that this is basically the only case in our entire benchmark suite where that happens (even Crysis has the two virtually tied). I suspect you'd be happy with either card, but if you intend to keep the card for a while and to play games other than Skyrim, I'd be hesitant to recommend the 7870.

Not quite sure I understand... the 660 Ti is dominant at 1920x1200 in Skyrim. I already showed from the hardware survey at steampowered.com that <2% use your resolution of 2560x1600. I already pointed out that the 68 24in monitors at Newegg (that's all of them, by the way) don't recommend above 1920x1200/1080 as the recommended (native) res; 41 of the 27in monitors run the same; only 11 27in monitors on Newegg recommend a resolution even near your 2560x1600 (it's 1440 on those 11); the other 41 27in models are also 1920x1080...

Who do you think runs at this 2560x1600 res? It's a res used by a small fraction of the 2% I've just mentioned. You have to buy a 30in before this is useful info. Is this guy running a 30in? Crysis, tied? You're talking Warhead, from an old engine from 2008, or what? See my other posts; you need to rewrite your conclusion and stop acting like more than 2% of the world uses 2560x1600. I've already proven this wrong many times in the comments on this article!

Got any evidence Skyrim is better for this guy? His link, where nothing is explained? I believe he's testing at REF speeds for the 660 Ti, since nothing is mentioned about what he runs the tests at (after translating the whole thing through Google). He has multiple cards but only one 660 Ti in the test once he gets to the Alan Wake page and the gaming tests. Again though, at 1920x1080 the card is great. But I still have no idea of the clock speed of the ONE card in the 3-card charts. Only ArmA 2 and Metro 2033 seem to be victories for the 7950 (again, at what speed?). I can't find evidence it's not all based on reference clocks. But if you go by your benchmarks, HardOCP, Guru3D, etc., it's clear who wins at the res 98% of us use. Or can you point me to 2560x1600 data showing otherwise? 2560x1600 isn't used. I pointed out all the monitors at Newegg, the Steampowered survey... where is the evidence it's used?

The 570 is included in the benchmarks, which the 660 Ti just slaps down. Why no 580?

I can pick up a used EVGA 580 for $250, so I am curious. I like the sound of the Gigabyte 660 Ti, but it's $319... is it worth $70 more than a 580? Yes, I know power consumption will be lower and it will be less noisy.

When you look at the power and temperature bars at the bottom, it's plain amazing! They surely hit the sweet spot this time. The 660 Ti performs a little better, yet with such a difference in power usage; there's not much to say about it except that it's plain extraordinary.

The 660 Ti is not bad, but the 7870 custom-cooler version is a very tough competitor. I also would like to see a factory-overclocked version of the AMD card in the same test, but all in all it seems to be a close call. Nvidia definitely needs a card in the $200-300 range. But it seems we'll have to wait until the 700 series for that?

I don't get why Nvidia keeps ignoring the $200 market. With economies in Europe and the US going down, I doubt that $300 cards will be important for the mainstream market. And even there, the 7950 seems to be the better choice.

They are ignoring that market because they can't make money in it right now. They are facing a shortage of 28nm wafers, since TSMC can't produce enough chips. They usually make the most money with $100 to $250 cards, selling high-volume cards at low to medium margins.

But due to the shortage of 28nm wafers, they have decided to target only markets that are low volume and high selling price. Because of this, the majority of their 28nm wafers are going to the notebook chips they are producing (the 620M to 660M all have 28nm versions). The leftover spare chips go into the GTX 680, GTX 670, and now GTX 660 Ti, which they make a lot of profit on.

Now, there is a GT 640 on the market right now, but it uses the same die as the 640M LE through 660M, so any chips that can't make laptop grade by fitting the 25W to 50W TDP are reused in a desktop chip that can go up to 75W TDP, sold at a final street cost of $100. $100 for a 118mm^2 GPU is outrageous, considering the 6670 has the same die size but is made on 40nm, outperforms it, uses less energy, and is cheaper.

You won't see a GT 640 with GDDR5, a GT 650, or a GTX 660 (non-Ti) until Nvidia gets more 28nm wafers. (You also won't see Nvidia making 28nm Tegras until they get more wafers, even if the design for Tegra 4/Wayne were finished right now.) Right now Nvidia is a victim of its own success: it is selling every 28nm product it can make, so since it can't make any more 28nm products due to its suppliers and is facing a shortage, it might as well maximize its profits. $300 to $500 cards maximize profits; laptop GPUs maximize profits.

Not ignoring it. They can't keep up with demand at $300. Your card is coming, but not until they can get more chips (more failed $300 ones?) so they can create a $200 card.

Sorry, I've already debunked your claim that the 7950 is the better choice. Ignore 2560x1600 and it's not even close. That being said, if you use a 30in monitor, maybe you can argue 2560x1600, but it's a wash at that level as far as I can see. The GTX 660 Ti wins many times even in AnandTech's own numbers, etc.

I do almost all my gaming on my Sony HMZ-T1 at its standard 720p resolution, so it would be great to see what FPS you get at slightly lower resolutions, especially since this is a card aimed at the lower end of the market.

It would be great if you could start doing some 3D FPS benchmarks as well, because there is a difference in performance again when you add 3D rendering at any particular resolution.

This is a castrated, effectively 128-bit, 24-ROP, hugely overpriced card. Best Buy has had it for several days already; it is in the same green box as the 560 Ti, and apparently nobody noticed it was taken to the sales floor :). So, some people :):) already bought it, tried it, and can confirm it is not worth the money asked. In my sole opinion it can justify $199 max at launch. It is disappointing. So, for whoever has the money and feels OK with it, the GTX 670 for $400 is the way to go; otherwise pick up a Radeon 7950 or 7870. I personally will choose the AMDs because of compute; NVidia's current-generation computing just plain sucks. But if you only play one game, BF3 :):):), then maybe several hundred dollars is OK to spend on the GTXs.

Different reviews have different setups. Tom's seems to be the exception as regards the 7870 being superior; in general, the 660 Ti comes closer to the 7950. In some titles, it's shockingly fast.

If there were to be a standard 660, all NVIDIA could do here is cut down the number of shaders and texture units - clocks won't do it, as you'd just clock them back up again, and the memory is already nerfed.

"Closer to the 7950"?? Careful, that sounds like ignoring the evidence. It BEATS it, and usually the 7970 (even the GHz Edition at times) too... see all my other posts. Pointless to even respond here? I've already written that every game at HardOCP and Anandtech shows victories: Skyrim, Batman AC, Witcher 2, Battlefield 3, Battlefield 3 multiplayer, Portal 2, Max Payne 3... That rules out HardOCP, I guess. Anand added a few more: Shogun 2 (another landslide for the 660 Ti, even against the 7970); Dirt 3, used here at Anand, a wash (though the minimums do show Nvidia ahead, as Ryan points out); Civ 5, a landslide again at 1920x1200 here at Anandtech; Metro 2033 here at Anandtech, a <5% win for Nvidia at 1920x1200 (I call it a wash, I guess)...

So which game can I point to that will be OK to you? I'll try to help you let AMD win...:)

Understand that Tomshardware turned all cards to default... So you buy a card and they downclock them all to test... ROFL. "We dropped each card's clock rates to reference levels," from page 2 of their article: http://www.tomshardware.com/reviews/geforce-gtx-66... That same page 2, at the bottom of the graphics card list: "All overclocked cards reduced to reference specification for testing." So every card will perform UNLIKE what you would buy, on either side. Their review is worthless, as they are nerfing even the ATI cards, though it hurts NV more. I'm not sure why they even ran the benchmarks... They should have just said to look elsewhere for real answers on how these cards will perform out of the box. Nobody else did this, which is why their results are ODD.

That's great, then Tom's Hardware can put 100% (or nearly so) AMD cards in their bang-for-buck monthly again, perpetuating the big lie, jiggering the price categories up or down depending on what makes AMD fanboys gleeful. It's so ridiculous: they get best card for $115, best card for $155, then next month best card for $90, best card for $135, etc., and then they squish the crap AMD card in just under the number, and their attached price link shows it $50 higher on the day of their post.

You thought this place was bad ? LOL

Then the rabid AMD fans at Tom's put a minus 20 on every comment that doesn't kiss up to AMD, quite often. They're goners.

They do have more than one reviewer, so sometimes you'll get something sane, but not very often. It's been degrading for a long time; it's really sad.

It has a lot to do with what settings are being used in-game. The Tom's article admits at the end that their setup could be AMD-favored, since they tend to prefer high levels of AF and AA, which eat up memory bandwidth and heavily tax the memory subsystem (a strong point for AMD).

That's because Tom didn't use Portal 2 in the benches, and Nvidia is so gooood at it! Plus, instead of Dirt 3 he used Dirt Showdown, and AMD is soooo good at it. So if you don't play Battlefield 3, Dirt 3, and Portal 2, there's a good chance that the 7870 might be better for you, considering it will perform equally/very close to the higher-priced GTX 660.

But again, if I were a heavy Battlefield 3/Portal 2 player, the choice is obvious...

Correcting myself: the higher-priced GTX 660 Ti. But you've gotta remember, at the same time there's a limited quantity of Borderlands 2 copies they give away if you buy an Nvidia video card, which should be a testament that their card performs well in this game, and it's worth $60, so you save that if you ever planned to buy it anyway...

Said the fella at the site that has been milking Crysis 1 for AMD fanboys for how long now, even as Crysis 2 has been out for almost a year... Yeah, sure, it's all Nvidia here (gag) (rolls eyes) (sees the way every review is worded)

Obvious for everyone else too. Quit looking at Ryan's comments and 2560x1600, where 98% of us don't run. He bases most of his junk comments and conclusion on 2560x1600... WHAT FOR? There are 68 24in monitors on Newegg... NOT ONE over 1920x1200. Of the 52 27in monitors on Newegg, 41 are 1920x1200 or less, and only 11 are 2560x1440 (NOT 1600). His recommendations and crap comments are about a res you can't even get until you use multi-monitor or a 30-incher!

I already proved all the games run better at 1920x1200. See my other post to you... It's a lot more than Battlefield, Portal 2, and Dirt 3: Shogun 2, Skyrim, Batman AC, Witcher 2, Battlefield 3 multiplayer, Max Payne 3, Civ 5 (a landslide again at 1920x1200 here at Anandtech). How many more do you need? Don't point me to wins at 2560x1600 either... :) Unless we're all getting 30-inchers for free soon (Obama handouts?), it doesn't matter.

What games are OK to test without you calling them biased towards NV?

I thank you for telling the truth and putting up with the AMD fanboys who can't find the truth through the swirling retarded AMD-fanboy red-eye goggles sandblasted to their skulls. I really appreciate it, as I don't feel like using the truth to refute the loons, taking so many hours to do so, and having to rinse and repeat over and over again, since nothing sinks into their insane skulls and manning up is something they can never do. I do have hope, though, since a few obviously kept responding to you, and hence are likely reading some (ADD and dyslexia may still be a problem by the looks of it, though), so maybe in 20 years, with a lot of global warming and hence more blood flow to their brains as they run around screaming the end is nigh, the facts presented everywhere over and over again will perhaps start, just a tiny bit, to sink into the mush. LOL. I have to say, you definitely deserve a medal for putting up with them and for doing that large a favor pointing out the facts. I appreciate it, as the AMD lies are really not friendly to us gamers, and especially to AMD fanboys who always claim they pinch every penny (which is really sad, too).

It wins at 1920x1080 often because the games are CPU-limited and Nvidia has an advantage in using fewer CPU resources. It means something else too: if it's CPU-limited, the graphics don't push the system enough, and at the same time that means when graphically intensive new games come out, the CPU will be less in the way. What's bad about buying a video card that already maxes everything at 1080p and will keep doing so for you in the future, because these games just aren't pushing it enough?

I remember when the GTX 580 came out: it ran everything at 1920x1080, while the GTX 570 and Radeon 6970 were already doing this too. Still, people bought the GTX 580, and now that games are more taxing it's useful at 1080p. But it's obvious the GTX 660 Ti is superior in many ways and many games. What I want you two (Cerise and Jian) to understand - well, I should just say Jian, since I understood a long time ago that Cerise has a closed mind on the subject - is that AMD has strengths too. It loses overall at 1080p with stock-clocked cards, but someone can be happy with a card like that anyway... All along while I've been discussing this, I never said Nvidia was bad, and I never dismissed their older-gen cards as anything but amazing parts either, while you just continued trying to make people believe that you'll see an AMAZING difference, HUMONGOUS GAINS, by buying Nvidia, and that AMD is cancer (or at least it looks like that in your eyes).

It's quite hard for anyone right now who's running a 7950 like I now do, and my friend does, and like my 6850 crossfire did, and my 4870 did, and like my 8800 GT and the GTX 460s I bought while building computers for many of my friends, to understand whatever rabble you might say about such a difference when 90% of their gaming is pegged at 60fps from high to ultra details. All these graphs, reviews, and everything else are not reflecting what the average user feels when they play their game.

I debunked all this already (see my other posts). Besides, they ran all cards at reference speeds... LOL. Bandwidth is NOT an issue where 98% of us run: 1920x1200 or below, even on 27in monitors (only 11 27-inchers at Newegg go above 1920x1200, and at 2560x1440 that's still less than the 2560x1600 tested here). Ryan misleads over and over in this review, as if we all run 30in monitors. WE DON'T. At 1920x1200 you won't have memory problems from either side, not even with MSAA/AF/FXAA etc. ALL of the 24in monitors at Newegg are 1920x1200/1080, NOT ONE at 2560x anything. Only 11 27-inchers are 2560x1440; all 41 others are 1920x1080 (even less taxing than 1920x1200!). Ryan is just trying to stop people from buying Nvidia, I guess. I'm not sure why he thinks 2560x1600 is important, as I've already shown <2% use it in Steam's hardware survey, and you basically have to have a special 27 or 30in to run above 1920x1200 native. Raise your hand if you are in that 2% user group. I don't see many hands... LOL. Also note that of that 2%, most are running multi-monitor and usually multi-card setups. But Ryan can't make a recommendation... LOL. He could if he would quit pretending we all use 2560x1600... ROFLMAO. I admit, at that res you MAY run into memory issues (for the 2% that run it).

Is anyone out there trying to mod this thing into a 670 yet, given that it's all the identical parts with one of the four ROP/memory buses disabled? I'd imagine that among these chips, even if binned as failed 670s, a few would most likely have all four ROP/memory buses functional.

This would be a pretty sweet upgrade path if so :) It would be the Radeon 6950 all over again (and all the previous generations that could either do softmods or, if anyone remembers, the pencil-graphite trick back in the day).

Thanks for the review; I've been waiting for this one... even though I'm pretty disappointed. I've seen the 7970 on sale for $360 lately, and right now it's looking like it's going to be the best bang for your buck. That's cheaper than a 670/680, only slightly more than a 660 Ti, and it pretty much holds the single-GPU performance crown for the most part, though AMD's drivers lately are scaring me.

Please point me to a 7970 for $360. The cheapest on newegg even after rebate is $410.

Nice try, though. "I'm pretty disappointed"? Why? You got a 30in monitor or something? At 1920x1200 this card beats the 7970 GHz Edition in a lot of games. :) Skyrim being one, and by 10fps right here in this article... LOL.

Modding to a 670 isn't worth it when the shipped cards already beat it (three of them did here). Remember, you should be looking at 1920x1200 and ignoring Ryan's BS resolution that only 2% or less use (it's a decimal point in Steam's hardware survey). If you're not running at 2560x1600, read the article again while ignoring Ryan's comments. It's the best card at 1920x1200, regardless of Ryan's stupid page titles like "that darned memory"... ROFL. Why? It still tromps everything at 1920x1200... LOL.

Got anything to say, Ryan? Any proof we'll use 2560x1600 in the real world? Can you point to anything that says >2% use it? Can you point to a monitor using it that isn't a 27/30in? Raise your hand if you have a 30in... LOL.

And the prices just dropped, so yeah, I should be off by ~$20 by now :) White box, as stated. No game. Well, Dirt Showdown doesn't count, it's rated so low ;)

But there's nothing that states my analysis is incorrect. His recommendations were made based on 2560x1600, even though, as proven, 98% play at 1920x1200 or less, and the monitor he pointed me to isn't even sold in the USA. You have to buy it from Korea, from a site with a blank FAQ page, a blank help page, no phone, and a Gmail account for support. No returns. Are you going to buy one from out of the country from a site like that? Nothing I said wasn't true.

I wonder if any board partners will try making the board symmetrical again by pushing it up to 3GB? It's not like the extra RAM would do any good, but if you could keep an already memory-bandwidth-starved card humming along at 144GB/s and prevent it from dropping all the way down to 48GB/s, it might help.

It doesn't drop to 48GB/s; that was just the reviewer's little attack. You should have noticed the reviewer can't find anything wrong, including any sudden loss of bandwidth, in this card or in the prior released Nvidia models with a similarly weighted setup. The SPECULATION is what the AMD fanboys get into; then for a year, or two, or more, they will keep talking about it with zero evidence, and talk about the future date when it might matter... or when they might "discover" the issue they have desperately been hunting for. In the meantime, they'll cover up AMD's actual flaws. It's like the hallowed and holy-of-holies AMD perfect-circle algorithm. After years of candy love for it, it was admitted it had major flaws in-game, with disturbing border lines at shader transitions. After all the endless praise for the perfect-circle algorithm, when push came to shove, and only in obscurity, we were told that no in-game advantage for it could be found, never mind the endless hours and tests spent searching for that desperately needed big AMD fanboy win... So that's how it goes here. A huge Nvidia advantage is either forgotten about and not mentioned, or actually derided and put down with misinformation and lies, until some AMD release, when it finally becomes time to admit that AMD has had a huge fault in the exact area that was praised, and Nvidia has a huge advantage and no fault even though it was criticized, and now it's okay because AMD has fixed the problem in the new release... (Then you find out the new release didn't really fix the problem, and a new set of spins and half-truths starts after a single mention of what was wrong.) Happened on AA issues here as well. Same thing.

Most games are made to target specific amounts of memory, and often you won't hit the bottlenecks unless you run at higher detail settings. 1920x1200 even with 4xAA isn't likely to hit such limits, which is why the 2560x1600 numbers can tell us a bit more.

Best case for accessing the full 2GB, NVIDIA would interleave the memory over the three 64-bit connections in a 1:1:2 ratio. That means in aggregate you would typically get 3/4 of the maximum bandwidth once you pass 1.5GB of usage. This would explain why the drop isn't as severe at the final 512MB, but however you want to look at it there is technically a portion of RAM that can only be accessed at 1/3 the speed of the rest of the RAM.

The better question to ask is: are we not seeing any major differences because NVIDIA masks this, or because the added bandwidth isn't needed by the current crop of games? Probably both are true to varying degrees.
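The 1:1:2 interleave described above can be turned into a toy model. This is only a back-of-the-envelope sketch based on the numbers discussed in the thread (144GB/s total across three 64-bit controllers, 2GB with the last 512MB on a single controller); it is not a description of NVIDIA's actual controller behavior:

```python
# Toy model of the GTX 660 Ti's asymmetric memory layout:
# three 64-bit controllers, 2 GB total, interleaved 1:1:2.
# Illustrative numbers only, not NVIDIA's disclosed behavior.

TOTAL_BW = 144.0                    # GB/s across the full 192-bit bus
PER_CONTROLLER_BW = TOTAL_BW / 3    # 48 GB/s for one 64-bit controller

def peak_bandwidth(used_gb):
    """Peak bandwidth for an access that has to touch the top
    of a working set of size `used_gb` (in GB)."""
    if used_gb <= 1.5:
        # the first 1.5 GB is striped across all three controllers
        return TOTAL_BW
    # the final 0.5 GB lives on a single 64-bit controller
    return PER_CONTROLLER_BW

print(peak_bandwidth(1.0))   # 144.0
print(peak_bandwidth(2.0))   # 48.0

# If accesses were spread uniformly over the whole 2 GB, the
# capacity-weighted average would be:
avg = (1.5 * TOTAL_BW + 0.5 * PER_CONTROLLER_BW) / 2.0
print(avg)                   # 120.0
```

This also illustrates why the slowdown only shows up once a game's working set climbs past 1.5GB: below that, every access sees the full 144GB/s.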

Memory-starved at what? NEVER at 1920x1200 or less. Are you running a 30in monitor? All 24in monitors are 1920x1200 or below on Newegg (68 of them!). 80% of the 27-inchers are also this way on Newegg.com. 3GB has been proven useless (well, 4GB was): http://www.guru3d.com/article/palit-geforce-gtx-68...

"The 4GB -- Realistically there was NOT ONE game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference."

"But 2GB really covers 98% of the games in the highest resolutions." Game over, even at 2560x1600, for 4GB or 3GB. Ryan is misleading you... sorry. Though he's talking mostly about bandwidth, the point is that 98% of us (all 24in and down, most 27in) are running at 1920x1200 or BELOW.

Was wondering about how the Zotac was altered to stand in as a reference 660 Ti.

Were the clock speeds and voltages lowered through one of the overclocking programs, or was a reference BIOS flashed onto it? I ask because, as I understand NVIDIA's base/boost clock implementation, the base clock is set by the BIOS and is not alterable by outside software.

I was hoping these cards would come in under the $250 price point; I don't really see $300 as substantially lower. If I were in the market for a card today, I'd probably settle on the 7950 over the 660 Ti, as it looks like it has room to grow with better drivers... and it seems like the 3GB might actually benefit it in the long run... or I'd just get a 670 and call it a day.

"Otherwise the memory layout is the same as the reference GTX 660 Ti with 6 chips on the front and 8 on the back." - page 5

Ok I'm confused here because a few pages back it said:

"The only difference we can find on this PCB is that instead of there being solder pads for 16 memory chips there are solder pads for 12, reflecting the fact that the GTX 660 Ti can have at most 12 memory chips attached."

I get that this is a custom PCB, so it might vary from the reference PCB, but I don't understand how it can be equipped with 14 memory chips, and if it is, is it a mix of 2Gb and 1Gb chips? Can you please explain?

Also, for people referencing the 7870 on Newegg at $250: can you please provide a link? The cheapest card I found was $279.99 AFTER a mail-in rebate. That seems to me to be sitting much closer to $300 than $250.

Overall I was surprised by the performance of this card; I figured it would be a dog in games like Metro 2033 and Crysis with that extra ROP unit / memory bus cut down.

They've tweaked GK104, resulting in a new chip that's more of a GK105... lol. But this card, the GTX 660 Ti, is shit; a highly overclocked card would beat it (maybe by 2fps more, but who cares)... and is $60 less. This card is made for gamers who want efficiency more than performance. Right now the HD 7950 is the best VFM card, followed by the GTX 670.

Can you prove anything you've said? Links to reviews showing this, please. You can overclock the 660 Ti also. I've backed my opinion all over this comment section; I'd be more than happy to look at some data if you have ANY. No 2560x1600, though; I already proved nobody uses it. No 24in monitors sold on Newegg use it. No 27s go that high either... ROFL. Please... links and data.

The 660 Ti is the best VFM card for 98% of us. We don't run at 2560x1600.

The GTX 680 is 20% higher in performance than the 660 Ti, but it comes at a lofty 67% higher price tag. The 670 is just 10% faster but still comes at a 33% price premium. I was upset after dropping 5 Benjamins on an EVGA 680 Superclocked when the 670 came out; I should have waited on THAT card and saved a hundred. Now this card is out, and two of these in SLI will slaughter my 680 by a 35% margin for just another 20% in cost (according to Guru3D's SLI tests). Just damn my timing and decisions. Methinks I'm selling the 680 at a $50 loss to get two of these for $600. Sure beats the original plan of spending $1,000 on a 680 SLI setup.
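The ratios quoted above are easy to sanity-check. A minimal sketch, using round hypothetical street prices ($300/$400/$500) and normalizing the 660 Ti's performance to 1.0:

```python
# Sanity check of the price/performance ratios quoted above.
# Prices and relative-performance figures are illustrative assumptions.
baseline_price = 300.0  # assumed GTX 660 Ti street price

cards = {
    "GTX 660 Ti": (300.0, 1.00),
    "GTX 670":    (400.0, 1.10),   # ~10% faster
    "GTX 680":    (500.0, 1.20),   # ~20% faster
}

for name, (price, perf) in cards.items():
    premium = price / baseline_price - 1         # price premium over 660 Ti
    perf_per_dollar = perf / price               # higher is better value
    print(f"{name}: +{perf - 1:.0%} perf, +{premium:.0%} price, "
          f"{perf_per_dollar * 100:.3f} perf per $100")
```

On these assumed numbers the 670's premium works out to ~33% and the 680's to ~67%, matching the comment, and the 660 Ti comes out ahead on performance per dollar.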

Another 680 at Newegg is $499 (lots of choices). Two 680s will smoke two 660s for $100 less, since you already own one ;) If you buy two of either, I hope you're going to run them at 5760x1200 (3 monitors) or 3840x1200 (two monitors). Wasted power otherwise; 1920x1200 is already 100fps+ in almost everything on a GTX 680.

But sorry about you jumping early :) That's the price any of us pay for being first on the block with the latest toys :) Also note, you can turn both 680s down and have a silent, seriously butt-kicking machine until you actually need the power. No heat or noise until you actually need it. Let's face it, two 680s is a LOT of freaking performance.

Congrats if you've got an extra $500 lying around in these times :) Might want to wait for Labor Day specials, though ;)

I'd love to see SC2 come back, particularly with the new Arcade games. Some of them can easily bring even a top card to its knees. The final battle of a Desert Strike game will crush even the best cards.

Nah, I'm sure people would whine because it's another victory for Nvidia at 1920x1200 and below (heck, I think above also). They could have benched it as before, but Ryan probably wanted to leave it out :) He might have had to make a conclusion then, even at 2560x1600... ;) http://www.anandtech.com/show/6025/radeon-hd-7970-... Note that article is from 6/22/2012, and they used it again here: http://www.anandtech.com/show/6096/evga-geforce-gt... on 7/22/2012... You see, at 1920x1200 Ultra + 4xMSAA the GTX 670 already scores 121.2 vs. 108.3 for the 7970 GE (the 7970 only gets 99, and the 7950 gets 88.2fps). So you would have the GTX 660 Ti smoking the 7970 GHz Edition. That wouldn't look too good when it's supposed to be competing against the 7850/7950... LOL. The GTX 670 even beats the 7970 GHz Edition at the 2%-market-share 2560x1600 also, so it may have looked pretty bad against the 660 there too. It would have made his 2560x1600 digs and conclusion even worse and hard to even argue. Ryan was smart here... just not quite smart enough if you look at the big picture of evidence.

Understand why they won't bench SC2 again now? Why not run the last version/patch that works fine? Did 1.51 not work either (released the 8th? a week before the review date)? Instead he keeps Warhead from 2008, with an engine from 2007 that was only used in 7 games, vs. the much more TAXING Crysis 2 (not 1) with the DX11 patch and ultra-res patch, which turns on a crapload of stuff: hardware tessellation, soft shadows with variable penumbra, improved water rendering, particle motion blur and shadowing, parallax occlusion mapping, full-resolution high-dynamic-range motion blur, and hardware-based occlusion culling. "Can it run Crysis?"... Wrong question; can it run Crysis 2! :) I still think it would be close, or a loss for NV, with the 660. It would be a close call, probably a wash... but that wouldn't help Ryan either :) Hence the 2008 game.

I'm not even going to waste my time with this BS comment. But see my response to Ryan's lame excuse over 2560x1600 for all the details you SHOULD have seen in his review (and some that he SHOULD have put IN the review). It only won ONE game at 1920x1200. In my response to Ryan, I prove you can't run at 2560x1600 and stay above 30fps.

The 285 was included because I wanted to quickly throw in a GTX 285 card where applicable, since NVIDIA is promoting the GTX 660 Ti as a GTX 200 series upgrade. Basically there was no harm in including it where we could.

As for the 480, it's equivalent to the 570 in performance (eerily so), so there's never a need to break it out separately.

And the 680 is in Bench. It didn't make much sense to include a card $200 more expensive, which would just compress the results among the $300 cards.

So you're saying the 680 is way faster than the 7970, which you included in every chart, since the 7970 won't compress those $300-card results. Thanks for admitting that the 7970 is so much slower.

I know one of the differentiating factors for the Radeon 7950 is its 3GB of RAM, but I was curious: are there any current games which will max out 2GB of RAM with high resolution, AA, etc.?

I think it's interesting how similar AMD's and Nvidia's GPUs are this generation. I believe Nvidia will be releasing the GTX 660 non-Ti based on GK106. Leaked specs seem similar to this card, but with the texture units reduced to 64; I wonder how much of a performance reduction that will account for. I think it will be hard for Nvidia to get the same performance/$ as, say, the GTX 460 / 560 Ti this generation, because GK104 has to fill in more market segments.

Also, I wasn't aware that Nvidia was still having trouble meeting demand for GK104 chips; I thought those issues were all cleared up. I think when AMD released their 7000-series chips, they should have taken advantage of being first to market and been more competitive on price to grow market share, rather than increasing margins. At that time, someone sitting on 8800 GT-era hardware would be hard-pressed to upgrade, knowing that AMD's inflated prices would come down once Nvidia brought their GPUs to market. People who hold on to their cards for a number of years are unlikely to upgrade six months later to Nvidia's product. If AMD cards had been priced lower at that time, a lot more people would have bought them, thereby beating Nvidia before they even had a card on the market. I do give some credit to AMD for preparing for this launch and adjusting prices, but in my opinion this should have been done much earlier. AMD management needs to be more aggressive and catch Nvidia off guard, rather than just reacting to whatever they do. I would "preemptively" strike at the GTX 660 non-Ti by lowering the 7850 to $199. Instead, it seems they'll follow the trend and keep it at $240-250 right up until the launch of the GTX 660, then lower it to $199.

Pixelpusher, there are no games we test that max out 2GB of VRAM out of the box. 3GB may one day prove advantageous, but right now, even at multi-monitor resolutions, 2GB is doing the job (since we're seeing these cards run out of compute/render performance before they run out of RAM).

But it's the middle of freaking August. While Tahiti was unfortunately clocked a bit lower than it probably should have been, and AMD took a bit too long to bring out the GE edition cards, Nvidia is now practically 8 months behind AMD, having only just released a $300 card. (In the 8 months since the release of the 7950, its price has dropped from $450 to $320, effectively making it a competitor to the 660 Ti. AMD is able to compete on price with a better-performing card by virtue of the fact that it simply took Nvidia too damn long to get their product to market.) By the time the bottom end appears, AMD will be ready for Canary Islands.

It's bad enough that Kepler (and Fermi, for that matter) was so late and so unavailable for several months, but it's taking forever to simply roll out the lower-tier products (and yes, I know 28nm wafers have been in short supply, but that's partially due to Nvidia's crappy Kepler yields... AMD has not had such supply problems). Can you imagine what would have happened if Nvidia had actually tried to release GK110 as a consumer card? We'd have NOTHING. Hot, unmanufacturable nothing.

Nvidia needs to get their shit together. At the rate they're going, they'll have to skip an entire generation just to get back on track. I liked the 680 because it was a good performer, but that doesn't do consumers any good when it's 4 months late to the party and almost completely unavailable. Perhaps by the end of the year 28nm will have matured enough that Nvidia can design something that yields decently while still offering the competitiveness the 680 brought us, because what I'd really like to see is both companies releasing good cards at the same time. Thanks to Fermi and Kepler, that hasn't happened for a while now. We consumers benefit from healthy competition, and Nvidia has been screwing that up for everyone. Get it together, Nvidia!

So, as any wacko fanboy does, you fault Nvidia for releasing a card later that drives the very top-tier AMD cards down from the $579+ shipped that I paid to $170 less, plus 3 free games. Yeah, buddy, it's all Nvidia's fault, and they need to get their act together; and if they do in fact get their act together, you can buy the very top AMD card for $150, because that's likely all it will be worth. Good to know it's all Nvidia's fault. AMD from $579+ shipped to $409 and 3 free games, and Nvidia sucks for not having its act together. The FDA as well as the EPA should ban the Kool-Aid you're drinking.

Thanks for the review, Ryan. Always appreciated. It's a bit annoying that AMD recently updated their drivers, but no card. I wonder how long that driver update was held back? No doubt NVidia will find similar gaming improvements in the months to come. I expect any issues that may arise due to the asymmetric bandwidth-to-GDDR5 ratio will again be quickly overcome. People should remember that the CUDA issue is limited to fp64 (double precision); single-precision is greater than previous generations.

The AMD fanboys have finally had to shut up their 8-month-long lie after the 4GB 680s, even though all the data was there from the very first 600-series Nvidia release, which still wins at the highest resolutions the reviewers test. Now the idea is down to some future fantasy, even though the cores are smoked out before they can use the RAM, as Ryan FINALLY, after 8 months, admits in a post above. Of course, an honest person like myself has said it from day one.

From what I'm seeing, it's hardly worth overclocking the 660 Ti. What you get is just a few extra frames per second, 5 FPS on average, with a quite high increase in load power and temperature. I don't believe it's really worth it; this just seems to be the case with most video cards, in my opinion. You push it to its limits, probably shortening its life, for what? The only good possible outcome is in titles where your FPS is under 30-35 and you need a bump to make the game smooth. What I prefer to that is to simply lower the resolution a bit and disable shadows and other effects you don't stare at while you're busy playing. What do you say?

A certain review site comments on the effect of maximum overclocking and its ability to provide any gameplay improvements whatsoever. Often it does not. In a few rare cases, you can actually turn up one more feature in a game because of it, or have a playable frame rate where it was unplayable. The truth is there are so many drooling idiots willing to "love" whatever their fanboyism tells them to buy that they will go to the ends of the earth to proclaim the few-percent fps advantage that is notionally possible - if they had the monster rig and CPU the reviewer uses, and the gigantic screen, and the excessive RAM, and the fastest SSD, and a supremely clean and defragged fresh install, and years of OC stability settings under the belt, which they of course do not have, likely not even a single one of the above. So they go on in rampant fps-only fashion, only at the highest peak of performance on maxed system specs they don't own, and IGNORE every other feature set of the cards. Every single other thing is GONE from the mind of the fanboy - unless of course their fanboy fave uses some obscure thing they WON'T and DON'T use, which they just put down for the last few years when their enemy was best at it - like "compute". So this is what they do. Then, after getting their junk and OC'ing and having instability issues, they slap it back to stock so their games don't crash - you know, after bragging and posting for a night or two and destroying the electric bill they so methodically whined about for so long before their fanboy card turned into a housefire. So then they're stuck with their crap that does one thing that doesn't matter and does not improve their gameplay whatsoever. Doesn't matter: the melted mind that brought them to the purchase is still swirling, ignorance is bliss, and boy are they happy they did the right thing and made the right choice.
Then they tell themselves TXAA sucks, PhysX sucks, auto-overclock sucks, target frame rate sucks, smooth gaming sucks, driver improvements back to the six-year-old series 6 suck, adaptive v-sync sucks, 3+1 surround and 3D monitor capabilities suck, cooler and quieter sucks, and they just got the bestest hardware eva', as they saved $15 or spent $30 more; either way, kneeling toward Dubai is called for. See, that's how it works. Even gaming IQ sucks, because who can see that?

I failed to mention as well: although the jaggies really suck, TXAA sucks and is unneeded and really unwanted because nVidia screwed up and did not make it super sharp, and super sharp is very, very important. Of course we also know that indeed AMD's morphological antialiasing is pretty cool and an awesome feature, and the blur is not bad at all, and having the words and letters in HUDs and panels blurred is OK, because the performance advantage is well, well worth it. Yes, that's exactly what we were told.

All of the resolutions above 1920x1200 total less than 2%. 1920x1080 and 1920x1200 are 29.x%; everything else is below this. For all your talk about running out of ROPs, bandwidth, etc., you're wasting most people's time. I'd say the benchmarks at 1920x1200 and below are the only thing that matters unless you're after a 27in+ comparison. My 24/22 setup is 1920x1200 (Dell 2407HC) and 1680x1050 (LG W2242T). Even the 27in I want is only 1920x1200 and I don't want higher (if only because I want to game at the same resolution on both and it's just all around easier, not to mention a good card can run almost ANY game at this res).

It's nice to note changes at 2560x-whatever, but who really cares? How many of you have 27in monitors? Why does anandtech tout this as anything special? You should be touting what we USE (not what you special reviewers use, who I guess all have 27in+ or multi-monitor spanned resolutions... which again is NOT a large portion of users at steampowered.com). I don't use Steam at all (it's a virus... LOL) but I do consult the stats they have (which are great).

I'd submit that the winner at 1920x1200 is what's important, as most will never go over it. Find out what you can afford at that res and prep for future monitor purchases in that regard. If you don't ever plan on a 27in, ignore 2560x+; you'd be looking at the wrong benchmarks and making an improper decision. I'd also submit that 97% (per the steampowered survey) of your users are NOT paying $301+ for their cards (I thought I paid a lot for my Radeon 5850 at $260). It's pretty much $299 and below. A good 90% on that survey have 2GB or less of memory on the card (not many with SLI/Crossfire, I guess). I'd rather see these results totally removed and more games added to the testing. Perhaps an article dedicated to large resolutions should be written, but including them and bashing cards that fall off where nobody cares anyway is kind of pointless.

It's kind of like reviewing motherboards with 2+ video slots. How many people have SLI/Crossfire in use? I have to buy a pointlessly expensive motherboard to get all the ports I need (every time I buy a board! Try to get a board with all the trimmings and a SINGLE PCIe x16 setup). I've yet to meet someone who actually OWNS two video cards and uses SLI/Crossfire. PCIe 2.1 (or 2.0) x1 is 500MB/s (not Mbit), so what would I need two x16s for unless SLI/Crossfire is going to be used? Are motherboard makers catering to 5% of the market or what? The same can be said about testing boards in SLI/Crossfire. Review sites should complain more about wasting our money on slots that are never used. It takes a top-of-the-line SSD to max out an x1 slot's bandwidth. Wireless etc. will never need more than x1 before I'm dead... LOL.
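For anyone wanting to check that 500MB/s figure, it follows from PCIe 2.0's published signaling rate and encoding overhead. A minimal sketch of the arithmetic (assuming the spec's 5 GT/s per lane and 8b/10b encoding):

```python
# PCIe 2.0 per-lane bandwidth: 5 GT/s signaling with 8b/10b encoding,
# so only 80% of the raw bit rate carries actual data.
def pcie2_bandwidth_mb_s(lanes: int) -> float:
    raw_gt_s = 5.0                 # raw transfers per second, in GT/s
    usable_gbit_s = raw_gt_s * 0.8 # 8b/10b: 10 bits on the wire per 8 data bits
    return usable_gbit_s / 8 * 1000 * lanes  # Gbit -> GB -> MB per second

print(pcie2_bandwidth_mb_s(1))   # x1 slot: 500.0 MB/s, matching the figure above
print(pcie2_bandwidth_mb_s(16))  # x16 slot: 8000.0 MB/s
```

So a ~550MB/s SATA SSD really does slightly exceed one PCIe 2.0 lane, which is the commenter's point about everything short of top-end storage fitting in x1.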

As I'm sure you're aware, our primary readership for video card articles is enthusiasts. So while we cover a broad spectrum of hardware on the whole, for video cards our testing methodologies are going to lean towards the methods best suited for the people buying the product: enthusiasts.

To that end, 2560 monitors have become quite popular with enthusiasts. In recent months this especially goes for the $400 2560x1440 "Catleap" monitor (and other monitors based on the same LG panel), marking the first time such a high resolution IPS monitor has been available below $500. 1920 remains the most important resolution for gamers, but when we're talking about $300+ cards there is a significant minority running larger monitors.

Finally, in case you missed it in the article, we did quickly discuss optimal resolutions and what we would be focusing on:

"For a $300 performance card the most important resolution is typically going to be 1920x1080/1200, however in some cases these cards should be able to cover 2560x1440/1600 at a reasonable framerate. To that end, we’ll be focusing on 1920x1200 for the bulk of our review."

And 1920x1200 is what we based our concluding recommendations on.

So you don't write for 98% of your readership? You're expecting us to believe you're hitting the 2560x1600 so hard due to a monitor I can't even buy on Newegg? Seriously? I already pointed out NO 24in monitor at Newegg (68 of them) runs above 1920x1200, and out of 52 monitors at Newegg that are 27in, only 11 run at 2560x1440, NONE at 2560x1600. Why not run your benchmarks at that res then, if it's all about the Catleap? Again, you may think your readership is "enthusiast", but I'd argue that if you're calling enthusiasts only people with a $688-$2300 monitor (see why it's not $400 below), you're not in tune with your readership or what enthusiast means. Read on, it's going to get worse, using YOUR WORDS. It's long, people, but WORTH THE READ ;)

The monitor: available at Amazon from 4 SELLERS. 3 from Korea, and one from New Zealand... ROFL. 3 are "JUST LAUNCHED" WITH NO REVIEWS. OK, one has 2 reviews in the last 12 months, but consider that it just started too. The 4th: http://lowellmac.ecrater.com/help.php No FAQ (BLANK PAGE) or ABOUT (BLANK PAGE) pages, and the store went up so quickly it looks fraudulent. Contact? A GMAIL ACCOUNT! You can't even dial this joint... er... I mean, can't dial this DUDE. :)

You couldn't come up with a better defense than a product (popular with enthusiasts? WHO?) that I can't even buy in the USA? Who the heck is buying $400 monitors from places with no phone, a BLANK about page, a BLANK FAQ page, and ZERO reviews? Really? Where did you buy yours?... ROFL. KOREA? Refunds? From the only one I can dig up above: "Refunds and returns are generally not accepted unless product differs from description", from the only page they seem to have taken the time to fill in. Seriously, Ryan? Are you freaking kidding me? I've never even heard of a YAMAKASI CATLEAP monitor... Now I know why. It doesn't exist in America... ONLY ONE REVIEW of the actual product, from Aug 2nd. Probably from the guy that started his "JUST LAUNCHED" website... LOL.

Which brings us to all the ones priced HIGHER than your $400. Amazon has an HP IPS for $688 (Newegg too; this is the cheapest ANYWHERE!), a Dell IPS for $800. Google that thing. In fact, ALL 11 of the 2560x1440 monitors at Newegg are $690-2300! So you can't get into this club for under $690. So you benchmark based on monitors you can't get (for the price you said, breaking the $400 barrier) from countries I would never give a credit card to (not without LIVING there).

Raise your hand if you're one of the 2% that first, has a 27-incher, and then, even more special, is the person Ryan wrote the review for who apparently has $688-2300, or this ONE "CATLEAP" owner. What's that, like a decimal point of your readership? Yeah... I would write for that many people too. By "significant minority" you mean .2% of the 2%, right? :) So you think most of your readership are in that .2% of the 2%? Really? Do people with $680-$2300 for monitors only have $300 for a video card to push it? Do they really dicker over $20 as you insinuate in your review conclusion? For such a popular monitor, I find it hard to believe Newegg doesn't even sell it. Don't you? Amazon doesn't sell it either (only through the marketplace, from no-namers in KOREA... not Amazon themselves).
Nope, didn't miss your statements; I've used them all over the place here in the comments section already :) You really should have just said, "I've written a biased article, I'm sorry, I retract it, now go away. Thanks..."

Get ready... I'm going to use your words and benchmarks, easy for you to follow :) Here we go:

"Coupled with the tight pricing between all of these cards, this makes it very hard to make any kind of meaningful recommendation here for potential buyers. Compared to the 7870 the GTX 660 Ti is a solid buy if you can spare the extra $20, though it’s not going to be a massive difference."

OK, so super enthusiasts care about $20 (but have $688-$2300 for a monitor, or that POPULAR $400 one available nowhere), and even if you buy this it won't make much difference, per your conclusion. "If you can spare the $20"... they can buy that $688-$2300 monitor though... LOL. You're not making sense here. Aren't these "potential buyers" buying $688-$2300 monitors? Isn't that what you said? But let's really break it down past the monitor ($688-$2300, or the nonexistent monitor garbage from companies with ZERO reviews in Korea or New Zealand):

This is YOUR data: the 7870 @2560x1600 is unplayable in Warhead (25fps, min 18.9fps). "At 38.8fps it’s playable, but it’s definitely not a great experience." You said that about 1920x1200 in this game! You weren't done there though: "So for anyone wanting to partake in this classic, an AMD card is the way to go and it doesn’t matter which; even the 7870 is marginally faster." MARGINALLY FASTER? Only over the REF card you can't get. Against TWO others in the list it LOST, by your "marginally slower". Come again? Even the REF card only lost 39.9fps (7870) to 38.8. ONE FPS! No other 660 wins or loses by more than .5fps (that's half a frame out of 40!). Have you heard of margin of error? This is the definition of it. $299 gets you ZOTAC AMP SPEEDS at Newegg. Not to mention this game is from 2008 and Crysis 2 would be a LOSS.

Metro 2033: 28fps (min will be less, unplayable). Dirt 3: 74fps (but the 660s beat it anyway). Shogun 2: again unplayable (19.1fps). Arkham City: 45fps (but beaten by 15%), and minimums are going to push unplayable [scratch that, check HardOCP below: the 7950 hits 10-15fps minimums in this game for quite a while, with a max of 64fps], so Batman is TOTALLY UNPLAYABLE at 2560x1600. Portal 2: 37.3 (again 58+ for the 660s, but 50% faster won't be noticed), so minimums here will probably hit unplayable too. Battlefield 3: 41fps (again, the 660s at 56... jeez, no difference), and if minimums drop below 30fps in a MULTIplayer game like Battlefield 3 it's UNPLAYABLE. Elder Scrolls is a wash at 76fps. Civ 5 can run at 68.8 but is again beaten by the 660s at 70+fps.

So people will be running MAYBE half of these games at your enthusiast res. But it's a $688-$2300 monitor (unless you're crazy and buy from Korea) that they will turn their graphics down for so they can run on their shiny new monitor... Yep, that's enthusiast mentality alright: I'm rich and on the bleeding edge, but I'll turn down my graphics and run OUTSIDE the native res of my expensive toy. I think most buy $600 video cards and $300 monitors, not the other way around.
Plenty of 27in 1920x1080 monitors for $250-600 (the other 41 on Newegg; NONE at 1920x1200). So this card really is NOT for enthusiasts, thereby making it a race at 1920x1200 against the 7870. Well, really 1920x1080, since NO 27in on newegg.com is 1920x1200, and besides, you just said RIGHT ABOVE THIS MSG: "And 1920x1200 is what we based our concluding recommendations on." We're about to test the truth of that statement, Ryan :)

660 Ti vs. 7870 @1920x1200:
Civ5 >3% faster
Skyrim >6% faster
Battlefield 3 >37% faster (around 50% or more in FXAA High!!)
Portal 2 >62% faster (same at 2560x... even though it's useless IMHO)
Batman Arkham >22% faster
Shogun 2 >31% faster
Dirt3 >11% faster
Metro 2033 >15% faster
Warhead ~wash (all 660s at 38.8-40.4fps & the 7870 at 39.9) WASH
StarCraft 2: 7970GHz (108fps) vs. GTX 670 (121fps) @1920x1200
http://www.anandtech.com/show/6096/evga-geforce-gt...

So even though you left it out (lame excuse; use the same one you did here last month!), we can guesstimate the 660 Ti would be just behind or FASTER than the GTX 670, as the GTX 670 is beaten a few times by the OC'd 660s in this review (i.e. Shogun 2, 3fps faster than the GTX 670). Which, BTW, as I've just shown, SLAUGHTERED the 7970 GHz edition! No wonder you left it out! The 7950 scored 88.2 in that test, so call it 40-50% faster in StarCraft 2 vs. a card that isn't even the 7870 Ryan thinks you should save $20 over! So what, 55% faster than the 7870 in StarCraft 2? Nah, never would notice that, would they? Save your $20; none of these speed increases (some above 50%) matter at all... LOL.

So for $20 you won't notice the difference (at the res this card would actually be bought for) with these phenomenal results vs. the 7870. Faster in everything: 31%, 37%, 62%, 15%, 11%, 22% & 50%+ (StarCraft 2). People won't notice a card running this much faster in most of the games you tested? You really want to stand by that statement? Only a fool wouldn't buy this performance gain almost across the board.
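For anyone following the "X% faster" claims traded back and forth in this thread, the arithmetic is just relative fps. A minimal sketch, using two of the StarCraft 2 numbers quoted above (the extrapolation from GTX 670 to 660 Ti is the commenter's assumption, not measured data):

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, as a percentage of B."""
    return (fps_a - fps_b) / fps_b * 100

# StarCraft 2 @1920x1200 figures quoted in this thread:
# GTX 670 at 121.2 fps, HD 7950 at 88.2 fps.
print(round(percent_faster(121.2, 88.2), 1))  # -> 37.4
```

Note the denominator matters: 121.2 vs. 88.2 is "37% faster" one way but only "27% slower" the other, which is one reason the percentages thrown around by both sides here don't always line up.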
It would be killed in Crysis 2 at this res too, since the 7870 can't do what the 7970 series can in Warhead already, and Crysis 2 doesn't work in AMD's favor the way Warhead does, as I've shown in other comments here. I'm sure it will do well in Borderlands 2, which comes free with it, as it's a TWIMTBP game (another 50% game?). MASSIVE DIFFERENCE HERE, PAL. "If you can spare the $20," though you've just bought your shiny $688-$2300 monitor somehow... ROFL. OK. Still "hard to make any kind of meaningful recommendation"??? C'mon... Seriously? You made one that makes NO SENSE.

"What’s different about this launch compared to the launches before it is that AMD was finally prepared; this isn’t going to be another NVIDIA blow-out." REALLY? Does anyone believe this statement after following my points here?

"As it stands, AMD’s position correctly reflects their performance; the GTX 660 Ti is a solid and relatively consistent 10-15% faster than the 7870." Didn't I just prove that BS? 31%, 37%, 62%, 15%, 11%, 22% & 50%+. Well, you got two right out of the bunch, but otherwise it's 22%+ for EVERYTHING you tested. Oh, and note that while writing this I see AMD just hacked all their prices, because, well, they KNOW THE TRUTH. They did the math like I did and readjusted IMMEDIATELY. 10 hours ago... LOL.

"AMD has already bracketed the GTX 660 Ti by positioning the 7870 below it and the 7950 above it, putting them in a good position to fend off NVIDIA." REALLY? Then why did they just drop prices (10 hours ago... ROFL) across the board?

"While the 7950 is anywhere between a bit faster to a bit slower depending on what benchmarks you favor." HERE WE GO AGAIN, PEOPLE. Follow along:
Civ5 <5% slower
Skyrim >7% faster
Battlefield 3 >25% faster (around 40% or more in FXAA High)
Portal 2 >54% faster (same at 2560x... even though it's useless IMHO)
Batman Arkham >6% faster
Shogun 2 >25% faster
Dirt3 >6% faster
Metro 2033 = WASH (Zotac 51.5 vs. 7950 51... margin of error... LOL)
Crysis Warhead = WASH (ref 7950 (66.9) lost to ref 660 (67.1), and 7950B 73.1 vs. 72.5/70.9/70.2fps for the other three 660s); a WASH either way.
StarCraft 2: 7950 (88.2fps) vs. GTX 670 (121.2fps) @1920x1200

So another roughly 37% victory for the 660 Ti, extrapolated? I'll give you 5 frames for the Boost version, which still makes it ~30% faster in StarCraft 2. So vs. the 7950B, which you MADE UP YOUR MIND ON, here's your quote: "If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future"

We have victories of 25% (BF3), 54% (Portal 2), 7% (Skyrim), 25% (Shogun 2), 30%+ (SC2), 6% (Dirt3), one loss at <5% in Civ5, and the rest washes (less than 3%)... but YOU think people should buy the card that gets its butt kicked or is a straight-up wash. You said your recommendations were based on 1920x1200, NOT 2560x1600. Well, suck it up; this is the truth in your OWN benchmarks, yet you've ignored them and LIED. Let me quote you from ABOVE again, lest you MISSED your own words: "And 1920x1200 is what we based our concluding recommendations on."

Then explain to me how you can have a card that wins ONE game at <5% and gets BEATEN in 6 games by OVER 6%, with 4 of those 6 games BEATEN by >25%, and yet still come up with this ridiculous statement (again, your conclusion): "On the other hand due to the constant flip-flopping of the GTX 660 Ti and 7950 on our benchmarks there is no sure-fire recommendation to hand down there. If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future." Are you smoking crack or just BIASED? Being paid by AMD? Well? The 7950 is NOT cheaper than the 660 Ti. Can you explain your math, sir? I'm confused. What "flip-flopping of the GTX 660 Ti and 7950"?? You're making these statements at 1920x1200, right? Or should I quote you again?... LOL.

More bias just keeps going in that article too (if anyone still has questions): "But the moment efficiency and power consumption start being important the GTX 660 Ti is unrivaled, and this is a position that is only going to improve in the future when 7950B cards start replacing 7950 cards." So wait a minute: it's hot, it sucks juice, it only wins ONE game by <5%, loses 4 games by >25% and another 2 by >6%, and you still recommended the 7950: "If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future." WHOA... Why "in the future," with all these losses?

Maybe it's this next statement? "In particular we suspect it’s going to weather newer games better than the GTX 660 Ti and its relatively narrow memory bus." Are we back to concluding based on 2560x1600 again? Which, again, no monitor UNDER 30 inches uses; even the 27in models (11 of 52) only run 2560x1440. And with all the 7950's losses at 1920x1200, how can you conclude anything but that 1920x1200 and below are NEVER memory constrained on the 660 Ti, or it would be LOSING? You have to push the card above 27in monitor resolutions to get this difference to EVER show up, and even then it's a VERY questionable argument. Or should I run through those scores for you too? Back to your BS: "As we mentioned in our discussion on pricing, performance cards are where we see the market shift from RICH ENTHUSIASTS who buy cards virtually every generation to more practical buyers who only buy every couple of generations."

OK, so you benchmarked at 2560 and beat it like a dead horse because of "enthusiasts" who, by your own words (in the freaking conclusion again), are "RICH ENTHUSIASTS" who would buy a $300 card? I make ~$50K/yr and paid $260 for my Radeon 5850, and have a 24in and a 22in in use (the 24in can be had for $170, the 22in for ~$120). I'm not in the top 5% of wealthy; heck, I don't even crack the top 25%... LOL. I'm about to buy a $300 660 Ti, though. You won't see me buy a $700-$2300 monitor ANY time soon.

But YOUR "RICH ENTHUSIASTS" buy these and then spend $300 on a card? No, that guy just bought three 24s and a GTX 690 (or two), because the rich are running 5760x1200 with these (even on a single card). Or that RICH guy bought a GTX 690 and one of your $700-2300 IPS 27in monitors. The rich, by the very definition, don't buy the 4th rung in video cards. We have a 660/670/680 & 690, but you suggest the "RICH ENTHUSIASTS" would buy 4th place? Put the crack pipe down; they aren't buying this card to run 2560x1600. They already own one or two GTX 680/690s. I wouldn't even call you RICH if you had anything less than a 690. You say they buy every year, and I'm sure the RICH make far more than $100K, so they probably have $1000 lying around for a GTX 690. I mean, we're talking a 2% market share above 1920x1200 (check steampowered.com); by that % we're talking millionaires here. As of 2011 there were 11 million millionaires in the USA, a LOT higher than 2%, and I think they all have >$300 for a yearly card. They buy two of whatever, or the #1 single card. :) Buying a $300 card, to a millionaire "ENTHUSIAST," would be EMBARRASSING.

WORSE: you have people thinking they can still OC the crap out of these: http://www.anandtech.com/show/6152/amd-announces-n... "The 7950 on the other hand is largely composed of salvaged GPUs that failed to meet 7970 specifications. GPUs that failed due to damaged units aren’t such a big problem here, but GPUs that failed to meet clockspeed targets are another matter.
As a result of the fact that AMD is working with salvaged GPUs, AMD has to apply a lot more voltage to a 7950 to guarantee that those poorly clocking GPUs will correctly hit the 925MHz boost clock."

Let's look outside anandtech to further prove the 1920x1200 point, or even make the whole BANDWIDTH POINT MOOT :) Can you OC a 3GB 660 Ti's memory? http://hardocp.com/article/2012/08/21/galaxy_gefor... WOW, 7.71GHz MEMORY vs. 6GHz (6008) for the normal REF 660 Ti. 185GB/sec! This card is also $339 and has 3GB... LOL. OMG... What just happened? :) Raising JUST the memory got them 10.4% in Battlefield 3 @2560x1600! LOL. 1920x1080 got 12% JUST FROM THE MEMORY. What bottleneck? So you can already go another 1100MHz (even with an extra GB of memory) beyond the ZOTAC AMP in this review at anandtech, and even though Ryan hasn't shown why I would need it, it's nice to know it's all there just in case :) AMP = 6.6GHz; this is 7.71GHz!

SKYRIM PLAYERS TAKE NOTE: "In Skyrim at 8X MSAA plus FXAA, which is a very high setting for this card, at stock frequencies we got 44.2 FPS. Overclocking the memory boosted performance by 11%." NOTE THE MINIMUM OF 35FPS, AVG 49, MAX 65, at 2560x.
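That 185GB/sec figure follows directly from the effective memory transfer rate and the bus width. A sketch of the arithmetic, using the numbers quoted above (6008 MT/s reference vs. the 7.71GHz overclock, both on the 660 Ti's 192-bit bus):

```python
def memory_bandwidth_gb_s(effective_mt_s: float, bus_bits: int) -> float:
    """GDDR5 bandwidth: effective transfer rate (MT/s) times bus width in bytes."""
    return effective_mt_s * (bus_bits / 8) / 1000  # MB/s -> GB/s

# Reference GTX 660 Ti: 6008 MT/s effective on a 192-bit bus
print(round(memory_bandwidth_gb_s(6008, 192), 1))  # -> 144.2 GB/s
# The HardOCP overclock discussed above: 7710 MT/s on the same 192-bit bus
print(round(memory_bandwidth_gb_s(7710, 192), 1))  # -> 185.0 GB/s
```

The same formula shows why the 192-bit vs. 256-bit bus argument matters at all: at identical memory clocks, a 256-bit card moves a third more data per second.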

"We ended up with a GPU offset of +110, which put us at a baseclock of 1116MHz or GPU Boost of 1195MHz. The actual frequency though while gaming was running at 1298MHz, which we feel safe calling 1.3GHz. This was about 10MHz slower than it was when overclocking the GPU alone." CHECK THAT OUT: 1298MHz, THE CARD OVERCLOCKING ITS OVERCLOCK BEHIND YOUR BACK BY 100MHZ MORE! What you buy out of the box is GUARANTEED; your card may do far more all by itself, no need to do anything, unlike the 7950s.

Check it out, people, with 8xMSAA/16xAF, the HIGHEST in-game settings possible @2560: the Zotac at anandtech with 4xMSAA = 75.9; HardOCP with 8xMSAA = 67! Min fps = 37! So you can double your MSAA and STILL not run into a memory bandwidth issue, despite Ryan harping, harping, harping... Did I mention he beat it like a dead horse? They couldn't tell it from the GTX 670!

Witcher 2 @2560 is faster on the GTX 660, but unplayable on both at mins of 18/16... ROFL, proving further that these cards are NOT for 2560x1600. Batman Arkham hit a min of 10fps at 2560x1600 on the 7950. Look how much time it spends below 25fps in the graph; it's almost half the time between 10fps and 25fps. OUCH. NOT BUILT FOR 2560x1600, and I'm not sure the 7970 is either at these fps... LOL.

Are you getting the point, RYAN? You won't be running many things at 2560x1600 with fps in the 10 range. How many more take a 54fps hit like Batman on the Radeon 7950? This whole bandwidth thing is a NON-issue because these cards just are NOT built for this res. Like I said, people usually run hi-res with TWO cards and ABOVE 2560x1600. You can verify that at steampowered.com; 2560x1600 is a DECIMAL point, because most opt for two cheaper cards and run a LOT higher. But just for giggles: "In Batman we experienced large improvements in performance with the overclock on the GALAXY GTX 660 Ti. We are performing 22% faster than the GTX 670 in this game with the overclocked GTX 660 Ti, and 55% faster than the Radeon HD 7950." It never dropped below 28fps. Bandwidth issue? @2560x1600 the 7950 is dropping to 10fps while the 660 Ti is up 22% on the GTX 670 ref, and well... 10fps... LOL. Say what? 55% faster @2560x1600? I thought this card had a memory bandwidth problem? Only if you say so, Ryan... ROFLMAO.

The defense is quite simple: you're extremely biased towards NVIDIA, and you're going around picking and choosing comparisons that support the way you think it's meant to be played. Mind you, I'm running a GTX 580 personally -- with a 30" 2560x1600 LCD no less -- but that's beside the point.

Your post is laughable because it really boils down to this: you're not as informed as you like to think you are. There are a variety of Korean 27" LCDs selling for ~$400 or less that use the same LG IPS panel. Go search any hardware enthusiast forum (you've linked and mentioned several just in this single post, never mind the 20 or 30 others you've made on this article shouting down everyone that disagrees with you) and I can guarantee you'll find posts about Catleap, Yamakasi, Auria, and several other brand names. Microcenter (a US company that enthusiasts should be more than familiar with) also carries one of these LCDs for $400: http://www.microcenter.com/single_product_results....

So basically, your main premise that no one uses such LCDs is at best the perspective of someone with serious blinders. But you don't stop there. You go on to complain how no one sells 2560x1600 LCDs these days. Apparently, you weren't around for the transition from 16:10 to 16:9 and somehow think everything has to be tested with 16:9 now? That would make you one of the even less informed people that thinks 16:9 LCDs are somehow preferable to 16:10 I suppose. Given Ryan has a 30" LCD, why should he test it at less desirable resolutions?

But even then, you still have to keep going. Ryan specifically comments throughout the text on 1920 performance and rarely mentions 2560, but you appear to get stuck on the presence of 2560 graphs and seem to think that just because they're there, he's writing for 2% of Steam's readership -- which is already a biased and useless number as Steam hardware surveys are always out of date and frequently not resubmitted by people that have already seen the survey 100 times.

Short summary: you have ranted for over 15000 words in the comments of this article, all with an extremely heavy NVIDIA bias, and yet you have the gall to accuse someone else of bias. You've shown nothing in your above commentary other than the fact that positioning of the various GPUs right now can vary greatly depending on the clock speeds of the GPUs. (Hint: a heavily overclocked GTX 660 Ti 3GB card -- which totally eliminates the asymmetrical aspect of the other 660 Ti cards -- beating a stock HD 7950 doesn't mean much, considering the pricing is actually slightly in favor of the 7950.)

What it really comes down to is this: buy the card that will best run the games that you tend to play at the settings and resolutions you play at. Period. If you have a 1920x1200 or 1080p display, just about any current $300+ GPU will handle that resolution with maximum detail settings in most games. If you have a better 2560x1600/1440 display, you'll want to check performance a bit more and make sure you get the right card for the job -- I'd suggest looking at 4GB GPUs if you want something that will last a while, or just plan on upgrading again to the HD 8790/GTX 780 next year (and the HD 9790/GTX 880 after that, and....)

It basically comes down to opinions and a-holes; everyone has one. You think the 2560x1600 crowd apparently doesn't matter, going so far as to say " I think most buy $600 video cards and $300 monitors, not the other way around." I would say anyone buying $600 in video cards to run a $300 monitor has their priorities severely screwed up.

I bought an $1100 30" LCD five years back and I'm still using it. During that time I have upgraded my system, CPU, GPU, etc. numerous times, but I still use the same LCD. Buying a high quality LCD is one of the best hardware investments you can make, and you're an idiot to think otherwise. That you can now buy 27" QHD displays for $300-$400 will only serve to increase the number of users who own and use such displays. The only reason not to have a larger display is if you simply don't have the space for it.

I know three people that have between them purchased five of these Korean LCDs, and they're all quite happy with the results. You might recognize the names: Brian Klug, Chris Heinonen, and Ian Cutress. The only issue I have with the cheap QHD panels is that most of them don't have DisplayPort, and that will become increasingly important going forward. So spend a bit more to find one with DP on it. But to dismiss 2560 displays just because you're too cheap to buy one is extremely biased towards your world-view, just like the rest of your posts have been.

NVIDIA makes some very good GPUs, but so does AMD; which GPU is better/best really comes down to what you plan on doing with it. CUDA people need NVIDIA, obviously, but someone doing BTC hashing wouldn't be caught dead trying to do it on NVIDIA hardware. It's about using the right tool for the job, not about shouting the loudest every time someone offers a differing opinion.

What a joke of a response that was. He proved the reviewer flat out lied, repeatedly and without a doubt, with a huge lying bias for AMD. That you CAN'T address, and didn't. #2: the Catleap is around, so show us the Catleapers with a 660 Ti or 7870... I've been to the forums and thejian is correct; they nearly all have 2x 680s or 2x 7970s, or both. Overclock forum thread. #3: this is a gamers' review, not bitcoin, and no CUDA either; your ending paragraph is goofy as a response as well.

I like and enjoy anand, but it would be a lot more enjoyable if people told the truth, especially in their own reviews. When the obvious bias exists, it should then at least be admitted to. It's pretty tough getting through the massively biased wording in general as well, and I don't care to go pointing it out again and again. If you can't notice it, something is wrong with you, too.

You didn't prove thejian incorrect, not even close, but he certainly proved your fellow worker incorrect. When I complained about the 1920x1200 testing when it should be 1920x1080 just several card reviews ago, showing how nVidia won by a much larger amount at that much more common resolution, a load of the enthusiasts here claimed they own 1920x1200; only one claimed a 2560x. LOL

What is obvious is most of your readers and commenters don't even have a 1920x1200, and yes they whine about $5...

So, nice try; it didn't work. And if the amd fans didn't keep lying and responding, thejian wouldn't have to either... However, overall it's great that people DO what Russian and thejian and others do with long comments; it's way better than smart snarking and ignorant one-liners and pure stupidity and grammar complaints to the reviewer.

If people whine so much about warranty, and they do, thejian has a very good point on the monitors as well. Also, they are 2560x1440, so this review doesn't address them, because they are too expensive for ANANDTECH! In fact, we've seen how a $20 Kill A Watt is too expensive for the anandtech site (above reviewer).

Okay? I'm not agreeing with you, because the facts don't fit, and the point on the cards pushing the pixels is ALSO correct in thejian's FAVOR, another portion you completely sidestepped.

Anyway, I know it's hard, and anand is a great site and its reviewers, I'm sure, do their best and do a lot of good work, but facts are facts and fans are fans, and fanboys should still use the facts to be a fan when making a recommendation.

You're defending buying no-name monitors from sellers that, from what I found first looking at Amazon/Newegg (zero at Newegg), had "just launched" their sites. 3 from Korea, one from New Zealand. You did read the post correctly? I don't give my credit card to foreign countries, or buy no-name monitors (even locally) with no company websites for the resellers, a Gmail account for help, no phone to call, and blank FAQ & help pages. Are you serious? If Newegg and Amazon don't carry it, I'd say it's not too popular, no matter what it is in electronics/PC gear.

YOU do not run your games at that res and hit 30fps very often. There are a LOT of games that will be well below (the witcher, Batman, just to name a few popular ones).

You're still defending a position that is used by 2% of the population. That's laughable. Nice fanboy comment... the best you got? Tech Report did an article on one of these monitors he eBay'd (jeez); I read it, and he almost had a heart attack waiting for it... Then no OSD at all... LOL. No monitor adjustments other than brightness/contrast and sound up/down. These are not the quality of a Dell :) I'm sure you can find the article at techreport.com; he had multiple scares ;) Roll the dice on a non-brand if you'd like. I don't see a cheap anything that isn't Korean. HP starts at $688, Dell at $800.

LOL, too cheap to buy one? You're wasting my time attacking me & not attacking my data at all. Keep attacking me; it only looks worse. You already stated my point in another comment. These cards are used at 1920x1200 or less, and his review beats bandwidth like a dead horse. These are gaming cards. BTC hashing?... I digress... another decimal point of the population.

You could claim bias all day; it won't help the numbers I pointed out, nor your case. You've wasted a ton of words yourself trying to convince me a few reviewers (related to this site) are in the 98%. You're in the 2%, or the steampowered surveys are just lying, eh?

Never said AMD was crap. Stating the facts, and them not going the way you want, doesn't make me a fanboy... LOL. Nice try though. His own conclusions make no sense, as pointed out. I see nothing above but opinion, where I gave a book of data :) You really want to argue about their financials... No bias there either... Just stating financial facts.

Read it again: it beat the Boost. Boost craps out at far less than 1200MHz. Read your own AT article. That post is half his own words from two articles, which make his conclusions incorrect. I can't believe you wasted all that air trying to convince me 2560x1600 and these monitors are the norm. I would expect reviewers to have them, but not to think we all do.

Many of my posts said good things about AMD... I even said Intel owed them $20bil, not $1.5bil, etc. I even mentioned why they're suffering now (Intel stealing sales, stopping people from buying AMD years ago when I owned a PC business; the courts took far too long to decide their case). You really didn't bother to read, but rather went on a monitor rant riddled with personal attacks. Nice. BTC hashing... LOL. Nuff said. I discussed the games. You discuss Bitcoin hashing, and defend resolutions YOU already told the other poster aren't used by these... LOL. My world view is that 98% of us shouldn't be misled by a 2% opinion. But you just keep thinking that way. ;) Reply

"For every Portal 2 you have a Skyrim, it seems. At 1920 the GTX 660 Ti actually does well for itself here, besting the 7900 series, but we know from experience that this is a CPU limited resolution. Cranking things up to 2560 and Ultra quality sends the GTX 660 Ti through the floor in a very bad way, pushing it well below even the 7870, never mind the 7900 series."&"68fps is more than playable, but hardcore Skyrim players are going to want to stick to cards with more memory bandwidth."

Based on my previous post regarding the hardware survey at steampowered, hardcore Skyrim players don't exist, I guess, Ryan? Since nobody uses this res (uh oh, the 3 players using this res are about to complain... LOL), why act like it's important? Making the statement that hardcore Skyrim players (in your opinion, people with 2560x+ I guess?) should avoid this 660 Ti is at best bad journalism. At worst, I'm not sure; an AMD ad? Also, it's 75fps, since the Zotac is FAR more accurate compared to what you BUY for $299/309 at newegg. For both prices you can get a card that is 100MHz faster than the one in REF GREEN in your graphs (your ref version). I'd argue 75fps (even 68.5) at a res nobody plays at is really good. Since when is 75fps unplayable? Never mind the fact that I think this res is useless; you should be realizing most (e.g. most hardcore users) are using 1920x1200 or below, and you're actually better off with Nvidia in this case... ROFL. For the res I think this card is designed for, it's the best out there, and your review should clearly reflect that. The 7950 BOOST edition can't be had for less than $350 and can barely be had at all. Never mind the watts/heat issues.

It's arguable that "hardcore players" could get away with anything in the list but the 560 Ti, as they all hit over 83fps in what you've already stated is a CPU-bound res. What evidence do you have that shows more than 2% of users in the world use a res over 1920x1200? I'd say the steampowered stats are a pretty good representation of what gamers are using, and 2560x+ is NOT what they're using, unless you have evidence to prove otherwise? Use more games in your tests to show variations, rather than resolutions none (meaning 98%, apparently) are using. Again, I'd say a separate article should be written for the highest resolutions and multi-monitor gaming, but using it as a basis for recommendations in standard consumer cards is ridiculous. I'd rather see 15 games tested (easier to make sure you're avoiding the ones everyone optimizes for and that are benchmarked everywhere), for a better look at overall play across what 98% of us are using.

This brings your whole conclusion into question, which it seems is totally based on running at 2560x+. Raise your hand if you run at this res or above? I see the same 3 people...LOL.

"Coupled with the tight pricing between all of these cards, this makes it very hard to make any kind of meaningful recommendation here for potential buyers."http://www.newegg.com/Product/Product.aspx?Item=N8...Core Clock: 1019MHz Boost Clock: 1097MHz vs. your ref at 915/980. They are selling a CORE BASE that's above your REF BOOST...for $299. What's a virtual launch when 12 cards are available at newegg.com, and only 1 7950 Boost at $350?Borderlands 2 free with it also, and another at $309 with basically same specs:http://www.newegg.com/Product/Product.aspx?Item=N8...1006base/1084 boost. Again real close to your Zotac Amp for $309. So looking at the AMP basically as a BASE at $299/309 (it's only 14mhz faster in both base/boost clocks, which is nothing - not sure why they even sell it) let's fix your conclusion based on 98% of users res:

Zotac AMP (only 14MHz faster base/boost than the $299/309 cards linked above) vs. 7950 (again more expensive by $20) @ 1920x1200:
Civ5: <5% slower
Skyrim: >7% faster
Battlefield 3: >25% faster (above 40% or so in FXAA High)
Portal 2: >54% faster (same at 2560x... even though it's useless IMHO)
Batman Arkham: >6% faster
Shogun 2: >25% faster
Dirt3: >6% faster
Metro 2033: = WASH (Zotac 51.5 vs. 7950 51... margin of error... LOL)
Crysis Warhead: >19% loss
Power @ load: 315W Zotac AMP vs. 353W 7950 (vs. 373W for the 7950B)!
Not only is the 660 Ti usually faster by a whopping amount, it's also going to cost you less at the register, and far less on the electric bill (all year, for the 2-4 years you'll probably have it, assuming you spend $300-350 on a gaming card to GAME on it). Note the AMP is about as bad as 660 Ti watts/heat/noise can get.
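For anyone who wants to check deltas like these themselves, here's a minimal sketch of the arithmetic. The 51.5 vs. 51 Metro numbers come from the list above; the 77/50 pair is purely made up to illustrate a ~54% blowout:

```python
def pct_delta(a_fps, b_fps):
    """Percent difference of card A's frame rate relative to card B's."""
    return (a_fps - b_fps) / b_fps * 100

# Metro 2033 from the list above: Zotac AMP 51.5 fps vs. reference 7950 51 fps
metro = pct_delta(51.5, 51.0)    # ~1%, i.e. within margin of error: a wash

# A Portal 2-style blowout, with hypothetical fps values:
blowout = pct_delta(77.0, 50.0)  # 54%
```

Note the delta is relative to the second card, so "54% faster" and "35% slower" describe the same gap seen from opposite sides.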

For $299 or $309 I'll RUN home with the 660 Ti over the 7950 @ $319. In the games where it loses, you won't notice the difference at those frame rates. At today's BOOST prices ($350) there really isn't a comparison to be made. I believe it will be a while before the 7950B is $320, let alone the $299 of the 660 Ti. This card DOMINATES at 1920x1200 & below, which according to the steampowered hardware survey is the resolution 98% of us use. So 98% of you have a no-brainer recommendation for this card... There... FIXED.

I own a Radeon 5850 (and waited 7 months for it like the other 470 amazon buyers on back order, as they tried to get us to drop our orders by making a new XFX model#)... My bad, in my other post I put 8850... ROFL... You can google the amazon complainers if you wish, or if you doubt that I own one... :) Just did; google this and you'll land in the complaints: "jian amazon backorder 5850" (without the quotes). The top of the listed links will get you to the backorder complaints for the card... LOL. I got a card eventually, so don't go giving me that AMD hate crap. Just the facts :) But you can guess what I'll buy this Black Friday :) Because the 660 Ti is awesome, just like my 5850 for $260 was. Unless you're planning on running above 1920x1200 any time soon, you're crazy buying anything but the 660 Ti in the $300 price range (including the 670+; save your money). Heat, noise, watts... NO BRAINER. 660 Ti, even at Zotac AMP heat/noise/watts. Perf at the resolutions 98% of us use... NO BRAINER. IF you're dickering over $20 (as Ryan is in his recommendation of them all being close together), then you don't have the cash for 3 monitors and triple-wide gaming either. IF you DO have 3 monitors (likely a quad core also), surely you can afford TWO of these and rock the house no matter what you play. Again, though, that's a 2% user base I'm talking about here. You should rewrite your conclusion, Ryan. It's baseless currently. Running your LCD at a res that's not native? Really? I'm kind of offended, Ryan; I think you just called me NOT a hardcore gamer... LOL. :)

One more note: multi-monitor resolutions at steampowered @ 2560x1600 and below are less than 7% (add up the list!). So again, Ryan, you're not making sense. Most people running this res and above have more than one monitor and probably have more than one card to do it. Note it's a 2% user group any way you cut it, and even less when you consider these mostly have more than one monitor and card. I doubt the people you wrote the conclusion for (it seems) are worried about $20. Reply

Although 192 bits is obviously a step down from 256, most games won't be overly impacted even on PCIe 2 setups. For those that are, if you go to a PCIe 3 setup the 192-bit limitation largely disappears; basically PCIe 3 bandwidth is twice as fast as PCIe 2. So for example, if you have a PCIe 3 capable 1155 motherboard and pull an i7-2600 and replace it with an i7-3770 (similar CPU performance), the bandwidth effectively doubles and would be equivalent to 384 at PCIe 2. Obviously that would be a fairly pointless upgrade in terms of CPU performance, but Intel's 'sly' control is paying off; you have to upgrade your CPU to benefit your GPU. An i7-2600 or similar is still a much sought after CPU, so they are readily salable, making the 'upgrade' reasonably inexpensive. However the LGA1155s are very limited boards, and adding a second card would drop you from x16 to x8 rates for both cards, albeit at PCIe 3 performance. So if bandwidth is likely to be a problem for any game now or in the future and you're on an LGA1155, just get a bigger card (670/680) rather than going SLI. Adding a second card on a PCIe 2 rig could be a really stupid move. Reply

"the bandwidth effectively doubles and would be equivalent to 384 at PCIE2"

The speed of the PCIe bus is in no way connected to the width (or overall speed) of the local memory bus on a video card. The local memory bus is an entirely different and otherwise isolated subsystem.

While PCIe 3.0 may improve a video card's performance - though for single card scenarios we have not found any games that meaningfully benefit from it - any improvement would be due to improved transfer performance between the video card and CPU. If rendering performance was being constrained by memory bandwidth, then it would continue to be so.Reply
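To put rough numbers on why the two buses are unrelated, here's a back-of-the-envelope sketch. The 192-bit / 6008 MT/s figures are the GTX 660 Ti's public specs, and the per-lane rates are the PCIe 2.0/3.0 per-direction figures; neither comes from this thread:

```python
def vram_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Local VRAM bandwidth: bus width in bytes times effective transfer rate."""
    return (bus_width_bits / 8) * effective_mt_s / 1000

# GTX 660 Ti: 192-bit memory bus at 6008 MT/s effective GDDR5
vram = vram_bandwidth_gb_s(192, 6008)   # ~144.2 GB/s

# PCIe x16 link, per direction:
pcie2_x16 = 16 * 0.5     # PCIe 2.0: 500 MB/s per lane -> 8 GB/s
pcie3_x16 = 16 * 0.985   # PCIe 3.0: ~985 MB/s per lane -> ~15.8 GB/s

# Even doubled, the PCIe link is an order of magnitude below local VRAM
# bandwidth, so a faster slot cannot stand in for memory-bus width.
```

In other words, upgrading from PCIe 2 to PCIe 3 changes the 8 vs. ~15.8 GB/s numbers, not the 144 GB/s one.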

Just realized this uses CryEngine 2.0 from 2008. So the only real loser for Nvidia here is from 2008. What happens when you run today's CryEngine 3.0 from Crysis 2? As in, a game released March 2011 with DirectX 11, which has even had the HIGH RES patch released, adding: DirectX 11 hardware tessellation, and an Ultra Upgrade with soft shadows with variable penumbra, improved water rendering, particle motion blur and shadowing, Parallax Occlusion Mapping, full-resolution High Dynamic Range motion blur, & hardware-based occlusion culling.

"The test run apply is stringent, harsh and really only suited for high-end DX 11 class graphics cards of 2011 and 2012. "from Guru3d Radeon 7950 BOOST article vs a ref clocked 660TI (far slower than AMP):http://www.guru3d.com/article/radeon-hd-7950-with-...Umm...even at 2560x1600 the 7950 wins by scoring 35fps vs REF CLOCKED 660 TI 34fps.Meaning the 660 TI's in this review would CRUSH it. Update your game instead of calling crysis a thorn in NVidia's side and showing a 20% loss for warhead from 2008. You're question should have been "Can it run Crysis 2 DX11 with updated High res patch with all the goodies turned on?". Answer...Yes, as fast as a 7950 BOOST at 2560x1600 and faster below this res (albeit by the same 1fps margin...LOL...Crysis 2 is a WASH for 7950Boost vs. 660TI REF, but a LOSER for the two cards I linked to before at newegg for $50 less than 7950BOOST). Also note it clearly beats $319 7950 regular at any res already in crysis2. As I read further at guru3d, it's clear you need more games. Lost planet2 at 2560x1600 7950boost loses by 20% (40 to 48 vs. ref clocked TI, not vs AMP or two cards I linked). Aliens vs. Predator shows 7950boost beating 660TI by more than 20% (dx11), again showing why more games should be tested, mind you this is at 1920x1200!...LOL. I'm not afraid to show where NV 660 gets capped...ROFL. 54fps 7950vs 40fps 660TI. No need to show 2560 settings if you pick better dx11 group of games (is anyone playing warhead 4yrs later with crysis 2/hi res out?). It goes on and on, even at tomshardware. You need more games. Note CRYSIS 3 will be using the CryEngine 3 with basically all the goodies released in the Crysis 2 update/hires download patches. So highly relevant.

Worse, as I kept looking at both, and at the specs on 7950s at newegg, you can get a 7950 that seems to put AMD's own new Boost speeds to shame at 900MHz:
http://www.newegg.com/Product/Product.aspx?Item=N8...
XFX with a 900 CORE for $329 (rebated), and another for the same after rebate also:
http://www.newegg.com/Product/Product.aspx?Item=N8...
Perhaps you should test what is sold and easily had rather than AMD's version? Though I'm not sure they boost any higher than normal in either case; only the NV cards showed the core & boost speeds. Obviously power draw and heat would be worse than in your review. I'd still rather see these benchmarked than AMD's ref 7950 design. It's clear they clocked it too low when you can pick up a 900MHz version for $330 after rebate (though only two; the rest are $350). Then again, maybe they didn't want to show worse in the heat/noise/watts dept.

This still doesn't change my "review" (LOL) of your conclusion, though. 2560x1600+ is NOT what people are running, and the 660 Ti is still awesome at 1920x1200 and below for $300, and can be had at that price clocked far above your ref card reviewed here (as I just proved, the same can be said for 900MHz core clocks on AMD, but $330 is still 10% higher than $300, and all the heat/noise/watts still apply, only worse).

You started your review of the benchmarks with this (though after Crysis Warhead instead of Crysis 2): "For a $300 performance card the most important resolution is typically going to be 1920x1080/1200". You should have based your conclusion on that statement alone. It's TRUE. I already proved it from the steampowered hardware survey (30% use these two resolutions today, across ALL cards!). Throw out Crysis Warhead for Crysis 2 w/updates and your conclusion should have been very easy to make.

For the extra $30-50 a 7950 Boost costs, you can instead make sure you get an Ivy Bridge K chip (both are only $20 more than the regular i5 or i7) and have ultimate overclocking on your CPU when desired. You can already overclock either card for free (AMD or Nvidia). By the end of the article I think you forgot what the heck you were reviewing: two cards battling it out at 1920x1200. Your analysis after each benchmark seems to indicate you think these are 2560x1600 cards, and that that's the most important thing to remember here. Nope, it's NOT; by your own words earlier, it's really 1920x1200/1080. Along with the conclusions being off and the games being too few (and totally old & out of date in Crysis Warhead's case), you should have put in a 900MHz-clocked 7950 (you could have easily run the one you had at that speed to show what you can buy for $330). Who would piss away 10%+ clock speed on AMD's ref version when you can get multiple models after rebate for $10 more? $319 is the lowest 7950 and only at 800MHz, and a 900MHz version can be had for $10 more after rebate in 2 models. While AMD may have wanted it benchmarked this way for heat/noise, I think most would buy the 900MHz version. Maybe you should just rethink the whole idea of benching their ref versions altogether when they don't represent a real purchase at ref prices.

In the end though, this is just a misleading review currently, no matter how I cut it. Further, the 7950 (Boost or not) just isn't worth $330 or $350. It's hot and noisy, and uses more watts, for a heck of a lot of LOSING in the benchmarks by LARGE margins.

You guys are getting like Tomshardware, who blew their review by reducing overclocked cards to ref speeds (they had the $300 MSI 660 Ti I linked to at newegg @ 1019 core/1097 boost in their hands and didn't use it... ROFL). Why the heck even mention you have them if you're going to reduce every one of them before benching (AMD or Nvidia; they reduced all... LOL)? Where do you go for a GOOD review these days? Consider that if you bounce over to the Tomshardware review, people. The small print on their test setup page shows they reduce everything to ref... ROFLMAO... jeez. Would you review a Ferrari at 55-65mph because that's the speed limit? Heck no. I wouldn't tape myself driving on the street at 200mph, but I'd sure test it there without taping myself if I was benchmarking cars... LOL.

I'd rather see reviews based on cards you can purchase at the best speeds at the best pricing in both brands (AMD/NV) when they release a new product. Include their ref speeds, but also include what we'll buy. In AMD's case here, you'd have to work to buy a 7950 @ 800MHz (the only card at newegg for $319!). You'd have to ignore the same pricing at 850-900MHz next to the rest at $329+. Who does that? Heck, most are at $350, with the two I mentioned at 900MHz being rebated to $330... LOL. What would you buy, assuming you wanted a 7950 and disregarded the heat/noise/watts and the performance at its intended 1920x1200 res (meaning at this point you'd have to be an AMD fanboy, which I kind of admit I am... LOL)? You'd buy the 900MHz for $330 after rebate. If you want to ignore rebated products, the same would be true of my conclusions: $309 can get you a clock of almost Zotac 660 Ti AMP speed, as shown already, 1019 core/1097 boost. For anyone who likes paperwork, you'd get the same card for $10 off. Ignoring the rebate, it's still a no-brainer. I used to have to read 3 reviews to get a good picture of a product, but these days I have to read a good 10 reviews to get a TRUE picture, with all the review shenanigans going on. Reply

My goodness, the NVDA trolls are here in force on this launch. That immediately points to Green Team fear that this card just isn't going to cut it against the HD7950. Cerise Cogburn (aka Silicon Doc) has been banned from just about every tech site on the Net. Back for some more trolling before the ban, Doc? Reply

I absolutely agree. This Cerise Cogburn loser and his friend TheIdiot (oops, TheJian, stupid autocorrect) have been trolling harder than I've seen in a long time. Go home guys, you've contributed nothing.Reply

The article says that the 660 Ti is an average of 10-15% faster than the 7870, and that's true. But I feel that average doesn't reflect how close those two cards really are in most games. If you throw out the results for Portal 2 and Battlefield 3 (since they are nVidia blowouts), the 660 Ti is only about 5% faster than the 7870. Now obviously you can't just throw those results away because you don't like them, but if you're not playing BF3 or Portal 2, then the 660 Ti and the 7870 are actually very close. And given the recent price drop of the 7870, it would definitely win the price/performance mark. Reply

No PhysX, no adaptive v-sync, inferior 3D, inferior 3-panel gaming, no target frame rate, poorer IQ; the list goes on and on. You have to be a fanboy fool to buy amd, and there are a lot of fools around, you being one of them. Reply

PhysX is not that great. There is only a single game this year that will have PhysX support, and that is Borderlands 2. Most of the effects that PhysX adds are just smoke and more fluid and cloth dynamics, sometimes a slightly more destructible environment. Adaptive V-Sync is cool; I saw a demonstration video of it. Inferior 3D is true, although your next point is stupid: AMD's Eyefinity is much better than nVidia Surround. I'm not a fanboy. Go to the bench and look at the results; do the math if you want. Barring BF3 and Portal 2, again since they are huge wins for nVidia, every other game on the list is extremely close. Of the 35 benchmarks that were run, it's the 8 from BF3 and Portal 2 that completely blow the average. The 660 Ti is more powerful, but the 7870 is a lot closer to the 660 Ti than the average would lead you to believe. Reply
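The averaging argument in this post is easy to sanity-check. A minimal sketch with made-up per-game deltas (these are NOT the review's actual numbers; they only illustrate how two blowout titles drag a mean):

```python
# Hypothetical 660 Ti-vs-7870 leads, in percent (illustrative values only)
deltas = {
    "Portal 2": 40, "Battlefield 3": 30,      # the two blowout titles
    "Skyrim": 5, "Civ V": 4, "DiRT 3": 6,
    "Metro 2033": 3, "Crysis Warhead": 2,
}

overall_mean = sum(deltas.values()) / len(deltas)
without_outliers = [v for g, v in deltas.items()
                    if g not in ("Portal 2", "Battlefield 3")]
trimmed_mean = sum(without_outliers) / len(without_outliers)
# Here the plain mean is ~12.9% while the trimmed mean is 4.0%: two
# outsized wins roughly triple the apparent average lead.
```

Which number is the "right" one depends on whether you play the outlier titles, which is exactly the point being argued.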

Yeah, whatever. Buy the slow loser without the features, say they don't matter; get the one with crappy drivers, say that doesn't matter; throw out a few games, say they don't matter; ignore the driver support that goes back to the nVidia 6 series, that doesn't matter; ignore the pathetic release drivers of amd, say that doesn't matter; put in the screwy amd extra download junk for taskbar control in Eyefinity, pretend that doesn't matter; no bezel peek, pretend that doesn't matter...

So for you, the slower loser is just a little difference in frame rates, and only at the almighty 19XX x 1XXX res, where everything is playable with other cards too? What about when new titles come and stuff starts to go wrong with the 660 Ti? You can ignore the difference now, but future titles could go better for AMD with the OpenCL stuff. You should have said "a little slower now, and if we are lucky, still only a little slower in the future". Bullshit.

"without the features, say they don't matter"

I don't actually notice PhysX while playing... and if 2% of people play at very high res, how many do you think play with your marvelous nvidia 3D? Bullshit. It's like saying this is bad because only 2% use it, and this is good even though the percentage is even less. Bullshit.

"get the one with crappy drivers"

You read that a lot of people had amd driver issues; nice, but a lot of people also have nvidia driver issues... do you know the percentage of driver failures? The failures stand out only because normal working drivers don't draw attention. That does not mean it's plagued by bugs. Bullshit.

", say that doesn't matter.. throw out a few games, say they don't matter, ignore the driver support that goes back to the nVidia 6 series, that doesn't matter, ignore the pathetic release drivers of amd, say that doesn't matter... "

Hey, nice! I know how to repeat stuff I've already said without proving anything too! Look: bullshit, bullshit, bullshit. The games you think matter can still be played; it's future games that will tax these cards to new limits, and then we will see, and if those include OpenCL, where will your god be? "Well, I could play Battlefield 3 better some time ago; I'm sure these new games don't matter". Or maybe a "yeah whatever"? :)

And I'm tired now. I think this card is a fail; what does it do that existing cards didn't already do? What market do they cover that was not previously covered?

OH NO, BUT WE HAVE BETTER FPS FOR MAIN RESOLUTIONS. Well, good luck with that in the future... I'm sure a man will buy a good 7950 with factory OC that will go just about as well, still playable and nice, and when the future comes, then what? You can cry. Cry hard.

You cannot accept that your card is:

1. Easy to equalize in performance, with little performance difference in most games or actually none if OC is considered.
2. Focused on the marketing of some today games and completely forgot about future, memory bandwidth and so on.
3. Overly marketised by nvidia. Reply

1. Easy to equalize in performance, with little performance difference in most games or actually none if OC is considered.

I don't have a problem with that. The 660Ti is hitting 1300+ on core and 7000+ on memory, and so you have a problem with that. The general idea you state, though, I'M ALL FOR IT MAN!

A FEW FPS SHOULD NOT BE THE THING YOU FOCUS ON, ESPECIALLY WHEN #1! ALL FOR IT! 100%!

Thus we get down to the added features. Whoops! nVidia is about 10 ahead on that now. That settles it. Hello? Can YOU accept THAT? It FOLLOWS 100% from your #1. I'd like an answer about your acceptance level.

2. Focused on the marketing of some today games and completely forgot about future, memory bandwidth and so on.

Nope, it's already been proven it's a misnomer. Cores are gone, fps is too, before memory can be used. In the present, a bit faster now, cranked to the max, and FAILING on both sides with CURRENT GAMES - but some fantasy future is viable? It's already been aborted. You need to ACCEPT THAT FACT. The other possibility would be driver enhancements, but both sides do that, and usually nvidia does it much better, and SERVICES PAST CARDS all the way back to the 6 series AGP, so amd loses that battle "years down the road" too - dude... Accept or not? Those are current facts.

"3. Overly marketised by nvidia."

Okay, so whatever that means... all I see is insane amd fanboyism - that's the PR call of the loser - MARKETING to get their failure hyped - hence we see the mind-infected amd fanboys everywhere; in fact, you probably said that because you have the PR-pumped nVidia hatred. Here's an example of "marketised":
http://www.verdetrol.com/
ROFL - your few-and-far-between dollars still hard at work. AMD adverts at your butt in CCC - install and bang - the ads start flowing right onto your CCC screen... Is that "overly marketised"?

I absolutely will ignore driver support for the 6 series cards. If you are using an AGP card, it's really REALLY time to upgrade. You are just as bad a fanboy for nVidia as any AMD guy here, moron. You are completely ignoring anything good about AMD just because it has AMD attached to it. I'm completely confident that if AMD had introduced adaptive v-sync and PhysX, you would still say they suck, just because they came from AMD. If you read my post, it says that the 660 Ti IS more powerful than the 7870. I was just pointing out that they are closer than they seem. I have no nVidia hatred; they have a lot of cool stuff. And about the 660 Ti beating the 7950 at 5760x1080: look at the other three benchmarks, moron. The 7950 wins all of them, meaning BF3, Dirt 3, and Crysis 2. It only loses in Skyrim, by an average of 2 FPS. Why didn't you include those games in your response? And when I left the games out, I said that they merely blew the average out of proportion, but that you can't leave them out just because you want to. You still have to calculate them into the total. Moron. And for the record, I'm running a GTX 570, moron. Reply

Look, the amd crew, you, talk your crap of lies, then I correct you. That's why. Now, whatever you have that is "good by amd", go ahead and state it. Don't tell lies, don't spin, don't talk crap. I'm waiting... My guess is I'll have to correct your lies again, and your STUPID play-dumb amnesia. The reason one game was given with the 660 Ti winning at that highest resolution is very obvious, isn't it: your endless bud giradou or geradil or geritol, whatever his name is, was claiming that's the game he was buying the 7950 for... LOL ROFLMHO. Whatever, do your worst. Reply

" Fan noise of the card is very low in both idle and load, and temperatures are fine as well.Overall, MSI did an excellent job improving on the NVIDIA reference design, resulting in a significantly better card. The card's price of $330 is the same as all other GTX 660 Ti cards we reviewed today. At that price the card easily beats AMD's HD 7950 in all important criteria: performance, power, noise, heat, performance per Dollar, performance per Watt. "LOLpower target 175W LOL" It seems that MSI has added some secret sauce, no other board partner has, to their card's BIOS. One indicator of this is that they raised the card's default power limit from 130 W to 175 W, which will certainly help in many situations. During normal gaming, we see no increased power consumption due to this change. The card essentially uses the same power as other cards, but is faster - leading to improved performance per Watt.< br />Overclocking works great as well and reaches the highest real-life performance, despite not reaching the lowest GPU clock. This is certainly an interesting development. We will, hopefully, see more board partners pick up this change. "Uh OHbad news for you amd fanboys.....HAHAHHAHAHAHAAAAAAAAAAAAAAThe MSI 660Ti is uncorked from the bios !roflmaoReply

"I don't have a problem with that. 660Ti is hitting 1300+ on core and 7000+ on memory, and so you have a problem with that.The general idea you state, though I'M ALL FOR IT MAN!A FEW FPS SHOULD NOT BE THE THING YOU FOCUS ON, ESPECIALLY WHEN #1 ! ALL FOR IT ! 100% !"

So you don't have a problem with performance? Good, because that actually means it's a competitive card, not an OMFG card. And if you were going to OC a 660, you could just OC a 7950 instead, so I don't see the "OMFG nvidia is so much better".

"Thus we get down to the added features- whoops ! nVidia is about 10 ahead on that now. That settles it.Hello ? Can YOU accept THAT ?"

So essentially, when I ask how many people actually use 3D, because you seem to think the 2% on resolution is unimportant, your answer is "well, nvidia is 10 ahead because it has features, ACCEPT BLINDLY". Not smart.

"Nope, it's already been proven it's a misnomer. Cores are gone , fps is too, before memory can be used. In the present, a bit faster now, cranked to the max, and FAILING on both sides with CURRENT GAMES - but some fantasy future is viable ? It's already been aborted.You need to ACCEPT THAT FACT."

FPS are gone and the future is fantasy? amd cards still perform; they are very gpgpu focused and they do excellently at that, and they still don't have bad gaming performance while doing it, because you just buy a pre-OC version or something and you still get awesome performance (very similar to your 660ti god). Tell me what is not enjoyable while playing on an AMD card, mr fanboy.

And the future? Well, the future is gpgpu, because it allows big improvements in computing, yet it's "fantasy". Is it only unimportant because nvidia had good gpgpu in the past and not now?

"Okay, so whatever that means...all I see is insane amd fanboysim - that's the PR call of the loser - MARKETING to get their failure hyped.."

Yeah, calling fanboy before actually noticing that nvidia told the reviewers how to review the card so it looked better. Because get realistic: if they include a horrible AA technique for no reason at all, something is hiding under the table, you know. Haven't you noticed? There's a lot of discrepancy in the 660ti's benchmarks around the web, from sites where the 660 loses to 7870 Radeons to sites where it beats 7970s; there is not a single reliable review now. Do you want to see the truth? Buy a 660ti, a 7870 and a 7950, and compare the 3. You will have the truth: that they perform like they are priced, and AMD cards are not shit. Reply

Oh, stop the crap. nVidia is 10 features ahead. I'm not the one who talked about resolution usage, so you've got the wrong fellow there. 3D isn't the only feature... but then you know that, and will blabber like an idiot anyway. Go away. Reply

"I'm not the one who talked about resolution usage". You can't fault him for mixing up his trolls. Since almost everything you and TheJian have said is complete shit it's hard to keep track of who said what.And if you can objectively prove that I've lied about anything, I really would like to see it. And I mean objectively, not your usual response of entirely subjective 'AMD suckz lololol' presented in almost unreadably bad grammar.I take that back, I won't read it anyways, since I know already know it'll be an nVidia love fest regardless of what the facts state. And I'll reiterate that I'm using an nVidia card. Moron.Reply

Oh, it is not; he showed it all to be true, and so does the review, man. Get out of your freaking goggled amd fanboy gourd. Look, I just realized another thing that doesn't bode well for you. What nVidia did here was make a very good move, and the losses of amd on the Steam Hardware Survey at the top end are going to increase.... The amd fanboy is constantly crying about price; they're going to look at $299 with the excellent new game for free and PASS on the more expensive 7950 Russian is promoting EVEN MORE now. Here, let me get you the little info you're now curious about (I hope, but maybe you're just a scowling amd fanboy liar, still completely uninterested because you never got 1 fact, according to you. LOL, it's sad what you are, sad). Aug 15th 2012, prdola0: "Looking at Steam Survey, it is clear why AMD is so desperate. GTX680 has 0.90% share, while even the 7850 lineup has less, just 0.62%. If you look at the GTX670, it has 0.99%. The HD7970 has only 0.54%, about half of what GTX680 has, which is funny considering that the GTX680 is selling only half the time compared to HD7970. It means that GTX680 is selling 4 times faster." ROFL... No one is listening to you fools, Russian included... now it's going to GET WORSE for amd.... Reply

Okay, and that stupid 7950 boost really does use CRAPPY CHIPS from the low-end loser harvest, chips they had to OVER-VOLT just to reach their boost clock...

LOL. I mean, there it is, man - the same "rejected chips" junk amd fanboys always use to attack nVidia over lower-clocked, down-the-line variants has NOW COME TRUE IN FULL-BLOWN REALITY FOR AMD! Hahahaha. Holy moly.

Pity I hadn't dug a bit further and found this too... I just checked 3 sites and used them... LOL.

Even Crysis 2 is a wash - a ~1fps difference at 1920x1080 and above. Again we see below-30 minimum fps, even on the 7950B. It takes the 7970 to hit 30, and it won't stay there all day; it will likely dip under 30. Ryan comments a few times that your experience won't be great below 60, as those will dip :)

The 7950 (or the B) only rises with volts, and a lot of them have a hard time hitting over 1150MHz while drawing 80 watts more. Not good if that's how you have to clock your card just to keep up (or even to win - it's bad either way). The guru3d.com numbers came from a regular 7950, so it will have a hard time beating a 1300MHz card on NV's side by much. Memory can hit 7.71GHz, as shown at hardocp with ONE sample - must be pretty easy. Memory won't be an issue at 1920x1200, even less so at 1920x1080, and you can OC the mem any time you like :) Interesting article.
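Side note on why that memory OC matters: on a fixed-width bus, bandwidth scales directly with the effective memory clock. A rough sketch for a 192-bit card like the 660 Ti (the 6.008GHz stock effective rate is the usual spec; the 7.71GHz figure is the hardocp sample mentioned above):

```python
# GDDR5 bandwidth (GB/s) = effective data rate (GT/s) * bus width in bytes.
bus_width_bits = 192  # GTX 660 Ti memory bus

def bandwidth_gbs(effective_clock_ghz):
    return effective_clock_ghz * bus_width_bits / 8

print(round(bandwidth_gbs(6.008), 1))  # 144.2 GB/s at stock
print(round(bandwidth_gbs(7.71), 1))   # 185.0 GB/s at the OC'd sample's clock
```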

Again, 1322MHz core / 6.7GHz on the mem - above the Zotac Amp in both cases. Easy to hit 1300, I guess ;) and it still won't be as hot or noisy, or use as many watts, at those levels. Not that I'd run either card at max. They're all great cards - it's a consumer's dream right now - but NV just seems to be in the better position, and Ryan's comments were simply out of touch with reality.

Well, as far as overclocking goes, that's almost all amd people were left with once nVidia released the 600 series. All the old whines were gone, except a sort of memory whine. That gets proven absolutely worthless, but it never ends anyway. amd does not support their cards with drivers properly like nVidia does - that's just a fact they cannot get away from. No matter how many people claim it's a thing of the past, it comes up every single launch and then continues - and that INCLUDES the current/latest amd card released. So it's not a thing of the past, no matter how many amd liars say so; they're lying.

I saw this when their article hit, but here is a good laugh... After the you-know-who fans found it so much fun to attack nVidia about "rejected chips" that couldn't make the cut, look what those critics got from their mad amd masters: "These numbers paint an interesting picture, albeit not one that is particularly rosy. For the 7970 AMD was already working with top bin Tahiti GPUs, so to make a 7970GE they just needed to apply a bit more voltage and call it a day. The 7950 on the other hand is largely composed of salvaged GPUs that failed to meet 7970 specifications. GPUs that failed due to damaged units aren't such a big problem here, but GPUs that failed to meet clockspeed targets are another matter. As a result of the fact that AMD is working with salvaged GPUs, AMD has to apply a lot more voltage to a 7950 to guarantee that those poorly clocking GPUs will correctly hit the 925MHz boost clock." ROFLMHO - oh, that great, great 40% overclocker needs LOTS OF EXTRA VOLTAGE just TO HIT 925MHz... LOL. http://www.anandtech.com/show/6152/amd-announces-n... Oh man, you can't even make this stuff up! HAHAHA

Oh, you were comparing it to the 7950? I was promoting the 7870 :) In the Spanish forums they did their own kind of review, because they don't trust reviews from pages like this one, and a member's OC'd 7870 performs better than an OC'd 660 Ti.

So if we talk about the 7950:

The winner is clear: the 7950 wins. You're all about facts? Well, deal with this: Techpowerup tested the largest number of games and reviewed the 660 Ti in 4 different reviews, one for each edition. Talk all you want, nvidia fanboys, but Techpowerup showed that at your 1080p the 7950 is 5% slower than the 660 Ti - and then W1zzard himself has a forum post saying you have to assume a 5% performance increase for the 7950 to account for the boost he did not include. That yields equal performance on average. Not only that, but Tom's Hardware shows something you have forgotten: minimum FPS in games. It shows the 660 Ti's horrible minimum FPS, which indicates a very unstable card. My guess is your god card has very high highs thanks to the good GPU core, but when things get demanding the memory bandwidth can't keep pace, inducing lag segments.
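The "equal on average" arithmetic there is simple enough to check - here's a sketch using the percentages claimed above (which I haven't verified myself):

```python
# Normalize the 660 Ti's average 1080p result to 100.
gtx660ti = 100.0
hd7950_reviewed = gtx660ti * 0.95          # claim: 7950 is 5% slower as reviewed
hd7950_with_boost = hd7950_reviewed * 1.05  # claim: add ~5% for the missing boost
print(round(hd7950_with_boost, 2))  # 99.75 - effectively a tie
```

Note that -5% followed by +5% doesn't land exactly back at 100, but within a quarter of a percent it's a wash.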

It's easy: if they deliver the same average performance in games at almost the same price, the card that wins is the one with the better features. That means GPGPU, frame stability, and overclocking, which are by far more important than closed-source PhysX in 2 games every hundred years. Why? OpenCL is getting used more and more, and it's showing awesome results. Why do nvidia cards sell more? Well, they still tell reviewers how to review the card to make it look nice, they build huge hype around their products, and they have a huge fanbase that cannot see:

1- Nvidia is selling chips which only look good today, so they become obsolete faster and Nvidia can sell its next series better.
2- They are completely oblivious to the fact that they see amd cards from a non-objective point of view.
3- Proof of equally performing amd cards with more OC room is usually deflected by talking about the past and attacking the so-called amd fanboys, as follows:

"REALLY IS CRAPPY CHIPS from the low end loser harvest they had to OVER VOLT to get to their boost...

LOLLOL\OLOI mean there it is man - the same JUNK amd fanboys always use to attack nVida talking about rejected chips for lower clocked down the line variants has NOW COME TRUE IN FULL BLOWN REALITY FOR AMD....~!HAHHAHAHAHAHAHAHHAHAAHHAHAHAHAHAAomg !hahahahahahhahahaahhahahaahahaHoly moly. hahahahahhahahha"

Declaring chips bad without ever using them, manipulating info by showing only reviews that favor nvidia, exaggerating features that are not so important, and ignoring some that are.

Explain to me how what I quoted (as an example) changes the fact that I can go and buy a pre-OC'd 7950, get the same average performance per W1zzard's numbers, then OC even more and forget about the 660 Ti. Explain to me how that overly exaggerated laugh changes the Ti's minimum frame rates and somehow makes them good. It doesn't change anything, actually.

The only cure I see for you fan-guys is to get a 7950 and OC it, or buy a good pre-OC'd version; then you would stop complaining the moment you see it's not a bad card. Get the 660 Ti too, so you can compare. You will see no difference that could make you still think AMD cards are crap, you will not see the driver issues, you will notice that PhysX doesn't make the difference, and hopefully you will become a more balanced person.

I'm not a fanboy. I like nvidia cards - I have had a couple, and to me the 670 is a great card - but not this 660 Ti crap. I'm not a fanboy because I know how to tell when a company makes a meh release.

I've already had better, so you assume far too much and are, of course, a fool. YOU need to go get the card and see the driver problems PERSONALLY, instead of talking about two other people on some forum... get some personal experience. NEXT: check out the Civ 5 COMPUTE perf above - this site has the 6970 going up 6+ fps while the GTX 570 goes down 30 fps from the former bench... http://www.anandtech.com/show/4061/amds-radeon-hd-... LOL. No bias here... The 580, which was left out of this review's COMPUTE section, scored EQUIVALENT to the 7970 - 265.7 fps in December 2010. So you want to explain how the 570 goes down, the 580 is left out, and the amd card rises? Yeah, see... there ya go, fanboy - enjoy the cheatie lies.

The comment section is filled with delusional fanboys from both camps.

To the Nvidia fanboys: the 600 series is great when you get a working card - one that doesn't randomly start losing performance and then eventually refuse to work at all, doesn't Red Screen of Death, doesn't throw constant "Driver Stopped Responding" errors, etc. No review mentions these issues.

To the AMD fanboys: the drivers really do suck, the grey screen of death issue is/was a pain, and the card not responding after the monitors have been turned off at idle for however long also sounds like a PITA. Again, no review has ever mentioned these issues.

I've been using Nvidia for the majority of my time gaming, though I have used ATI/AMD as well. Neither one is perfect; both have moments where they just plain SUCK ASS!

I'm currently using 2 GTX 560 Tis and am considering up/sidegrading to a single 670/680 or 7970/7950, and during my research I've read horror stories about both the 600 series and the 7000 series. What's funny is everyone ALWAYS says "look at the reviews," none of which mention the failures from either camp. None speak to the reliability of the cards, because reviewers have and test them for what, a week at most?

Here's a good example: one of the fastest 670s was the Asus 670 DCII Top. It got rave reviews, but horrible user reviews because of reliability issues; it got discontinued and is no longer available at Newegg.

I can see why EVGA dropped their lifetime warranty.

All of this said, I'm actually leaning towards AMD this round. Sure, they have issues and even outright failures, but they aren't as prominent as the ones I'm reading about from Nvidia. I don't like feeling like I'm playing the lottery when buying a video card, and with the 600 series from Nvidia, that's the feeling I'm getting.

"Other Thoughts: This card at stock settings will beat a stock GTX680 in most games. I think this is the best deal for a video card at the moment.

I sold my 7970 and bought this as AMD's drivers are so bad right now. Anytime your computer sleeps it will crash, and I was experiencing blue screens in some games. I switched from 6970's in crossfire to the 7970 and wished I had my 6970's back because of the driver issues. This card however has been perfect so far and runs much much cooler than my 6970's! They would heat my office up 20 degrees!

I also have a 7770 in my HTPC and am experiencing driver issues with it as well. AMD really needs to get their act together with their driver releases!"

LOL - and I'm sure there isn't an amd model design that has been broken for a lot of purchasers... Sure... One card, and "others here are rabid fanboys" - well, if so, you're a rabid idiot.

lol, you've gotta be one of the most ridiculous, blind, hard-headed fanboy troll noobs I've ever seen on the internet. The amd 7 series are great cards atm, and at $300 for the 7950 I'm sure they make nvidia sweat. I myself am running a Gigabyte Windforce 660 Ti and am very happy with it, but my god can the 79xxs OC.

Have you not noticed your own constant posting of pro-Nvidia statements while bashing AMD at the same time? And I said delusional, not rabid. Though you may be on to something with that...

EVGA recalled a lot of 670 SCs and gave out FTW models (680 PCB) as replacements. Something about a "bad batch."

Maybe it's a partner problem, maybe it's an Nvidia problem - I don't know. But I do know Asus DCII cards get lots of low ratings regardless of whether they're AMD or Nvidia: the Asus 79xx DCII cards have 3 eggs or less overall, similar to the 6xx series from them. Gigabyte has better ratings and fewer negatives than Asus, MSI, and even EVGA on some models. So maybe it is a partner problem.

I also must be imagining my Nvidia TDR errors and drivers/cards crashing (with no recovery) while playing a simple game (Bejeweled 3, yeah, I know...) and occasionally other games as well, since Nvidia can do nothing wrong in the driver department, right? Just like my AMD friend seemed to think I was imagining my AMD driver issues when I had my HD 2900 Pro.

It's also funny that I'm being attacked by a "devoted Nvidia fan" when my friends usually consider me a "devoted Nvidia fan." Go figure. I've never been totally against any company - never anti-Intel or anti-AMD, or anti-Nvidia or anti-ATI/AMD. The only companies I have avoided are Hitachi, for their hard drives, and Intel initially, because honestly their stuff seemed overpriced during the P4 days.

Maybe I'm just getting cynical as I get older... but hard drives started becoming unreliable in the last couple of years, and now video cards are suffering more failures than I'm used to seeing. SSDs with Sandforce seem to suck ass reliability-wise as well; they're almost comparable with the 600 series - high speed and more failures than I'm comfortable with. Though in Nvidia's defense, even the 600 series isn't as bad as Sandforce or OCZ or Seagate.

I have purchased 2 of the new Nvidia 600-series cards. The first was a GTX 660 Ti, which runs fine thus far, though I would question whether it was worth its cost. I purchased a second, cheaper GTX 650 because the cost, with the promised free download of Assassin's Creed III, made it the best choice for a second computer's video card. The game download promotion is a lie: TigerDirect codes are not being honored by Nvidia, and although I informed TigerDirect, they are still selling these cards with the promise of a free game download. This is a dishonest promotion by both firms. Beware!

Why support firms that are promoting these video cards with a dishonest promotion? If the free download promotion has exceeded its allotment, then both firms should stop advertising it and promising a free game download they know is not going to be honored. Shame on them for promoting such lies to customers at Christmas. I will never buy from them again.