I agree, it seems to be a pretty standard refresh, except that whereas 460s tend to top out around 850MHz, these get much closer to 1GHz.

I hope all of the manufacturers learned their lesson from the rash of dying 460s a few months after launch and included heatsinks on the VRMs like Asus did. These GF114/GF104 cards draw too much current when overclocked for the manufacturers to leave the MOSFETs naked as they did on most launch 460s.

I also liked how the clock scaling was presented in the review; this is a good way to handle the non-standardized speeds. I'm sure you'll get the standard comment whiners screaming bias, but at this point I'm convinced they will do that whenever you show an Nvidia card even power on correctly.

I'm pretty sure they didn't learn too much, seeing what happened with TDP control on the 590... (i.e. nerf the card or else it's gonna blow up). Quite normal, though; trying to put two 350-watt GPUs on the same board was a ridiculous idea, since it's not supposed to be a hairdryer.

Why? Nvidia pretty much said last week that the target market for the GTX 560 is users who want an affordable card to play games at 1080p. Who would buy a $200 graphics card to play on a $1000+ 2560x1440/1600 display anyway? If you have that much money in your pocket for a high-quality display, why would you skimp on the graphics card?

Where exactly are you finding a $300ish 2560x monitor? IIRC, even the best sales Dell had on refurb 3007s only dropped as low as $800ish, and with the potential inventory of off-lease 3007s mostly gone by now, and the 3008s and 3011s being significantly more expensive, deals that good aren't likely to repeat themselves in the future.

Who would buy a $200 card to play on a single $150 monitor when the whole config costs $700+?

$200 is A_DECENT_AMOUNT_OF_MONEY for a graphics card; it means you're a gamer (maybe a poor one, though), and it means you might be interested in dual screens (meh, you spent $700 on the tower, why not 2x$150 for dual 22" 1080p monitors?).

I'm seeing quite a trend of AMD parts getting (relatively) better scores in more recent and demanding games, and I'm wondering if it's time to weight games differently for a better comparison.

For example here, in the important/demanding/modern games (let's take Metro 2033 and Crysis to have indisputable examples), the 560 never comes close to a 6950, and only the best version barely beats the 6870.

If somebody buys a graphics card today, it is probably to use it for at least another year, and in that sense the weight of less important games should be diminished a lot, including the HAWX 120fps-fest, other 100+ fps titles, and the clearly nVidia-favoring Civ5.

What is important here is that NO ONE has any interest in buying a GTX 560 today, because of the following extremely important points:

-> AMD offerings do better in more demanding games, and will thus do better in future games
-> AMD offerings (the 6950, for example) have more memory, which WILL be used in next-gen games for sure, as with every generation
-> No one cares if they have 110 or 120 fps in HAWX, which is a console game anyway

I believe the use of any PC component for gamers can be summarized, in the end, to this:

-> Can it play this game? 30+ fps
-> Can it play it well? 60+ fps

Because most people will still be using any of those components 2 years from now, the fact that older / less-demanding games get 110 fps is completely irrelevant; you might as well show 557 fps in Quake 3 as a benchmark...

In summary, could you AnandTech guys please tweak your test list / weighting in order to better inform the less-informed readers of your website?
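The weighting I'm suggesting could look something like this: a toy weighted geometric mean of per-game relative performance, where demanding titles count more and fps-fests count less. To be clear, every ratio and weight below is made up, purely to illustrate the idea, not taken from any review:

```python
import math

# Relative performance of hypothetical card A vs. card B (1.0 = tie),
# paired with a weight. Demanding titles get full weight; fps-fests get less.
# Every number here is invented, purely to illustrate the weighting idea.
results = {
    "Crysis: Warhead": (0.90, 3.0),  # demanding: full weight
    "Metro 2033":      (0.92, 3.0),
    "Civilization V":  (1.15, 1.0),  # architecture-favoring: reduced weight
    "HAWX":            (1.10, 0.5),  # 100+ fps either way: nearly irrelevant
}

def weighted_geomean(scores):
    """Weighted geometric mean of the relative-performance ratios."""
    total_w = sum(w for _, w in scores.values())
    log_sum = sum(w * math.log(r) for r, w in scores.values())
    return math.exp(log_sum / total_w)

# A plain average of the ratios looks like a near-tie; once the demanding
# titles dominate the weights, card A lands clearly behind.
print(f"{weighted_geomean(results):.3f}")
```

With these made-up numbers the result comes out below 1.0, i.e. the fps-fest wins stop masking the deficits in the demanding games.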

It is utter nonsense to state today that a 560 Ti "trades blows" with a 6950, or that a factory-OC'd 560 "comes close" to a 6950.

The one real, true fact is that the 6950 gets a huge win in all demanding titles, has 1GB more VRAM, and can even unlock + OC very easily to levels none of the 560 versions can reach.

nVidia has done some great stuff in the past, but one has to admit that outside of quad-SLI GTX 580s there is no reason to buy anything nVidia this round, as AMD offers better performance and performance/watt at every price point this time around.

There is one argument for nVidia, and that argument (no, not the drivers, because you do NOT play on Linux) is the nVidia goodies like 3D gaming and other minor stuff.

I half agree with you... some of your commentary is good (HAWX, lol), but one particular conclusion is not tenable:

"AMD offerings do better in more demanding games, and will thus do better in future games"

When Mass Effect 3 comes out, I expect that, like Mass Effect 2, it will strongly favor nVidia GPUs - unless they rewrote the entire engine.

New games cannot be classified into demanding vs. non-demanding - each game engine has its favorite factors, be it clock speed, memory bandwidth, stream processors, ROPs, etc., so I expect each game will have its favorite card.

The problem is that if they DON'T completely rewrite the entire engine, Mass Effect 3 will continue to be a festival of even mid-range cards breaking 60 FPS. While there's nothing wrong with that per se - ME2 is one of the better-looking games out there despite not being particularly intensive, after all - it still means that nVidia's slight advantage over AMD in that game is meaningless. Compare that to Crysis, where even the 6970 falls short of 60 FPS at WUXGA, and the sizable lead AMD carries over the competition there has a real, noticeable impact on the game.

Of course money and politics play a part, but that has no importance to gamers; they just want the best out of what they pay for, and if some company plays politics better to offer a better result, so be it.

OK. I get it. If a game runs better on Nvidia we should just consider it old trash and never benchmark it. Regardless of the fact that it's arguably one of the most important franchises in history: CIVILIZATION 5 (released Sept 21, 2010, with expansion packs released even later, I'm sure - there are usually 2 packs). It was released 2 years after your Crysis Warhead (2008), and Civ5 will continue to be relevant for another year or two (if not more; people play these for years).

Sid Meier didn't make Nvidia's cards run better on it either. Nvidia let out MULTITHREADED RENDERING. Umm, that's a DirectX 11 feature, isn't it? But isn't Civ5 an old junk game we shouldn't benchmark? LOL.
http://forums.anandtech.com/showpost.php?p=3152067...
Ryan Smith (from here at AT) explaining just a month ago (a month ago exactly, as of today) how Nvidia sped up Civ5. It doesn't favor them; they are just making their cards RUN THE WAY THEY SHOULD. Full performance, with multithreading. NOTE: Ryan says it's NOT some sweet cheat they came up with. The cards should run like this!

"For example here, on the important/demanding/modern games". Again I guess we have a different definition of modern. 2008 vs. 2010. I'd say you got it backwards, they didn't test crysis 2 here. Read the comments from Ryan. This game can harness 12 threads from the cpu. Firaxis thinks they're gpu limited! Check this:

"At this point in time we appear to be GPU limited, but we may also be CPU limited. Firaxis says Civ V can scale to 12 threads; this would be a hex-core CPU with hyperthreading. Our testbed is only a quad-core CPU with HT, meaning we probably aren't maxing out Civ V on the CPU side. And even with HT, it's likely that 12 real cores would improve on performance relative to 6 cores + HT. Firaxis believes they're GPU limited, but it's hard to definitively tell which it is."

Nope, not very modern. Dump this game, AT. ;) Obviously HAWX is old (Tom's tests HAWX 2, but the game sucks). Personally I'd rather benchmark based on popularity. How many of us ever played HAWX or HAWX 2? If a large portion of us are playing a game, you should benchmark it (of course you can only bench so many; the point is, don't bench it if nobody plays it). I agree you should just NOT benchmark anything over 200fps (draw the line where you want). At that speed nobody has to worry about playing on even a $100 card, most likely, so nobody cares (can you even call yourself a gamer if you don't have at least a discrete card? a $50-$100 card?).

Metro 2033 is 6 months OLDER than Civ5 and scored an average of 10 points less at Metacritic than Civ5. Nuff said? Will you still be complaining about Civ5 when AMD turns on multithreading on their cards? LOL.

Maybe they should test Crysis 2? Head over to Tom's Hardware, where the 560 beats your beloved 6870 by a good 10% at 1920x1200, even more when you look at the MINIMUM FPS. That, to me, is far more important than any other number, as games suck below 30fps (AMD's cards will dip there where Nvidia's usually dip less), but everyone can play fine at 200fps.

Crimson117 pretty much said it best, describing a game's favorite features and why games run better on one architecture or another. If memory serves, Mass Effect 1/2/3 are based on the Unreal Engine 3 (Unreal Engine 4 coming 2012, I think, but totally aimed at the Xbox 720, or whatever they call it, so says Sweeney). Many games in development are based on that engine, so it's a good indicator for Unreal Engine 3 games, current and future.

The 6950 is not the direct competition for the 560 (oh, and are you completely unaware of the fact that half of the cards for sale on Newegg only have 1GB of memory, just like the 560?... whatever). You do have to pay extra for 2GB, which starts at about $240. Meanwhile Newegg has a good half dozen 560s for under $200. Apples/oranges? Oh, and everyone who buys a new card hopes it will still be relevant at least a year later... I thought that went without saying?

"The one real true fact is the 6950 gets a huge win in all demanding titles, has 1GB more vRAM and can even unlock+OC very easily to levels none of the 560 versions can reach."

Umm, correct, because it's another $40; I'd expect that. That's 20% more cost. Does it perform 20% better in every game? It must not, or AT wouldn't be saying it "trades blows with the 6950", correct? Dirt 2 from Dec 2009 (just 3 months older than Metro 2033) is dominated by Nvidia. Sorry. I suspect Dirt 3 will be too. Wolfenstein (representing id's Tech 4 engine, until RAGE) is again dominated by Nvidia (no surprise, it's OpenGL-based, which Nvidia has always been good at). RAGE could change all that, but currently (meaning NOW and RELEVANT) it's Tech 4. Again, Wolfenstein's not much older (Aug 2009) than Metro 2033 (about 7 months?). In DirectCompute for DirectX 11 it looks like Nvidia dominates the 6970 too (even the base model beats it). Hmm. They don't have DX12 yet, right?

At idle the card is great power-wise (leads all); under Crysis it looks fine at load (overclocking only raises it 10W, very nice - AMD obviously still has a watts advantage in my mind, but it's a nice improvement over the 460 perf/watt-wise). Great idle noise (80% of my PC time isn't in games); leads all. Load noise blows away my 5850, and it's nowhere near driving me out of my room. I love the 5850. So I'd happily take a 560 at 3-5 fewer dB of noise if I were in the market now.

Have you seen PhysX in Alice? Very cool. I don't think AMD has that (wish NV would give or license it to them). Watch E3; it looks like some of Nvidia's stuff isn't bad... :) DUKE in 3D? Nvidia's pretty cool?

FYI: as noted, I own a Radeon 5850 from Amazon.com :) (it took them 7-8 months to fill my backorder, but $260 was still $40 less at the time). I own it purely because it was the winner on noise/heat at the time, the price was awesome, and perf was a wash and really depended on your favorite games. Both sides have great tech; it just depends on what you play. I'll be buying something in Dec, but honestly have no idea which side will get my money NOW. It will depend on what I'm playing THEN... LOL. OH, and some people do use Linux. :) I'm more than happy about the fact that AT sees fit not to make their benchmarking picks based on making AMD a winner every time... Thanks, AT. Between here, Tom's and a few others I cover about 40 games and always know which card is best for me at the time I buy. :) That's the point of AT etc., right? Ahh... trolls... fanboys... :(

Interesting reply, but I am no AMD fanboy; I'm a bang-for-the-buck fanboy, that's all. Quick reply to your stuff:

Civ5 is older tech than Metro 2033; nobody cares about the release date. Civ5 is also NOT a very relevant game, as there are others in the RTS genre with much, much more success (SC2, anyone?).

Dirt 2 is also irrelevant, as it suffers from the "lots-of-fps" syndrome, which does not let any card show a big difference at all.

Wolfenstein is as relevant as it is on my HDD, sitting there not being played.

Where I cite the 6950 as competition (I can buy the 2GB version for $215, thanks), I assume we are talking about the upper-range 560 OC and 560 Ti, which are the cards I talk about when comparing apples to apples.

In summary, your pricing argument does NOT stand (half the cards on Newegg are all shit cards; no 560 Ti and no Asus 560 OC).

While it is important to keep an eye on games that favor one architecture or another, these "wins" should definitely weigh less, especially when they are temporary (as you said about Civ5, Nvidia released a special patch just for that game; maybe AMD will do the same, who knows), be they AMD or Nvidia wins.

I like nVidia, I like their Fermi idea, I like a lot of the stuff they do, but that does not change a thing about the current gamer graphics market: nVidia = more watts and more dollars for the same average performance in modern games.

And no, Civ5 will not be relevant a year from now; it's one of those games that die quickly. It's not made by Blizzard and it's not named SC2; the next few decent RTSes will wash it out, as happened with all past Civs and "normal" RTSes.

Bang-for-buck boy wouldn't make an across-the-board statement about Nvidia's product line being all trash. There are more than a few great buys on either side and not a lot of competition in the $200 range for the card reviewed here (so says AT in this article). Steam shows stats, and Civ5 is in the top 10 EVERY NIGHT. Just checked: 16,000 playing NOW, and that usually doubles or so at night (last I checked, anyway, months ago - but the daytime stats look pretty much the same). As soon as an expansion comes out these numbers will swell again. If people can be shown to be playing it NOW (every night), I think it's relevant. Note Metro 2033 isn't even there; oops, my mistake... 500 peak players (that's it, down near the bottom in like 98th place)... ROFL. Really? Civ4: Beyond the Sword has more people playing right now. People just quit playing, though, because Civ5 sucks. /sarcasm/

Civ5's replayability is awesome; people don't quit playing it quickly, and expansion packs add even more to it. Agreed, bench SC2 also. We're clear on Civ5 not being a forgotten game now, right? Top 10 played games every night on Steam. Nuff said? It's not Starcraft 2, you're right; it won't be played for 10 years, probably. But I'd venture to guess they'll play it at least as long as Civ4 (yep, still being played MORE than Metro 2033... ROFL). Its first expansion isn't even out yet.

Wolfenstein (like it or not) is the latest OpenGL game out, so it's a good indicator of OpenGL performance (the only indicator we can get now, until Tech 5 or another OpenGL game comes out). This is it. It's not really about the game so much as the engine under it. When possible they test games based on engines that are being used now and in many future games. Prey 2 and Brink use it too; Prey 2 is coming in 2012, Brink of course is just out. Tech 4 still relevant, eh? Games are still being made with it. We shouldn't bench games that will give us an idea of a future game's performance, though... this site's for benching games you like. Test Brink, then. You're just not making a good argument yet :)

Please provide the LINK for the $215 2GB 6950 Radeon? I can't get it off Newegg for less than $239 after rebates.

"AS a summary, your pricing argument does NOT stand (1/2 cards on newegg = all shit cards, no 560ti and no asus 560OC)."

I wonder about the clocks on your card, if you actually give me a link to it.

Um, I quoted the "$hit cards", as you call them, because that's what AT is reviewing here today. It's a review of the GTX 560, not the 560 Ti (I never compared the Ti; it's not the reviewed product). The ASUS TOP card is only $220 at Newegg. ASUS also has a MID card, clocked higher than tested here, that's only $199! The GTX 560 Mid tested was 850/4104, but this card is 850/4200. Not much faster (100MHz on the memory), but not a BASE card by any means.

Uh, it's completely UNIMPORTANT to keep an eye on games that favor one architecture over the other and then weight them. LOL. Your problem is that, as a fanboy, you just can't get over any game running better on your enemy's product. It IS important to be testing games using today's engines (which will be used tomorrow in games based on said engines), or at least the top games out now that are heavily played. I don't care who's better at what, and if one wins over the other after benching it, we don't just throw it out because NV happens to win it.

No, I didn't say Nvidia released a special patch. Nvidia FIXED their drivers to work properly; AMD is currently trying to do the same. They're late. Sorry. Nvidia got there first (and it should affect many other games; it's not a patch for this game, it's fully in the DX driver info box now). AMD is failing to produce a multithreading driver for DX currently, so NV looks good in Civ5 (for now). Jeez, did you read Ryan's post? It's NOT a cheat patch or something.

I guess you should just make up a list of games for them that run great on AMD. You'll be happy then :) I won't care, as long as you can prove they're very popular and played a lot now, or based on an engine that will be in games coming up shortly. FYI: Dirt 2 is a DirectX 11 racer. The EGO engine is used in Race Driver: GRID, F1 2010 and Flashpoint: Dragon Rising as well. Oh, and Dirt 3, GRID 2, F1 2011 and Flashpoint: Red River will be running EGO 2.0. Yep, games not out yet. So by testing one game we have a decent idea of how 8 EGO-engine games will play, though F1 2010 is probably a better indicator, as it runs EGO 1.5. Still, it's all in the family here, and very relevant. The GTX 580 gets just over 110fps; the tested card is around 72. I wouldn't call this a 500fps waste of a game. Don't forget they aren't posting MINIMUMs here (occasionally, but I wish they ALWAYS included them, as it affects gameplay below 30fps).

There are no great buys on the nVidia side at the moment; just go read all the relevant reviews and compare.

16,000 people, double at night? lol???

When I played EVE Online we were over 40,000 at night, and it's surely grown bigger since... your game is irrelevant, thanks.

Metro 2033 not being popular because it's not a multiplayer game makes no difference: it IS a demanding game and a GOOD benchmark; no one cares if it sucks or not. It's like Far Cry... it wasn't a very good game, but it was a perfect benchmark.

A game cannot be considered relevant or "played" when you have fewer than 5,000 people playing it online... otherwise you could go ahead and say Command & Conquer Generals: Zero Hour is still alive... and I can assure you it's totally dead, swarmed by cheaters and unpatched flaws.

Wolfenstein didn't seem all that beautiful to me.

The fact that you are unable to find good suppliers is your problem, not mine; I'm not going to give you everything I searched for just because you're flaming around.

Well, base cards, shit cards, whatever cards: all cards but the top one don't ever come close to an HD 6950 that costs at most $20 more. And if you're looking on the cheap side, the 6870 eats the lowish 560s... done.

Top games being played has nothing to do with benchmarking (cf. Far Cry, Crysis...). Do you play Unigine a lot? ^^ Games strongly favoring one architecture should be weighted less, because they represent not the GPU but rather the driver / graphics calls picked for the engine.

The fact that nVidia patched is great; AMD will too, and thus it is not very relevant to review a game with and without patched drivers to compare two HARDWARE pieces, even if it's clear AMD tends to be late on driver patches.

My local hardware dealer has the new 560 (non-Ti) in stock today. True, only two models so far, but the cheapest 560 in stock costs less than the cheapest 6870 in stock, and even less than some of the 6850s. And that's the issue: graphics cards are about price points. It's no good going on about the AMD 6950 to buyers who are only looking at choosing either a 560 or a 6870, because both are around the same price point. And as already said, the 560 today is actually at a better price point at this dealer than any 6870.

One reason why the majority of discrete desktop graphics card buyers continue to purchase nVidia is the quality of the drivers. Or the continuing issues with AMD drivers. There is an example here http://forum.team-mediaportal.com/746244-post1.htm... of a guy who's having problems rendering a web page with a 6950. Yes, that's right, a web page; now don't go playing games with it, will you... What he is seeing is partly because the page has a Shockwave slideshow. He complains that GPU usage constantly fluctuates between 0-8-18-44% with a 6950. With a GTX 550 Ti, GPU usage figures are 0-2-11% - that's right, only 11% with a GTX 550 Ti compared to 44% with an AMD 6950.

Meh. That's not a real issue, as can be seen from the replies. Following up on it, I'd speculate that the only reason the single-sample load spikes show up at 175MHz but not at 450MHz is that at the higher clock the render takes significantly less than the sample interval, so the spikes get largely smoothed into the average (as suggested by the smaller load spikes).
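To illustrate the speculation: a toy calculation of how a fixed sampling window smooths a short burst of work. The window length and render times below are made up, not measured from the actual monitoring tool:

```python
# One burst of GPU work, reported as average utilization over a fixed
# sampling window. At a low clock the render fills most of the window
# (a visible spike); at a high clock the same work finishes quickly and
# gets smoothed into the average. All timings here are invented.
SAMPLE_MS = 100.0  # hypothetical sampling window of the monitoring tool

def sampled_load(busy_ms):
    """Average utilization (%) reported for one sample window."""
    return min(busy_ms, SAMPLE_MS) / SAMPLE_MS * 100.0

render_ms_low_clock = 90.0   # render takes ~90 ms at the low clock (made up)
render_ms_high_clock = 18.0  # same work at a much higher clock (made up)

print(f"low clock:  {sampled_load(render_ms_low_clock):.0f}% reported")
print(f"high clock: {sampled_load(render_ms_high_clock):.0f}% reported")
```

Same amount of work either way; only the fraction of the sampling window it occupies changes, which is enough to explain big reported spikes at the low clock and small ones at the high clock.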

There is one real issue in their drivers that's been annoying me significantly since I got my 5870 last summer. When running 2 monitors and a GPU app, whenever the app completes a work unit the card drops down to single-monitor speed for a second or three, and ghost images from the top of one monitor flicker on the 2nd until the new task starts and the clocks throttle back up. I've made it go away by creating a profile and manually editing its config file so that the single-monitor settings (which it enters despite 2 attached monitors) are no lower than the others. This makes the problem go away, but really...

I suggest reading more than one review before deciding which card is better. I do not get the same results as AnandTech does on the same benchmarks; I have a different platform and get different results. Their Bench is useful for getting a rough idea, but there are a lot of other sites to check, and not all show the same AMD scores gotten here.

I have tried to use AMD cards in the past. I really have, but with a 5850 I could not get it to work over a DVI connection, and with a 3450 it would bluescreen whenever I played a DVD. Both were sent back. All the other cards I have had (Nvidia) just worked! I even recently looked at getting a 6950, but too many comments on Newegg were about the lousy drivers (too many for me, anyway). I just couldn't do it.

I don't know where you guys are getting this information, but the Radeon HD 6870 IS NOT at $180. Therefore, you shouldn't let that sway your opinion about this card. The Radeon HD 6870 is a $200 card, as is the GTX 560. Folders and gamers should go for the GTX 560, while people that want higher efficiency and same performance should go for the Radeon HD 6870.

That alone leaves me with a very bad taste about this article. I suggest you read the reviews by Tom's Hardware and TechPowerUp instead.

Witcher 2 doesn't use the Aurora engine; it's all new now, so wait for it to show up in some sites' benchmarking. Same with Skyrim: it uses the new Creation engine, with 100% dynamic lighting and lots of snow and cliffs (which that engine is designed for).

Sorry. There's not much worth extrapolating, other than that we have no idea who will win later :) If we're being honest, anyway. I suppose a quick Google might get a hit on the devs' opinions.

Your comments about confusing naming are very valid. I've long ago come to the conclusion that the confusing naming is a deliberate strategy by Nvidia. They WANT to create confusion, to make it more difficult for less sophisticated buyers to compare cards head to head, and Nvidia can thereby pick up a few more sales than they would otherwise. The proof that this is deliberate is the fact that they keep doing it. Otherwise, they would have very straightforward, easily compared naming conventions: higher numbers = more power, GT not as powerful as GTX, etc. An unfortunate state of affairs, but not about to change, even with writers often complaining about it. Fortunately, there are resources on the web that compare cards head to head.

They're not trying to confuse you; they're just trying to sell every chip they can. A lot of dies have defects etc. that cause them to release a plethora of cards at different speeds, with features disabled (possibly due to defects in the dies), and so on. Die shrinks cause problems too; sometimes they save enough in power/heat to warrant a new release number or model. Take the GTX 260: the Core 216 came out, fixed the heat issues and was a good 10% faster. People would want to identify the faster/cooler cards and not get screwed. I hate motherboard makers not listing the REV prominently on the box, or in ads; it's tough to buy online when I'm after a specific rev. This is more a tech issue than a company deliberately ticking us off.

If you don't mind paying MUCH higher prices, they can go ahead and toss all defective dies and get back to 3 product lines with easily seen performance differences between the 3. AMD, Intel, Nvidia: they all have this problem. Of course progress would really slow down if they took this route. A person going into the store and seeing a 6750 card might find a 5850 sitting next to it for $200 and wonder what the heck is going on... LOL. I could almost say the same about the 6850. That 6850 should blow away a 5850, I mean it's a whole 1000 higher, right? Confusing, yes? But that 5850 beats the 6850 by about 10% in everything. There are a lot of these examples. Heck, this time NV let the manufacturers decide everything (clock/memory/ref design).

In an age of small margins, just about everything in your PC being a commodity, and shareholders demanding every last dollar they can get from company X, you should just get used to tons of products not performing too differently. Really, I can make up my mind in one night of reading reviews on 3-4 websites. By the end of the night I can decide how to spend my money and be fairly certain I'm not making a big mistake. But yeah, if you're not willing to do some homework, get ready to buy something completely disappointing on occasion. But you're already here, so no worries :) We have hardware review sites because store shelves and floor reps at Fry's don't help us at all... :) I pity the marketing departments trying to work with all these dies/re-launches/binning schemes that probably cause them nightmares... LOL. Could they do better here and there? Probably. Would I like to try to make us all happy? HECK NO. :) I take that back, I wouldn't mind taking a crack at Intel naming. :)

Thanks for your reviews. On page 3 you write "on the ATI side we’re using the Catalyst 11.5a hotfix"...but is that the case for all the AMD cards? The same page lists three drivers being used: Catalyst 10.10e, 11.4, and 11.5a. And for the Nvidia cards, you also list three drivers: 262.99, 270.51 beta, and 275.20 beta. If you could help, I'd specifically like to know which drivers were used for the GTX 580 and the Radeon 6970. And since I'm going to be running at 2560x1600, it would also help to know which drivers were used for those 2 cards in your March 24 review (of the GTX 590), since that review included that resolution. My thinking is that if the drivers are reasonably current for both cards, then it is closer to being 'apples to apples'.

As for the March 24th review of the GTX 590, all the high end cards were on 266/267 drivers or the Catalyst 10.4 preview drivers respectively. Those were the newest drivers at the time of that publication.

And I apologize for the somewhat chaotic nature of the driver selection. We benchmark many different cards, redoing them for every single driver revision simply isn't practical. The relevant cards will be updated for any given article, and many (if not all) of the cards are updated if there's a major driver release that significantly impacts performance.

Can you show me ONE review in which you did the same for AMD? Including a factory-OCed card in a general review and comparing it to stock Nvidia cards?

Are you trying to inform your readers, or to pander to Nvidia by following their "review guides" to the letter? No transparency in the game selection (again that TWIMTBP-heavy list), OCed cards; what's next? Changing the site's color from blue to green? Letting the people at Nvidia do your reviews while you just post them here?

NV didn't send them a card. There is no reference design for this card (reference clocks, but no reference card). They tested at 3 speeds, giving us a pretty good idea of the multiple variants you'd see in stores. What more do you want?

Nvidia didn't have anything to do with the article. They put NV's name on the SLOWER speeds in the charts, but NV didn't send them a card. Get it? They down-clocked the card ASUS sent to show what NV would expect cards to be clocked at on the shelves. AT smartly took what they had to work with (a single 560 from ASUS - mentioned as the reason there are no 560 SLI benchmarks), clocked it at the speeds you'd actually buy (based on checking specs online), and gave us the best idea they could of what to expect on a shelf or from an OEM.

Or am I just missing something in this article?

Is it AnandTech's problem that NV has spent money on devs, trying to get them to pay attention to their tech? AMD could do this stuff too if they weren't losing $6 billion over 3 years' time (more?). I'm sure they do it some, but obviously a PROFITABLE company (for many more years than AMD - AMD hasn't made a dime over its existence as a whole), with cash in the bank and no debt, can spend money on game devs and give them engineers to help with drivers etc.

I tend to agree with the link below. We'd have far more console ports if PC companies (Intel,AMD,Nvidia) didn't hand some money over to devs in some way shape or form. For example, engineers working with said game companies etc to optimize for new tech etc. We should thank them (any company that practices this). This makes better PC games.

Many more sites report on both sides' "enhancements" to games via working with devs. It's not straight cash they give, but usually stuff like engineers, help with promotions, etc. With Batman, NV engineers wrote the AA part for their cards in the game, and it looks better too. AMD was offered the same (but probably couldn't afford it, so they just complained, saying "they made it not work like our cards"). Whatever. They paid, you didn't, so it runs better on their cards with AA. It's more on NV's side due to more money, but AMD does this too.

I expect to be using Crysis 1 for quite a bit longer. It's still the de facto outdoor video card killer. The fact that it still takes more than $500 in GPUs to run it at 1920 with full Enthusiast settings and AA means it's still a very challenging game.

Crysis 2 I'm not even looking at until the DX11 update comes out. We want to include more games that fully utilize the features of today's GPUs, not fewer.

LA Noire isn't out on the PC. In fact based on Rockstar's history with their last game, I'm not sure we'll ever see it.

In any case, the GPU test suite is due for a refresh in the next month. We cycle it roughly every 6 months, though we don't replace every single game every time.

Ignore what's on the box. Go to Metacritic and pick the top-scoring games from the last 12 months. If a game doesn't score 80+/100, you pass; below that, not enough people like or play it. You could toss in a beta of Duke Forever or something like that if you can find a popular game that's about to come out and has a benchmark built in. There are only a few games that need to go anyway (mostly because newer entries in the same series are out, like Crysis 2 once its DX11 update is released).
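The selection rule described above (keep only games scoring 80+, prefer ones with a built-in benchmark) can be sketched in a few lines. The titles, scores, and `has_benchmark` flags below are made up for illustration, not looked up from Metacritic:

```python
# Hypothetical candidate pool: (title, Metacritic score, has built-in benchmark).
# All values are illustrative assumptions, not real scores.
CANDIDATES = [
    ("Game A", 91, True),
    ("Game B", 78, True),   # scores below the cutoff get dropped
    ("Game C", 84, False),  # no built-in benchmark, so harder to test repeatably
    ("Game D", 88, True),
]

def pick_benchmark_games(candidates, cutoff=80):
    """Keep well-reviewed games that also ship a built-in benchmark,
    sorted best-reviewed first."""
    keep = [(title, score) for title, score, has_benchmark in candidates
            if score >= cutoff and has_benchmark]
    return sorted(keep, key=lambda pair: pair[1], reverse=True)

print(pick_benchmark_games(CANDIDATES))
```

With the sample pool above, only "Game A" and "Game D" survive both filters; raising `cutoff` tightens the panel further.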

Unfortunately mosox, you can't make an AMD list (not a long one, anyway), as they aren't as friendly with devs (no money or free manpower, duh), and devs will spend more time optimizing for the people who give them the most help. Plain and simple. If you reversed the balance sheets, AMD would be doing the same thing (more of it than they do now, anyway).

In 2009, when this cropped up, Nvidia had 220 people in a department that was purely losing money (more now?). The joke was that they never made Nvidia any money, but they were supplying devs with people who would create PhysX effects, performance enhancements, etc., to get games to take advantage of Nvidia's features. I don't have any problem with that unless AMD doesn't have the option to send over people to do the same. AFAIK they are always offered, but can't afford it, decline, and then whine about Nvidia. Now, if NV says "Mr. Gamemaker, you can't let AMD optimize because you signed with us"... OK, lawsuit.Reply

I don't want "AMD games"; that would be the same thing. I just don't want obscure games that are fishy and biased either.

Games in which a GTX 460 768MB is better than an HD 6870 AND that aren't even popular, but are in there to skew the final results, exactly like in this review. Take out HAWX 2 and LP2 and do the performance summary again.
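The "redo the summary without those two games" exercise is easy to sketch: review summaries of this kind are typically a geometric mean of per-game FPS ratios, so dropping games changes the headline number. The FPS figures below are invented for illustration and are not taken from the review:

```python
from math import prod

# Hypothetical per-game FPS for two cards; numbers are made up, not from the review.
results = {
    "Game A": {"GTX 560": 62.0, "HD 6870": 60.0},
    "Game B": {"GTX 560": 55.0, "HD 6870": 58.0},
    "HAWX 2": {"GTX 560": 120.0, "HD 6870": 80.0},   # heavily NV-leaning in this sketch
    "LP2":    {"GTX 560": 48.0, "HD 6870": 38.0},    # likewise
}

def summary(results, exclude=()):
    """Geometric mean of per-game GTX 560 / HD 6870 FPS ratios.
    A value above 1.0 means the GTX 560 leads on average."""
    ratios = [fps["GTX 560"] / fps["HD 6870"]
              for game, fps in results.items() if game not in exclude]
    return prod(ratios) ** (1 / len(ratios))

print(round(summary(results), 3))                              # with all games
print(round(summary(results, exclude={"HAWX 2", "LP2"}), 3))   # contested games removed
```

With these invented numbers, the full panel shows the GTX 560 ahead on average, while the trimmed panel flips slightly the other way, which is exactly the sensitivity to game selection being argued about here.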

Lately, in every single review, you can see some Nvidia cards beating their AMD counterparts by only 2-5%, purely because of the biased game selection.

A HW site has to choose between being fair and unbiased to serve its readers, or selling out to some company and becoming a shill for it.

See my other posts. Nvidia finally got multithreaded rendering finished in their drivers (which should affect many games now; it's not game-specific). Expect AMD to get theirs done soon. This isn't Civ5 or Anandtech favoring NV; NV just beat AMD to the punch in getting the drivers finished. If AMD takes another year to get their drivers done, I'm glad Anandtech reports this. I hope AMD gets them done soon, or the next set of cards that get benched might show quite a few games with AMD bunched at the bottom of the list.

NOTE: this is a DRIVER issue, not a game issue. Both sides have been working on getting this into their drivers for a while. It's about time :) The game has had this in it all along (many others too; get ready for speedups if you own NV and are running the 275 drivers, or whatever is latest from NV). Unfortunately my 5850 has to wait for AMD. :(Reply

Besides, the way people rate on Metacritic or any other critic aggregator is at best a relative indicator of how the people who take the time to vote feel about something... that doesn't help much, does it?

Crysis 2 is NOT in the same series as Crysis 1. If you don't know why, read some more about it.Reply

Two words for you, mate: Bench and Mark. The purpose of benchmarking is to get a relative idea of the performance of a component, not to test every little game out there; because of that, reviewers try to assemble the most relevant panel of games/benchmarks for testing the GFX cards.

Crysis 2 is not (yet) a relevant benchmark. Maybe when they're done writing the engine for PCs it will be, but for now it's just a worthless console port.

If you think they're lazy, just post the list of games you would use to benchmark and ask people how useful they find it.Reply

"RE: Time to change the tests by Ryan Smith on Tuesday, May 17, 2011 The test suite is due for a refresh, and will either be updated at the end of this month or next month once we build our new SNB testbed."

Weren't you waiting for SNB-E? Isn't that a Q4 release? Or by "SNB" do you mean "Bulldozer" and know something that we don't?

You've got me all curious and maybe excited and more curious and now I-don't-know-what-to-think.Reply