Both desktop and non-switchable mobile drivers have been fine for me, just as good overall as my experience with NVIDIA drivers, especially on laptops ever since they switched to an opt-out program (I think Sony and a couple of others did choose to opt out, so I will never buy those). It's just the Enduro stuff that needs work, and they're on the right track there too. Both have had quite a few hiccups over the years, though.

I did have an ATI card in a notebook. I am a Linux and ThinkPad laptop user. Their driver is by far worse than Intel's and NVIDIA's, and their open-source driver is just a joke. If you are an OpenCL programmer you will find a lot of bugs, especially when you are using the OpenGL API. They seriously need to make some move; not supporting your hardware is just as bad as not having good hardware.

It's nice not to see the deranged AMD fanboys screaming that AMD will lower prices and wipe NVIDIA out. For years I had to put up with it, and no amount of facts, data, or common sense would do. Now even the author of the piece mentions the buyout/failure rumors circulating. NVIDIA (GPUs) gained market share in Q3 while Intel and AMD lost. It is so sweet to see the AMD fanboys brought low (in their own minds), where they have always been and where they belong. I have told them for years on end that if they are such insane fanboys, and they are, they should be paying top dollar for AMD cards. Instead they squeezed AMD down to the last penny they could extract with timing, complaints, and every other lowlife tactic like delaying purchases until NVIDIA drove AMD prices into the gutter, all the while attacking NVIDIA while relying on them to punish their great master AMD. With friends like AMD fanboys, you don't need enemies to be utterly destroyed.

You know, all Catalyst drivers except the WHQL ones are beta. This one is at least the same quality as any other Catalyst from last year, and they stopped asking for WHQL signatures because it added a month to the release cycle (which is kind of problematic if you want monthly releases).

Gunbuster, just expect it. The AMD fans can't help themselves; lying comes so naturally to them now that if they had to tell the truth, it would take six months, as they'd immediately enter a coma from the massive reorganization their brains would require to do so. Did AMD ever fix its hackable drivers so the US government could possibly use them without massive security risks? No, of course they didn't.

On the latest laptops, it's generally at the point where I'd say "yes". I still prefer the NVIDIA driver UI, but for less technical users I think it won't matter much. The bigger factor now will be the bottom line: if a laptop with NVIDIA is cheaper, that's what people will buy; if AMD's partners can hit lower prices (without compromising on other areas like LCD, keyboard, etc.), they'll move more units.

When I go to different sites to configure laptops with these GPUs, the price differs from 273 USD to 376 euros! So how can you compare them? If you go the AMD way, for the same money you can get a much better CPU, a much bigger SSD, or anything else a 300 USD difference can buy. Price-wise it should be compared to the 675M, which it beats.

Are you going to get a laptop with NVIDIA's newer ultra-high-end notebook GPU? I realize its specs will put it somewhere between a GTX 670 and a GTX 660 Ti, but I'm still interested in how it performs with notebook CPUs attached to it.

It can depend on what settings, resolution, etc. you're willing to deal with, but any midrange or high-end GPU should handle any of the Crysis games fine.

My old GeForce 9650M GT, with only 32 cores, seemed to handle Crysis 1 fine, although it's possible it wouldn't have later in the game. But it ran Crysis 1 at around 720p with effects on medium to high at decent frame rates, so probably any midrange GPU out today ought to handle it fine.

One 'trick' you may want to consider for games: disable anti-aliasing. I personally don't think anti-aliasing helps all that much. We're at a point where the *minimum* resolutions we play at are in the 720p range, and at that, let alone 1080p, aliasing just isn't a big deal IMO. But AA uses a ton of power; it can easily cut frame rates in half, depending on the game.

I'd MUCH rather use higher resolutions and higher effects settings than turn on AA, and disabling it goes a long way toward letting your system run games better.

Surprised to see the comment about waiting for the next-gen GPUs. Are they actually coming any time soon? My assumption is that they won't amount to much, kind of like NVIDIA's 8000/9000/100 series GPUs that were all basically the same thing, or the 200/300 chips: tweaks, better clocks, and the like, but nothing earth-shattering.

This year both companies:
- Introduced great new architectures
- Migrated from 40nm (?) to 28nm
- Managed to release notebook parts that were shockingly close to their desktop parts, something they didn't really pull off very well in previous generations. (Of course that may be because the desktop parts could be even better than they are, but sensibly they're trying to salvage more profits out of them... but still.)

With all that, I'd be shocked if we got anything big in 2013. Personally, I'm still quite pleased with what we have now! :)

I might sound like a fanboy here, but I'm speaking from a mathematician's point of view. You can't present that "Average of 14 games" figure when you have one or more pointless results (Diablo and Portal); the FPS in those games for the 680 is useless even on a 120Hz screen. You need to normalize the results to 100 FPS, then average.
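The capping the commenter proposes can be sketched in a few lines of Python. The game scores below are made-up placeholders, not numbers from the review, and the 100 FPS cap is simply the commenter's suggested threshold:

```python
# Sketch of the suggested method: clamp each per-game FPS result to a cap
# before taking the mean, so a couple of 200+ FPS outliers (e.g. Diablo,
# Portal 2) can't dominate the overall average.
def capped_average(fps_results, cap=100.0):
    """Mean FPS after clamping each per-game result to `cap`."""
    clamped = [min(fps, cap) for fps in fps_results]
    return sum(clamped) / len(clamped)

raw = [45, 60, 240, 180, 75]   # hypothetical per-game FPS for one card
plain = sum(raw) / len(raw)    # 120.0 -- skewed upward by the two outliers
capped = capped_average(raw)   # 76.0  -- outliers clamped to 100
print(plain, capped)
```

With the raw mean, the two very high scores pull the "average" above what the card delivers in every other game; the capped mean reflects the playable titles much more closely.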

Perhaps, but they highlight areas where the Kepler architecture significantly outperforms Tahiti at this time. It is worth asking why this is the case, and I don't believe we should cap the testing at 100fps in, say, Portal 2; the 680 is 43% quicker there, so why is that?

Not that it matters: AMD Enduro has been crap, is crap, and will be crap. Really, really stretching it, on the newest laptops one might be able to say it's almost up to Optimus snuff. As for all the older Enduro outlays by loser incarnate AMD, it's still over. So, not that it matters. AMD sucks.

Well, I couldn't wait for AMD to get their Enduro stuff together. Based on Jarred's recent recommendation that the Optimus drivers and updates were more mature, I sprang for the CZ-17 with the 675M. Maybe five years from now I'll go red instead of green, if that option still exists.

Jarred, why do you keep testing Sniper Elite V2 and DiRT Showdown? Please check their Metascores or reviews/user forums. Those games are viewed by the general public as rubbish. Sniper Elite is only at 65, which is extremely low compared to the other titles you test. Showdown is higher than SEV2, but still a lot lower than DiRT 3 was.

And if you're not going to remove these games, at least turn off the AMD-built global illumination in Showdown, which has almost no visual benefit yet cripples the frame rate significantly. It probably uses double precision in the compute shader pass just to gain an advantage over the GK104 chip. Just try the test without the illumination and check how much the GeForce drops with it enabled. It's way past reasonable.

Because I already ran those tests with a previous driver, and yes, you're right about the scores, global lighting, etc. I do not intend to keep either game as a long-term benchmark. However, since I had "before and after" results, it was easy enough to run them again and report the improvements/changes in performance.

Thanks for the clarification. It might also be worth looking at some other "dirty stuff". For example, Alan Wake: the quality in the game at maximum settings is incomparable on GeForce vs. Radeon GPUs. Check this example:

On the Radeon, the grass is almost completely missing and the vegetation is jaggy and incomplete (especially the forest to the left, near the "Energizer" billboard). Radeons also ignore the TrAA. I know that running the tests is quite time-consuming, but I honestly have to question whether the majority of reviewers worldwide check the resulting quality. I mean, it's quite easy for AMD to improve performance with new driver iterations while doing this kind of "optimization", and everybody should watch for that carefully. I am just worried about whether or not such things manifest in your test suite.

LOL. Come on, we have to give AMD, the underdog, another gigantic break. Let's act like our illegal government: when anything isn't up to snuff with AMD, let's just pretend they're the greatest thing since sliced bread with cheesy crust. Beyond that, we can praise AMD's GPUs as the best even as they are losing. If not, we can just be crickets in any wayward case where lies won't make AMD shine.

LOL, AMD has to be given a hand-up cheat to look good. It's amazing, but expected here. Don't worry, the AMD fanboys will rage for a few moments, pretend not to see your post, but internally scream PhysX or reminisce about "excessive tessellation" in games from the 500 series vs. 69xx days, as they did with the utter spanking AMD took from NVIDIA. Then they'll totally excuse AMD's massive cheat and forget it exists, but be certain it's implemented in every review, as the pressure from AMD (and its fanboys) will no doubt be undesirable.

Thanks for the heads up. "Cheat, cheat, cheat for AMD" should be on every video card box they sell.

I don't get it. AMD really baffles me. They make good GPUs that can compete toe to toe with NVIDIA, but then they seem to completely ignore driver support. Because of this, I would choose an NVIDIA GPU over AMD if both were otherwise equal. Would it really be that hard for AMD to put more resources into supporting drivers? I'd like to think that the NVIDIA software devs aren't that much smarter than their counterparts at AMD. Who knows how much more performance AMD could get if they ever took drivers seriously, and we all know performance = sales.

Yup, I'm of the same opinion. I guess good coders are hard to come by, or are sunk with a million other tasks each day (at least that's how it appears with regard to our own in-house coders).

I was one of those plumping for CrossFire 7970Ms too. Until drivers 12.11 Beta 7 (I'm on Beta 8 now), I really couldn't say they had cracked it.

The only problem I face now is in F1 2012, where the between-race menus have the car randomly changing colors, and not in a nice way, but in a buggy, random way. Fortunately, the in-game performance is wonderful.