Consumers who purchase AMD's top-end Radeon 7900 series of GPUs will now get license codes to download Far Cry 3, Hitman: Absolution, and Sleeping Dogs. The lower-end 7700 and 7800 series cards come with a code for Far Cry 3, but customers can add a license for Hitman: Absolution by buying two of the lower-end cards at once. Each of the graphics cards will also come with a coupon for 20 percent off the Digital Deluxe Edition of Medal of Honor: Warfighter.

Bundling hot new games with graphics cards isn't exactly a new practice: AMD has been offering Sleeping Dogs with its cards since this summer, replacing a previous bundle with DiRT Showdown, and Nvidia is currently offering Borderlands 2 with its GeForce cards. But offering $170 worth of newly released games and coupons with a graphics card that retails for just over $300 is a major expansion of the practice, and a sign that AMD is trying hard to differentiate its graphics hardware from the competition this holiday season.

AMD is also using the promotion as a way to highlight its new Catalyst 12.11 drivers, which it says have been tuned to provide performance gains of up to 15 percent on titles from its "Gaming Evolved" partners, including Far Cry 3, Hitman: Absolution, and Medal of Honor: Warfighter.

I can appreciate what they're doing. I'd rather have Borderlands 2 personally, but I don't care for Nvidia cards. I am looking to upgrade, though, and I wouldn't mind trying Far Cry 3. The others are uninteresting to me.

As much as I would like to support AMD here, I have a 5870 that still does everything I can throw at it. What sense does it make to get a new card?

Tessellation, compute, and power efficiency are three big reasons.

I know where he's coming from, though... I tend to upgrade my graphics card when I reach the point where the current one can't play a game I want well. That generally means a new card every 3-4 years. If you're budget-minded, you don't upgrade every time something new and shiny comes out... you try to make the purchases last, and you probably won't bite the bullet for a free game or two. If you're somebody who has to have the latest, you probably already own the card you want.

As far as I'm concerned, we've reached the point of diminishing returns in high-end GPUs, and in processors in general. And I say this as a former computer-upgrade junkie.

Totally agreed. I have an 8800 GTS in my computer, and while it's not as fast as the latest cards, the prices that manufacturers have propped up for incremental upgrades over several generations now keep me from upgrading.

Prices on hardware used to come down and cards were replaced by obviously superior models at the same price point.

Right now that isn't happening; you just have cards that perform the same as the ones that preceded them stacking up in the market.

The Nvidia 560 to 660 is a good example: the upgrade is minuscule, and even a couple of years after the 560 series was released, it has only come down in price by 25-30 percent.

Years ago the price model would have had the 560 selling for a 50%+ discount from its original price and the 660 beating it by 50%+ in performance.

nVidia's 600 series (Kepler) architecture is leaps and bounds better at game performance than the 500 series... I don't know what you're talking about. And if you haven't noticed, the reason they can't give massive discounts these days is that their profit margins are razor-thin. In the old days, they could charge an unbelievable amount up front and then give these 'massive' price cuts to make you feel better. Is that really better economics? But seriously, read this: http://www.anandtech.com/show/6175/amd- ... le-inbound

I'm not a big fan of the ATI cards. Don't get me wrong, they're good video cards, but the software/drivers seem much more difficult to use, and they rarely work for many of the uses we need at my place of business. Settings are hard to configure, don't hold across separate accounts, and are limited in many respects. For distributed video systems such as ours, this is important.

I don't have any issues with Nvidia cards and the settings are the same whether using a basic account or an admin account. For our purposes the Nvidia cards just work, with very little overhead in trying to make every single one display the same way. YMMV.

Note the price chart; that's the part that bothers me most. The 660 is an incremental upgrade, but the pricing of the previous series of cards is the bigger problem.

Of course, the AMD situation is even worse: the 5900 series was such a performance monster that AMD is still working its way back to that performance apex. It's no wonder they're having trouble selling; they're literally selling slower hardware now at the same price as what they brought out years ago.

I had the same sort of dilemma. My desktop came with a GF9500GS. Two months ago, I was playing STO and saw the temp hit 100C. Not good. The fan had stopped working, as it'd become all bound up (and this was a card I'd replaced 3 years ago because of a fan recall). I downclocked it and stopped playing games until I could get it replaced (which had to wait a month because Dragoncon was taking all my spare money).

Ended up buying an AMD 6570. Part of that was the cost: it was $59 at Microcenter, amongst all the GTX230s (with a $20 mail-in rebate). But perhaps more importantly, it's a PASSIVE card. No fans to worry about.

I'll also add that it's the first ATI card I've ever had; previous cards were a voodoo2, a GF32, and a GF2 GTS64 (after the first GeForce melted in less than 6 months), then an FX5200 for a few months before the 9500. My last laptop had an 8200M, and this one has Intel graphics (i3 chip). I'm not a huge game player, and I tended to drop the graphics down where possible (if a game needs graphics on high, it's a bad game).

But I didn't buy ATI because I felt the need, or wanted it. It was just the best card in my budget range, and the only passive one. So maybe they need to focus there.

I also found their lineup INCREDIBLY complex when comparing cards; some cards would have two or three model numbers, often in different series. Slimming down their product lines to give more differentiation might help them as well.

Interesting timing. I was thinking about buying a 7850 tonight when I got home. Looks like I should see if I can find one with the bundle; although Newegg has a banner for it on the 7850s, none of them list it on the product page yet. I'd never buy Far Cry 3 (because screw Ubisoft), but hey, can't complain about free.

Slimming down their product lines to give more differentiation might help them as well.

You're on my wavelength.

Too many cards are stacked up in the market with very little differentiation in performance; the video card market is in serious need of an enema.

Pricing older-model cards to sell and clear out some of the market would be a great start. I think it's absurd for a video card that started off its life at ~$200 in 2010 to still be selling for $170-180.

That means either that performance of newer chips is too stagnant or that prices on older cards are being kept too high.

AMD's drivers are still so bad, even for their motherboard chipsets, that it makes it impossible to buy their GPUs. That's probably what holds AMD back more than anything else. I remember back in the day when ATI and all the ATI fans made such a big deal about how their drivers were "finally as good as Nvidia's drivers." So I gave their card a shot. Fresh install of Windows, newest version of their driver. Fired up GTA3. Had no ground textures. None. Was walking on air. Uninstalled the driver, took the card out, put my Nvidia GPU back in, and installed their driver. Guess what? Had textures again. If AMD would fix their drivers, then I would consider their hardware.

coder543 wrote:

nVidia's 600 series (Kepler) architecture is leaps and bounds better at game performance than the 500 series... I don't know what you're talking about.

Now, open mouth, insert foot.

I really don't understand the "incremental" upgrade comment by that poster. The Kepler architecture is a massive upgrade over Fermi. The GTX 460 to GTX 560 upgrade was incremental: I had an original version of the 1GB GTX 460 and could push its clocks a little and match the GTX 560. But going from that GTX 460 to the GTX 660 Ti I have now was a massive upgrade, and even that's an understatement. In Crysis 2 (yeah, I know), in the "Corporate Collapse" level, there's a scene where you walk through a little cave to get from one area to another. In that cave, with DX11 enabled and the "Ultra" preset, the frame rate would drop to around 17 frames per second on the GTX 460. On the GTX 660 Ti the frame rate "drops" to 72 frames per second at the lowest point. In every other game my frame rates have more than doubled, or the card has let me turn on image quality enhancements that were otherwise impossible to have on while running the GTX 460. So not only do some games like Battlefield 3 and Crysis 2 run significantly (understatement) better, the image quality of all of my other games has vastly improved. My games in motion look better than screenshots.
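(A quick bit of arithmetic on the poster's own numbers: 72 / 17 ≈ 4.2, so that cave scene is roughly a 4.2x improvement at the worst-case point, well beyond the "more than doubled" he reports for other games.)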

This isn't a terribly bad idea; software drives the purchases of higher-end discrete video cards. I purchased two GTX 680s when they were first available so I could try out a 5760x1080 setup for gaming. The two cards perform wonderfully, and every game I've thrown at them with max settings maintains a solid 60 FPS with vsync.

But the price that I had to pay for this setup wasn't worth it. If I didn't have a lot of disposable income, I would never have made the same purchase; my previous GTX 460 setup was fine with moderate-to-high settings at 1440x900. If it weren't for some newer games coming out that I wanted to play at max settings, I wouldn't have upgraded.

AMD has simply lost trust with a lot of consumers thanks to their trash performance with drivers for so many years. nVidia has been stellar with driver updates and that's why I went from a 5850 to a GTX 680 when I upgraded. AMD needs to have parity with nVidia on the high end if they want to retain relevance in the GPU market.

EDIT: There must be a lot of AMD fans here in the comments section, because almost every single post that has slammed AMD on drivers has been downvoted. Also, if AMD wants to become relevant, they need to stop making monumental failures like Bulldozer and the current Trinity chips that only cater to the middle and below. Any quality failure in their chip business is invariably going to seep into their GPU business.

I think it's absurd for a video card that started off its life at ~$200 in 2010 to still be selling for $170-180.

I'm a bit confused by comments like this. The 5770 I bought in 2010 for $200 is now $99. The 5850 was $300 then (brand new, just came out); a 6850 (basically the same card) is now $150. That fits with prices halving every two years, which is about on track IMO.
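(As a rough sanity check on that halving model, using the poster's street prices rather than official list prices: P(t) = P0 x 2^(-t/2) gives $200 x 2^(-2/2) = $100 after two years, right in line with the $99 the 5770 sells for now, and $300 x 2^(-2/2) = $150 matches the 6850 as well.)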

nVidia has been stellar with driver updates and that's why I went from a 5850 to a GTX 680 when I upgraded.

I wouldn't call their driver updates exactly stellar... they seem to break something with SLI or spanning for the 680 every time they release an update. It might only be first-hand anecdotal evidence, but I keep having to revert updates, because every other update I get from nVidia causes games to crash to the desktop unless I have spanning or SLI disabled.

It's interesting that the "ATI has shitty drivers, and Nvidia has great drivers" meme, which is a decade old now, continues to live. It hasn't been true for a long time. Anecdotally, I would actually say that AMD has better drivers.

It was true ten years ago... then the Radeon 9700 happened. Since then, it's been a back-and-forth slugfest, with NV surging ahead at times (GF6 and GF8) but ATI surging at others (Radeon 4xxx/5xxx).

I have owned both green and red in between, and in my experience, I've actually had more problems with NV drivers than I have had with ATI/AMD. With ATI/AMD, I can keep a driver on for months and not worry. Games just work.

I've had 2 nVidias and 3 ATIs. I've had driver issues here and there with both companies, and I can't really say I've had significantly more from one or the other. I've heard time and time again not to go ATI because of bad driver support, but I've never had an experience that left that bad of a taste in my mouth; they've tended to be the best bang for the buck at whatever price point I was looking at at the time. For full disclosure, both nVidia cards I had were in laptops, but they were gamed on quite heavily whenever I was on the road for work.

Here I was thinking the old saying was that ATI drivers were superior to Nvidia drivers, and that it changed only lately.

I never really had a problem with ATI drivers until about a year and a half ago. Since then there has been one beta driver that was somewhat stable for me (12.6b, I think). This might be because I have an older dual-GPU card (5970), but I've had trouble on a 7770, 6670, 3870, and 4850 as well during that time. Drivers were never the issue for me until lately. My next card, and for a while to come I guess, will be Nvidia.

I miss the days when gaming-centric hardware always came with at least one game. I missed the BL2 deal by about a month and got only a 670 for my money. Some of my favorite games of all time were pack-ins that I didn't know about prior to buying the hardware.

But no, I'd rather have a 7970 *without* any games bundled than a 690 *with* any set of games, personally.

(The 690's compute performance is trash.)

Hyperbole much? Having been programming GPU compute since before the transition to double-precision hardware (I started with 8800 GTXs), I can say that this is incorrect. The performance isn't trash. In fact, in some scenarios it will blow Fermi (the 400/500 series) totally out of the water. In some cases it doesn't. It totally depends on your problem space.

And no, I wouldn't rather have a 7970. The CUDA SDK is far beyond anything that AMD ships, IMO.
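To make "it totally depends on your problem space" concrete, here is a minimal, hypothetical CUDA sketch; the daxpy kernel, sizes, and launch parameters are illustrative, not anything the poster ran. Double precision is the usual dividing line.

// A minimal CUDA C++ sketch of a double-precision AXPY kernel (y = a*x + y).
// FP64-heavy loops like this are where Kepler-era consumer parts run far
// below their FP32 rate, while single-precision workloads fly.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void daxpy(int n, double a, const double *x, double *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;  // 1M elements (arbitrary size for the sketch)
    double *x, *y;
    cudaMalloc(&x, n * sizeof(double));
    cudaMalloc(&y, n * sizeof(double));
    // Host-side initialization and copies omitted for brevity.
    daxpy<<<(n + 255) / 256, 256>>>(n, 2.0, x, y);
    cudaDeviceSynchronize();  // wait for the kernel to finish
    printf("kernel status: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(x);
    cudaFree(y);
    return 0;
}

On a GK104-based card like the 690, FP64 runs at a small fraction of the FP32 rate, so a kernel like this lands very differently than a single-precision version of the same loop; that's the nuance behind "in some cases it doesn't."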

I've been using AMD (previously ATI; I think it was a mistake to drop that branding, but that's a different story) video cards for years. Honestly, the last problem I had with a driver came with an update that messed with Skyrim. I simply rolled back to the previous version of the driver and all was once again right with the world. To be honest, I'm certain that they have since released updated drivers, but I haven't felt the need to update again. I probably should...

As for hardware issues, I've only had a single one. I started noticing blocks all over my screen, even when it was sitting idle (monitor still on, of course). After a bit of troubleshooting, I determined that it was being caused by overheating. I then went and purchased a new case (since the one I had seemed to be filled with wires that went nowhere and couldn't be removed without wire cutters) and a replacement video card (the same model I had been using). It turns out that the problem didn't stem from the video card itself, as I had thought; the fan over the CPU's heatsink wasn't seated properly, meaning that not enough heat was being directed away from the system. The good news was that heat was still moving away from the CPU, so there was no permanent damage there. The bad news was that it was being directed toward the video card, which is what caused the issue.

Now, here's the thing. Even though the card was roasting, it was still functioning. I continued to use that video card until the new one arrived (and I hung onto it as a backup in case I ever needed it again). I think that speaks volumes about the product itself.

Oh, and for those interested, the card is a 6850. It still runs well, even with the newer games coming out. So, I don't predict having to replace it for another year at least (probably two or more actually).

I've had 2 nVidias and 3 ATIs. I've had driver issues here and there with both companies, and I can't really say I've had significantly more from one or the other.

ATI's drivers were pretty bad a few years back. However, I've only seen one problem with their drivers on my newest card (6850), and that was fixed in the next driver update.

I actually did a thought experiment a few months ago on how the video game industry could turn a profit without copyright protection. One of the things I thought of was that while you might be able to get the games for free, the hardware they run on is never free. Which ultimately means the video card industry is highly dependent on a strong video game industry. I concluded that you could eventually see video card companies directly sponsoring the development of games in order to sell more cards, particularly games that push the boundaries, where you would need an upgrade in order to play.

To me including free games with a video card is a step in that direction.

From a financial standpoint, the best advice I can offer AMD is: get out of the CPU business. You still make amazing GPUs that consistently kick Nvidia's ass. Don't let the sinking CPU ship drag the GPU business down with it.

But I already have two Radeon HD 6970s in CrossfireX, and they are holding up decently.

Well, that is, unless I try to max everything out in 3x1 Eyefinity. On top of that, I can't do stereoscopic 3D across all three due to a lack of mini-DP outputs (something about syncing refresh rates, apparently).

It turns out that the problem didn't stem from the video card itself, as I had thought; the fan over the CPU's heatsink wasn't seated properly, meaning that not enough heat was being directed away from the system.

Before I upgraded the cooling in my case, some games got my GPU temps over 100C. Turns out the culprit was also the CPU fan (it was seated properly, but it was blowing air into the case, and my other fans caused turbulence, preventing the heat from venting outward). I upgraded my case for better airflow and got a significantly better CPU heatsink/fan, and my temps dropped 20-30C.

Despite that heat, though (which was going on for about a year, by the way), my cards and all my components function perfectly fine.

The Radeon HD 6970s I have were made by MSI. I don't know if it's just because MSI is that good, or what, but it impresses me that they're still healthy.
