The Radeon HD 6950 Sweet Spot: Five 1 GB Cards Rounded Up

AMD’s 1 GB Radeon HD 6950s are certainly cheaper than their 2 GB predecessors. But are they really a better value? In theory, the Cayman GPU runs out of steam before 2 GB is needed. So, we're comparing five custom 1 GB cards to the 2 GB part to find out.

Which Radeon HD 6950 1 GB Should You Buy?

Gigabyte has the best performance at a class-average price, leading to its performance value leadership. HIS has the lowest noise and power consumption, but second-place performer MSI has the best efficiency. So who wins?

Let’s consider a few of the features and additions. HIS and Sapphire provide a free game, DiRT 3, with their cards. The H695QN1G2M also has added value compared to both Gigabyte and Sapphire in its dual DisplayPort outputs, yet costs $10 more. Sapphire becomes the value leader for those who want the free game, while HIS is the only choice for those who want both a free game and dual DisplayPort outputs.

Yet, we can’t look past Gigabyte’s performance advantage, especially when we consider that most people don’t need a pair of DisplayPort connectors.

When it comes to Tom’s Hardware Recommended Buy award, there can be only one recipient. At $240 for either the Sapphire or Gigabyte cards, we have two potential winners, though. The tiebreakers for value seekers will be Gigabyte’s slight performance advantage or Sapphire's free game, but we know some enthusiasts insist on the fastest product.

It’s because of this division between different (but equally important) judging criteria that we introduced the Tom's Hardware Approved award, which can be bestowed on multiple deserving products able to service disparate segments. While we generally approve of most of these cards, the two products that most easily earn our recommendation are Gigabyte’s R695OC-1GD and Sapphire's dual-fan HD 6950 1GB GDDR5 PCIE.

XFX also deserves special credit for its available lifetime warranty. Although we know most serious gamers replace their cards after a couple of years anyway, anyone concerned about warranty coverage beyond three years should look to XFX, buy the XFX HD-695X-ZDFC, and then register that purchase immediately.

I've really started gravitating to the card with the lowest noise at idle -- but if I don't need to pour hot wax in my earholes under load, that's certainly a bonus. Perhaps the forthcoming 28 nm GPUs can deliver the same kind of quietness at full load -- even overclocked substantially -- that my i5-2500K does. The main issue with aftermarket GPU coolers is their questionable compatibility and the fact that the market for them isn't nearly as broad as the market for CPU coolers. I bought a Noctua for my CPU that I can take with me across LGA 775, 1156, and 1155 (maybe X58 boards too)... in addition to every AM2 and AM3/3+ socket. So I view that purchase as more of a long-term investment. If GPU coolers were the same way, I'd have no problem just buying the GPU I want and slapping a cooler on it.

If only things were so simple. That's why I think (hope, really) that a large number of next-gen low- and mid-range cards will be mostly silent, and very efficient.

mayankleoboy1: "Are the benches of Metro 2033 correct?? My GTX 580 @ 1080p with these exact settings gets around 35 average FPS. The lows are probably around 15."

Yeah, it costs twice as much, too. I could CrossFire both of these cards and it would kill your card in performance/price.

The Metro 2033 benches are on medium, that's why...

And why ARE they on medium settings? Wouldn't it show the benefit of 2 GB on higher settings? Hell, even on my 6850 I play it at higher settings than that...

conjugate: "Why wasn't the Asus HD 6950 1GB w/ DirectCU II cooler tested? I'd think that's a pretty strong competitor as well."

Try asking Asus? If you don't see a card, it's because the company decided not to use it.

amirp: "The Metro 2033 benches are on medium, that's why... and why ARE they on medium settings? Wouldn't it show the benefit of 2 GB on higher settings? Hell, even on my 6850 I play it at higher settings than that..."

Thumbs down for not reading the text below those charts:

The benchmark for this game also reports minimum frame rates, and most players will only be able to see apparent smoothness at 2560x1600 using medium details with AA disabled (producing around 19 FPS minimum).

The test was set up to produce playable framerates in the sample map. The tests showed a minimum framerate of around 19.8 FPS using MEDIUM details and no AA at 2560x1600. Obviously, the sample map pushes these graphics cards harder than the maps you're currently playing.

greghome: "Actually no, my 6950 1GB handles Metro as well as the 2GB version on very high settings with Advanced DOF on. Only difference between mine and the Sapphire card used in this article is mine has just one fan.......... wtf...."

You'll still see it fall off at 2560x1600 more than the 2GB version :) Not that anyone actually buys a single 1GB 6950 to run 2560x1600 or higher in Metro 2033 at very-high settings; CrossFire exists for a reason.

mayankleoboy1: "Are the benches of Metro 2033 correct?? My GTX 580 @ 1080p with these exact settings gets around 35 average FPS. The lows are probably around 15."

I have found that Metro 2033 requires a strong CPU as well as a strong GPU. Your CPU might be the bottleneck. I've also found that Metro 2033 is one of the few games I've played in which Hyper-Threading matters.

I have the XFX 6950 2GB card, and when I bought mine, the cost difference (in the real world) between the 1 GB and 2 GB cards was like $20. I just didn't see the value of dropping to a 1 GB card to save $20.

Unless prices have changed a lot, I don't see the 1 GB 6950 as the sweet spot. There are probably a dozen other professional reviews showing that the 2 GB version DOES greatly improve performance at the highest settings. At the highest settings, the 6950 2GB card virtually ties the more expensive 570.

It would have been interesting to see which of the cards overclocks best. I moved my settings up in ATI's Catalyst Control Center, but for some reason the card did not overclock. I tried researching it, but XFX's info kind of sucks. Anyway, my card is so fast that I decided it wasn't important, and I don't game much anyway.

I liked how the review was done; it helped with part of my decision between a 2 GB and 1 GB card. But I think it would be even more informative if the charts also showed the maximum amount of graphics memory used per game at each setting and resolution... just to see what pops out...
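One way to capture the per-game memory data the commenter asks for is to log sensor readings during each benchmark run (GPU-Z, for example, can write a CSV sensor log) and then take the peak value per run. A minimal sketch of the parsing step, using a hypothetical log excerpt with an assumed "Memory Used [MB]" column name (real log headers vary by tool and version):

```python
import csv
import io

# Hypothetical excerpt of a sensor log in CSV form. Real GPU-Z logs use
# different column names and more fields; this layout is an assumption.
sample_log = """Date,GPU Load [%],Memory Used [MB]
2011-09-01 12:00:00,97,811
2011-09-01 12:00:01,99,902
2011-09-01 12:00:02,98,1014
2011-09-01 12:00:03,96,987
"""

def peak_memory_mb(log_text):
    """Return the highest memory reading (MB) found in the log text."""
    reader = csv.DictReader(io.StringIO(log_text))
    return max(int(row["Memory Used [MB]"]) for row in reader)

# A peak above 1024 MB in a given game/setting combo would suggest a
# 1 GB card is running out of local memory there.
print(peak_memory_mb(sample_log))
```

Running one log per game/resolution combination and charting the peaks would show directly where the 1 GB cards hit their memory ceiling.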