With so many options on the market right now, what makes the GTX 280 a good choice for anyone? The fact that it is the highest-performing card out there sure helps, but it's still not for everyone. To join this club, you better hope you have one massive resolution to push.

Blah!! Too many video cards these days. Every week a new model is released, and the drivers for last year's cards still aren't perfected. And the performance is irrelevant, you see; next year's integrated graphics chipsets from the likes of Intel will outperform these behemoths, and sip power to the tune of 15 watts as a bonus!

I feel sorry for you early adopters blowing a ton of money on all these things... sheesh!

With all these variants and model numbers and clock speeds and memory sizes out there, how is one supposed to know what to buy? A lot of folks I know just give up and go with integrated graphics, or get the cheapest card that works. ATI and NVIDIA are just hurting themselves with such intense granularity!!

It depends on what you want. I got an 8800GTX when they first came out and it was very expensive, and now they are considerably cheaper... The same will happen with these, and has already happened to some degree; improving the "bang for buck", however, doesn't make it a sensible buy. I won't be buying one of these even though I play a lot of high-end games at max settings; an 8800GTX does me fine, and the extra fps you get from one of these, however nice to have, won't help you that much, not for that cost difference anyway. However, if you are a company that needs a really powerful GPU as a time-saving solution, this is the card. I reckon this is more of an "industrial" card than a gamer's card, considering the competition atm.

Nice review, and I am extremely sure you knew this was coming... but you really needed an HD 4870 in there. I know GPUs aren't easy to come by, but unless you use Xfire, it's the next best thing to test against the GTX 280.

I wish both companies would launch their upcoming cards already, I need to buy a GPU and this is killing me! 4870 X2 or GT200b 3xx... or maybe some better prices on the 280/4870... either way it'll be good.

Quote:

Nice review, and I am extremely sure you knew this was coming... but you really needed an HD 4870 in there. I know GPUs aren't easy to come by, but unless you use Xfire then it's the next best thing to test against the GTX 280.

You couldn't be any more right... I know I need one. Should have had one in long ago, but it's taking a while. Still, what I said about 2560x1600 should prove true in all cases. If you want that resolution along with topped-out settings (like AA), then the GTX 280 is still a proper high-end card. Still expensive, but I believe it would fare much better than an HD 4870 at that same resolution with equal settings.

Well, something I noticed (and one site claims it's specifically because of DX10.1 support) is that the Radeons take a very minimal hit from AA use. They simply scale better with AA than NVIDIA cards now, and at least part of that is the sheer memory bandwidth GDDR5 provides. This is the opposite of their last two generations, where AA hurt them.
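The "hit" being discussed here is just the percentage of frame rate lost when AA is turned on. A quick sketch of that calculation; the frame rates below are purely hypothetical placeholders for illustration, not measured numbers from any review:

```python
def aa_hit(fps_no_aa: float, fps_aa: float) -> float:
    """Percent performance lost when enabling AA."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100

# Hypothetical illustration: a card dropping from 60 to 54 fps
# takes a 10% hit, while one dropping from 60 to 45 fps takes
# a 25% hit -- the first card "scales better with AA".
print(round(aa_hit(60, 54)))  # 10
print(round(aa_hit(60, 45)))  # 25
```

A smaller percentage here is what "scales better with AA" means in practice.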

The GTX 280 is unquestionably the best single card. But before the price cuts... it was by far the worst to buy! For less than a single GTX 280 you could buy two 4870's or even 4850's that lay waste to it.

Now that we have a $150 price cut, things are more questionable... but still, two HD 4850s cost $100 less and give better performance than a GTX 280.
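Using the prices implied in this thread (a GTX 280 cut by $150 from roughly $649, and HD 4850s at roughly $199 each; treat these as assumptions inferred from the posts, not quoted retail figures), the math works out like this:

```python
# Assumed prices inferred from the thread, not authoritative figures.
gtx280_launch = 649
price_cut = 150
gtx280_now = gtx280_launch - price_cut   # 499 after the cut
hd4850 = 199
crossfire_pair = 2 * hd4850              # 398 for two HD 4850s
print(gtx280_now - crossfire_pair)       # ~100, the "$100 less" figure above
```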

As much as I would prefer to keep a single GPU and avoid any possible Xfire or SLI issues, there's never been a better time to try multiple GPUs. And right now, HD 4850 Crossfire looks to give the absolute best price/performance, and the best performance short of two 4870's, for xHD gaming.

What is even funnier, GTX 280 SLI scaling is just a joke. HD 4870 Xfire still competes well against $1k+ SLI setups.

Edit:

Okay, let me put it this way. Look at FiringSquad's review... the HD 4850 Crossfire beat the GX2 and the GTX 280 in every single test at 1920 or 2560 resolution. The only thing better is two 4870's...

FiringSquad doesn't seem to publicize their testing methodology, so I'll assume they stick to timedemos like most everyone else. So, I couldn't care less what their results say. I'll work on getting in two 4870's or another HD 4850 and then perform similar tests, although with real gameplay, since that's where it matters and we aren't so lazy.

Either way, you are right. It's a good time to get into the dual-GPU scheme of things. For $400, you get a lot of power.

Well, FiringSquad, Tech Report, Anandtech... all of them have Crossfire tests that show the same trend. The only thing comparable seems to be two 9800GTX+ versions, but those will cost an extra $60 beyond two 4850's... AMD still comes out on top whichever way it's sliced.

Something that confused me when reading your review was exactly which tests had AA and which did not. The text seemed to imply that only the 1920x1200 score used AA... since this makes a big difference (and who doesn't use AA?), it's an issue. Just stick it on the charts or something. The 4870 by itself plays just fine at 2560x1600 with AA in some games, so there's no reason not to test with 4x or 8x AA in everything else. As I said above, ATI scales much better with AA than NVIDIA now.

Quote:

Something that was confusing for me when reading your review here was exactly which tests had AA and which did not.

As much as I hate to 'uglify' the charts, I might have to start adding that information to the titles. But there is a reason I include a link to our testing methodology at the top of every single results page: I show a screenshot for every game, with the exact settings used at a given resolution.

Well, if you read too fast, then a lot of other people do as well, I'm sure. I hate to cloud up the graphs with a bunch of information like that, which is why I made a more robust testing methodology page, and prefaced each performance page with a link to it.

Any recommendations on changes you'd make to the settings I chose? There are a few I wish I'd thought more about before settling on them, but for now I think most of them are OK. I am not sure why I chose no AA for 2560x1600 in CoD4, for example. It makes no real sense. That one I kind of regret.

The sad thing is that changing something means re-installing a whack of GPUs to retest... not a fun task.

To be honest I was mostly comparing results here to other reviews, and did not sit down to fully read the review. At the time I was mostly after some 4870/260 numbers to compare to everything else. I'm not sure if it is a bad habit or not, but before opening my mouth for comments/questions I try to read the full page that the comment is related to if I hadn't already, to ensure I didn't miss something. I think this is the third time I'd have found the answer on the Testing Methodology page for a review, so I guess it's just a bad habit of mine.

I can't really make up my mind on any particular GPU setup, so I've looked at everything and then some... 9800GTX SLI vs 4850 Xfire reviews are hard to find, let alone GTX+ or simulated GTX+ cards. Taking into account the lack of GTX+ cards, and that vanilla GTX cards are well over $200, I think the 4850 is just unmatched... it beats the 9800GTX soundly enough and is still cheaper.

Since you've asked for comments I'll try and actually be helpful for a change.

Regardless of whether it's a bad habit or not, I tend to look for AA/AF settings on the page with the benches. Mostly, I would suggest picking an AA/AF setting and keeping it constant for the entire review, especially within the same game. Not labeling the graphs is fine, but for lazy bums like me, mentioning the AA/AF at the start of each new game is easiest if it's not on the graphs. Doubly so if the game doesn't support it, or other AA/AF settings are going to be used.

Quote:

Depending on the graphic card being reviewed, we split up models into two different categories: Low-End to Mid-Range and Mid-Range to High-End. The former will see the GPUs tested using 1280x1024 and 1680x1050 resolutions, since those are the most common resolutions for gamers looking to purchase a GPU in that price-range.

For our Mid-Range to High-End category, we test GPUs at 1680x1050, 1920x1200 and also 2560x1600 to better reflect the resolutions for those looking for a solid GPU offering.

This confused me until I realized that you were only doing Mid-Range to High-End testing for this review. I might suggest leaving the first part out, since it isn't relevant to this particular review?

I would make a note if you had to use any driver overrides, or don't use them at all, since those aren't reflected in the in-game screenshots. I do like that you have screenshots for every resolution, because some settings do differ between resolutions within the same game, so it's good to know.

Quote:

Blah!! Too many videocards these days. Every week a new model is released. And the drivers from last years cards are not perfected. And the performance is irrelevant, you see, next year's integrated graphics chipsets from the likes of intel will outperform these behemoths, and sip power to the tune of 15 watts as a bonus!

I feel sorry for you early adopters blowing a ton'o money on all these things.. sheesh!

It is about time the industry stopped doing this. If Intel releases more than four processors a year, they are wasting their R&D time and money.

If a company releases more than 2-3 video cards a year, they are wasting their R&D time.

Want proof...? Look at some of the problems these cards have. Look at the fooking initial drivers.

Here is an idea!

Dear industry:
Don't release a card every 3 weeks when you guys figured out that you like a certain colored switch better than another one. **** THAT! and **** YOU TOO!

Kougar, thanks for all of the input, I appreciate it. I'll take a lot of that advice into consideration for the next GPU review. I'll see about adding the information to each graph, and just make it look good. I'll figure out a way... it'd be a bit easier all-around. The last thing I want to do is confuse our readers.

As for the screenshots, I am not sure what you mean about driver overrides. I never apply any overrides in the drivers when benchmarking; I let the game handle everything. If you think this is an issue, please let me know. Essentially, the only game it would affect is UT III, where I don't have AA enabled since it's not available in the game options.

Those screenshots just show the settings I use for each resolution, even if the screenshot itself is 1280x1024. I kept the screen resolution the same for the sake of having the thumbnails look good beside each other, but what's selected in the screenshot is accurate. I use those same screenshots when benchmarking to double-check to make sure I am using the exact same settings each time.

I'd like your input on more things later, if you don't mind. I appreciate the constructive criticism, since it's exactly the kind we need.