When the first HD3870 review was published on X-bit, I wrote a comment about testing with no AA or 2xAA. Still nothing; tests are only done with 4xAA, which seems to favor Nvidia, as their antialiasing is very efficient and the differences supposedly mostly disappear when anything other than 4xAA is chosen.

Please, guys, could you test with no AA and 2xAA?

Also, what happened to image quality tests? You used to have those some time ago...


We've been testing with FSAA 4x only for years now. Moreover, even with FSAA 4x we are CPU-bound in certain cases with 1280x1024 resolution. In cases where ATI Radeon HD 3870 X2 is considerably slower than the GeForce 9800 GTX we see either CrossFireX issues or Catalyst issues, not FSAA 4x issues.


To be honest, I'm more interested in the vanilla HD3870 with no AA and 2x AA. I've "always" used ATI cards (Matrox before them), probably a habit from the times they had a great VGA output and when Serious Sam taught me what 16xAF means, so I'm considering keeping the tradition... And I have a passive cooler ready, in case I buy one.

The 8800 and 9600 cards aren't much more expensive and they seem to be faster in the tests commonly done on all tech websites. What I don't know is whether the difference disappears if other AA methods are chosen; in other words, if Nvidia is very efficient with 4xAA only. Is there even a need for 4xAA in 1600x1200 or 1680x1050, much less in higher resolutions?

There's also the issue of ATI taking a long time to optimize their drivers so further speed improvements could be possible, while Nvidia usually doesn't improve much.

Finally, I would like to see an article about image quality. Like I said, it's been a long while since you've done one of those, and none directly apply to the current generation of graphics cards. Is AF still better on ATI than on Nvidia? Who has more accurate AA? Etc.


The performance difference between the ATI Radeon HD 3870 and the Nvidia GeForce 8800 GT will hardly become considerably more or less significant without FSAA 4x (unless you use a very slow CPU, in which case there will be no difference, or a very fast, overclocked one). FSAA is needed even at 1920x1200 and higher.

An image quality (IQ) comparison article is a difficult thing to do these days, for a number of reasons:

- Firstly, a lot depends on applications and their rendering methods (not all support FSAA).

- Secondly, thanks to Nvidia's TWIMTBP program, ATI has no access to certain games until they are released, so Radeon graphics cards may suffer from performance and/or quality problems that eventually get solved.

- Thirdly, we are not completely sure about the list of games where our readers would like us to compare quality and what exactly we should compare.

- Fourthly, image quality should be compared with the performance factor in mind as well. We do use driver settings that produce similar IQ on ATI Radeon and Nvidia GeForce when making speed comparisons. But what exactly we should compare in an IQ comparison is not completely clear: not many people enable the absolute highest IQ settings in the drivers.

There are a lot of uncertainties these days regarding IQ comparisons. So, while we do have certain ideas, we are more likely to compare IQ briefly in the foreseeable future rather than do a comprehensive IQ review.



The 9800 GTX is not the 9800 GX2... and there is not a single game in the world where the 9800 GX2 would lose to 9600 SLI (basically, the 9800 GX2 is two 8800 GTS cards in SLI mode, and the 8800 GTS 512 will always be better than the 9600...).



I read that you paired it with the Asus?

Sorry, I'm not all that computer literate, but from what I made out, the Gigabyte being tested also had the other Asus 9600 hooked into it via SLI? Or were all the tests done based solely on the Gigabyte 9600 by itself?