Without a doubt, the biggest issue many people had with NVIDIA's 7-series was the poor texture filtering it offered by default. Texture shimmering, a side effect of overzealous filtering optimizations on NVIDIA’s part, was always a huge issue for 7-series cards, and while it was reduced over time, it never went away completely.

With the tremendous texture filtering capability of the G80 it’s good to see NVIDIA take advantage of it and drastically improve their default texture filtering quality. Texture shimmering is completely gone now from what I can tell, even in the default control panel “Quality” mode.

Probably most notable, though, is the almost perfectly angle-independent anisotropic filtering that the 8800 series uses as its default setting. This is probably the biggest image quality improvement NVIDIA could have made, and they definitely deserve extra points for making it the default. I was a little disappointed to see that they still apply their trilinear optimization by default (aka “brilinear”), but in all fairness it is now virtually impossible to tell the difference with those optimizations enabled or disabled (very much unlike the 7-series cards). Even so, High Quality mode disables the trilinear optimizations should you run into a game where they become evident. Check out the JavaScript app to the right where we compare sampling patterns on the 8800, the 7900, and ATI’s X1950 using D3D AF-Tester.

Screenshots aren’t the best way to showcase filtering quality (you really need movement to spot texture shimmering and other issues), but we make do.

Below we have provided several more JavaScript apps to help you see up close what the texture filtering quality of the 8800 is like compared to the 7-series and ATI’s X1950. Rather than covering one game at every quality level, we are doing several games from different genres and comparing the highest quality possible (High Quality 16x AF on the 8- and 7-series cards, 16x High Quality AF on the X1950 XTX). The first game is the new SimBin racing sim GTR 2, the second is the excellent RTS Company of Heroes from Relic, and the third is the old shooter Far Cry from Crytek. Commentary on the texture filtering quality for each game is included within the apps.

It’s difficult to quantify the results shown here. In Company of Heroes the 8800’s filtering looks noticeably better, while in GTR 2 ATI’s hardware renders the textures cleaner and sharper. In Far Cry it looks to be a virtual tie between the 8800 and the X1950. Overall, at least when directly comparing the settings used in this review, I would say that both the 8800 and ATI’s X1950 do an excellent job at texture filtering. As for default control panel settings, NVIDIA clearly has the better filtering pattern, for which they deserve some credit.

As with the AA tests on the previous page, AF quality is only good if it doesn’t hurt performance too much. The performance results below were gathered using the second timedemo in our Half-Life 2: Lost Coast benchmark test. I’ve found that this timedemo in particular shows performance differences across the various texture filtering levels more so than any of our other tests.

[ Default Settings ]
[ High Quality ]

Performance drop using High Quality filtering vs default settings

               2x       4x       8x       16x
8800 GTX      -0.4%    -0.9%    -1.6%    -2.2%
8800 GTS      -0.9%    -1.8%    -3.0%    -3.4%
7950 GX2      -3.0%    -5.1%    -7.6%    -9.7%
7900 GTX      -4.0%    -6.6%    -9.6%    -11.5%
X1950 XTX      0.0%    +0.5%    +0.3%    +0.3%

Even though they are doing more work, you can see that the 8800 cards suffer a much smaller performance impact than the previous 7-series cards with High Quality texture filtering enabled. ATI’s X1950 XTX, meanwhile, isn’t affected at all when HQ AF is enabled (it’s even slightly faster).
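For reference, the percentages in the table are simple relative deltas between the default and High Quality runs. A minimal sketch of that arithmetic (the frame rates here are hypothetical examples, not measured values from this review):

```python
def hq_performance_drop(default_fps: float, hq_fps: float) -> float:
    """Relative change (%) when switching from default to High Quality filtering.

    Negative means High Quality is slower; positive means it is faster.
    """
    return (hq_fps - default_fps) / default_fps * 100.0

# Hypothetical frame rates, for illustration only.
print(round(hq_performance_drop(100.0, 97.8), 1))   # -2.2 (a 2.2% drop)
print(round(hq_performance_drop(100.0, 100.5), 1))  # 0.5 (a slight gain)
```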