This series is one of the best. The first article was most illuminating, and the second keeps it coming. Before the first article I was clueless about nVidia's AA nomenclature. Now it makes much more sense, and I applaud nVidia for not making the situation worse (though nVidia and AMD both still need nomenclature help in other areas).

I'm not a huge gamer, and the games I do play mostly run awesome with my 2500K + GTX 460. I decided that if it's going to be a while before the next generation of GPUs drops, I'd get another 460. So that's what I did; it should be here in a few days. I was worried that even at 1920x1200 I'd have problems with AA and the lack of VRAM, but it's good to see that two 460s work pretty admirably.

As an aside, I'm totally on an efficiency kick, and I don't relish the thought of needing two cards to get decent performance, but the GTX 460 is one of the most efficient cards around, well over a year after its release.

Zeh

What happened to Morphological AA? When the 6000 series was released, Morph AA showed an impressively low demand on hardware (about 2 or 3 fps lost), and now it's cutting frame rates in half?

Seriously, what is it?

jemm

Great article! Very illuminating!

This article is going to have me diving into my settings tonight; I've basically set my aged 5770 to run as poorly as possible, given that I game at 1920x1200. Learn something new every day!

ojas

Zeh: What happened to Morphological AA? When the 6000 series was released, Morph AA showed an impressively low demand on hardware (about 2 or 3 fps lost), and now it's cutting frame rates in half?

Was thinking the same thing... parts 1 and 2 are contradicting each other here, if I'm remembering part 1 correctly...

btw there's a typo at the start of page 2,

Quote:

This is because the GT 420 is not DirectX 11-capable

cleeve

Zeh: What happened to Morphological AA? When the 6000 series was released, Morph AA showed an impressively low demand on hardware (about 2 or 3 fps lost), and now it's cutting frame rates in half? Seriously, what is it?

On release we tested StarCraft II, because that was a game that choked with MSAA on Radeons. It turns out that game is severely CPU-limited, so it wasn't the best test subject for Morphological AA.

MauveCloud

I don't like these animated GIFs for comparing anti-aliasing modes, because 1. GIFs are limited to 256 colors, and 2. moving around in a game will affect how noticeable the differences in quality between different anti-aliasing modes are (so will the physical size of the pixels, but that would probably be impractical to represent when viewed on other monitors). Would it be possible to get some animations that show anti-aliasing modes side-by-side (or half and half) while moving around in some of these games, instead of just fixed-position images that cycle between anti-aliasing modes?

cleeve

MauveCloud: I don't like these animated gifs for comparing anti-aliasing modes, because 1. gifs are limited to 256 colors, 2. moving around in a game will affect how noticeable the differences in quality between different anti-aliasing modes are.

As for #2, there are no worries, since the Half-Life 2 engine in Lost Coast that we used for the majority of comparison shots doesn't move the camera during idle times. We used a save game and reloaded the scene at exactly the same position, so it's not an issue here.

As for your first concern, I was worried about that too, at first. But I carefully scrutinized the uncompressed TIFF files before exporting them to GIF, and in these cases there's no practical difference; the GIF does an excellent job of demonstrating the result with different AA modes.
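For anyone curious how much a 256-color palette can actually cost, here's a rough back-of-the-envelope sketch (my own illustration, not the article's actual export workflow). It forces a smooth synthetic gradient through a fixed 256-entry palette (8 red x 8 green x 4 blue levels); real GIF encoders pick an adaptive palette per image, so they do considerably better than this worst case:

```python
# Quantize a smooth gradient "screenshot" into a fixed 256-color palette and
# measure the worst per-channel error introduced.

def quantize(rgb):
    """Snap an (r, g, b) pixel to a uniform 8x8x4-level palette (256 colors)."""
    r, g, b = rgb
    return (round(r / 255 * 7) * 255 // 7,   # 8 red levels
            round(g / 255 * 7) * 255 // 7,   # 8 green levels
            round(b / 255 * 3) * 255 // 3)   # 4 blue levels

# Synthetic gradient with far more than 256 distinct colors
pixels = [(x, (x + y) // 2, y) for x in range(256) for y in range(0, 256, 8)]

unique_before = len(set(pixels))
quantized = [quantize(p) for p in pixels]
unique_after = len(set(quantized))
max_err = max(abs(a - b)
              for p, q in zip(pixels, quantized)
              for a, b in zip(p, q))

print(unique_before, unique_after, max_err)
```

Even this crude fixed palette keeps the worst single-channel error under ~43 out of 255; an adaptive palette tuned to one screenshot shrinks that further, which is consistent with the "no practical difference" observation for these comparison shots.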

wolfram23

Very interesting article! Although I'm a tad confused by your nomenclature for the Radeon AA settings. There's MSAA, AMSAA, SSAA, and within those you can choose box, narrow tent, wide tent, and edge detect types (edge detect being the only one, AFAIK, to increase demand), and then on top of that you can enable Morphological. So I'm not sure what "EQ" means, as it is not at all a term used by Radeon (or at least CCC).

Also, as the first poster said, why is morphological so demanding all of a sudden? When I first tried using it, I barely saw an impact on performance and in a couple games it made everything look blurry. I just tried enabling it in Skyrim (a game that really needs better AA) and my performance plummeted - which these results confirm. What changed?

cleeve

wolfram23: So, I'm not sure what "EQ" means as it is not at all a term used by Radeon (or at least CCC).

As it says in the article, EQAA is Radeon HD 6900-series exclusive. You probably don't have a 6900 card.

wolfram23: Also, as the first poster said, why is morphological so demanding all of a sudden?

The answer is five posts above this comment. It depends on the game; you may have been using a CPU-bottlenecked title.

I wish that, for a DX10 card, they would have thrown in one a lot of us probably have/had, like a GTX 260.

mt2e

I must say the animated GIFs are pro status... I wanna see a giant GIF with all modes compared, like a gradual no-AA to max-AA. I know there are tons of modes, but that's just the gist of the idea.

cleeve

mt2e: I must say the animated GIFs are pro status....I wanna see a giant GIF with all modes compared....like a gradual no AA to MAX aa...I know there are tons of modes but that is just the "jist" of the idea.

pdxoutdoors

Great article. FYI, typo in first paragraph, "the image quality the impart" (should be "they")

yyk71200

Also, if you download Nvidia Inspector, you can use Sparse Grid SS Transparency AA (doesn't work in all games). It looks better than regular Transparency SS AA but is even more demanding. It is not available in the normal control panel.

DXRick

Interesting, but the 450px × 448px images are too small for me to see the differences well.

MauveCloud

Cleeve: As for #2, there's no worries as the Half Life 2 engine in Lost Coast that we used for the majority of comparison shots doesn't move the camera during idle times. We used a save game and reloaded the scene at exactly the same position, so its not an issue here.

You misunderstand my concern here. I'm not talking about whether the screenshots are in the same position. I'm talking about users actually moving around in the game (it's not common when actually playing any of these games to stay in one place for long, right?) and whether the qualitative differences between antialiasing modes are still as noticeable while moving.

cleeve

MauveCloud: ...whether the qualitative differences between antialiasing modes are still as noticeable while moving.

Ah. Well, for that you should try it out. In my experience the stills represent what happens in-game quite well, except when it comes to post-processing effects like Morphological and FXAA. Those tend to crawl because the filter re-assesses each frame independently; it's not based on the same geometric data as MSAA.
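To make that distinction concrete, here's a toy sketch (my own illustration, not AMD's or Nvidia's actual filter) of the core idea behind post-process AA: it works purely from the finished frame's pixels, finding strong luminance edges and blending across them. It never sees the geometry or coverage samples MSAA uses, so every frame is re-filtered from scratch, which is why edges can crawl as the scene moves:

```python
# Toy post-process AA: blend pixels across strong luminance edges.
# The frame is a 2D list of (r, g, b) tuples; there is no scene geometry here,
# only the rendered pixels, just like MLAA/FXAA-style filters.

def luma(rgb):
    """Perceptual brightness of an (r, g, b) pixel (Rec. 601 weights)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def postprocess_aa(frame, threshold=32.0):
    """Blend each pixel toward its right/bottom neighbor across edges."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w):
            for nx, ny in ((x + 1, y), (x, y + 1)):
                if nx < w and ny < h:
                    a, b = frame[y][x], frame[ny][nx]
                    if abs(luma(a) - luma(b)) > threshold:  # edge detected
                        out[y][x] = tuple((pa + pb) // 2
                                          for pa, pb in zip(a, b))
    return out

# A hard black/white vertical edge gets softened into a gray transition
frame = [[(0, 0, 0)] * 2 + [(255, 255, 255)] * 2 for _ in range(2)]
smoothed = postprocess_aa(frame)
print(smoothed[0])
```

Because the edge detection starts over on every frame, a sub-pixel camera shift can flip which pixels count as "edge", so the blended pixels jump around; MSAA's coverage samples come from the same underlying geometry each frame, so its result is far more temporally stable.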

cleeve

DXRick: Interesting, but the 450px × 448px images are too small for me to see the differences well.

It's 1-to-1, so if you can't see a difference there, you won't see it in-game.

cleeve

pdxoutdoors: Great article. FYI, typo in first paragraph, "the image quality the impart" (should be "they")

Thx, fixed!

wolfram23

cleeve said:

As it says in the article, EQAA is Radeon HD 6900-series exclusive. You probably don't have a 6900 card.
The answer is 5 posts above this comment. Depends on the game, you may have been using a CPU-bottlenecked title.

Thanks for replying!

You're right, I'm not on a 6900 card, so I guess that takes care of that.

The (recent) game I just tried to use morphological AA on was Skyrim. I've read that it's CPU-intensive, and yet my i5 750 at 4 GHz is barely being utilized, at best 50%. People have said it is a CPU-intensive game, yet plenty of games use much more of my CPU, like BF3 and Crysis 2, among others. At the same time, I've heard that it really only uses two threads, in which case I suppose it would make sense that I see 50% usage on a quad core. I don't know if that is true or not.

zepfan_75

On page 2 there is a typo that calls a GT 240 a GT 420

cleeve

wolfram23: I've read that it's CPU intensive and yet my i5 750 at 4ghz is barely being utilized, like at best 50%. People have said it is a CPU intensive game

Well, Skyrim might be better described as CPU-*dependent*, not CPU-intensive. It doesn't appear to use more than two threads; at least it didn't in our Skyrim performance analysis. Having said that, frame rate was very dependent on the CPU at the Ultra setting, but this dependence was really minimized if you used High or lower settings.

It's the Ultra setting that kills Skyrim and really shifts the bottleneck to the CPU.
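As a quick sanity check on the ~50% usage reading above (illustrative arithmetic only, not a profile of Skyrim itself): if a game saturates only two threads, the overall utilization figure a task manager reports for a four-core CPU tops out at 2/4 = 50%, even though the game is completely CPU-bound.

```python
# A saturated thread pins at most one core, so overall utilization is capped
# by busy_threads / cores regardless of how hard those threads are working.
def overall_utilization(busy_threads, cores):
    return min(busy_threads, cores) / cores * 100

print(overall_utilization(2, 4))   # two busy threads on an i5 750's four cores
```

So seeing "only" 50% usage on a quad-core is exactly what a two-thread, CPU-limited game should look like.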