
Radeon Gallium3D MSAA Performance (R300g)

Phoronix: Radeon Gallium3D MSAA Performance (R300g)

This week, support for MSAA was finally added to the R300g driver, so Radeon X1000 series and earlier graphics cards can at last take advantage of anti-aliasing with this open-source Gallium3D driver. In this article are some benchmarks of the MSAA performance with a Radeon X1800XT; but even with this GPU, which sits at the higher end of R300g's support coverage, the anti-aliasing performance isn't really usable.

Michael, something seems funny with the numbers, which are uncharacteristically low even with MSAA off. At first I thought it might be the difference between the X1800 and X1900 (I see more X1900s, which have roughly 3x the pixel shader power), but when I looked at older articles, even the X1800 seemed to be much faster:

There are all kinds of possible explanations -- a regression in one of the components, new GL extensions causing apps to draw fancier stuff more slowly, a compositor difference, some other hardware/software difference I missed -- but the performance drop seems relatively consistent across all the apps, which seems odd.

I hope Mesa wasn't compiled with --enable-debug, and that the Ubuntu PPA with fresh Mesa wasn't used either, because that PPA uses the --enable-debug configure flag (and maybe some other PPAs do as well, maybe even Ubuntu itself!). The behavior of --enable-debug has changed in Mesa: the flag now disables all gcc optimizations, which makes pretty much everything bloody slow.

As a question, why such a huge screen resolution for a card that is now approaching seven years old? I still doubt the assertion that most people today have screens that large attached to their standard PCs, and back then it was even more doubtful. I know you want to test the hardware to its fullest possible extent, but seeing that did kind of bother me.

I wondered about that as well, but I guess (a) a lot of people probably buy new displays just to get the screen space, and the high resolution comes for free, and (b) the cards actually ran pretty fast at 1920x1080 last year, with many apps over 100 FPS. Agreed that a lower resolution might be needed with MSAA cranked up.


Especially since the MSAA tests use tons of memory bandwidth compared to the non-MSAA tests. The high resolution just amplifies that effect, meaning MSAA might be a lot less trouble at 1280x720, a resolution that's probably more common at least on those older GPUs.
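As a rough back-of-the-envelope illustration (my own numbers, not from the article), assuming a plain 4-bytes-per-pixel color buffer and 4 samples per pixel, the multisampled buffer alone grows quite a bit with resolution:

```shell
# Rough 4x MSAA color buffer size at two resolutions.
# Assumes 4 bytes per pixel per sample; real buffer layouts and
# compression vary by driver, so treat these as ballpark figures.
echo "$(( 1920 * 1080 * 4 * 4 / 1048576 )) MiB"   # 1080p
echo "$(( 1280 * 720 * 4 * 4 / 1048576 )) MiB"    # 720p
```

That's roughly 31 MiB versus 14 MiB just for the color samples, before counting depth/stencil, so every frame at 1080p pushes well over twice the data through the memory bus.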


I agree. I don't like these benchmarks much as they don't seem very logical.

Unless you buy more modern, high-end cards, you either get high resolution *OR* low resolution + MSAA. That's pretty much expected.

Dell and HP are still selling a lot of 15" laptops with 1366x768 screens and AMD graphics chips in the sub-$700 USD market. I don't know why people buy those low-res laptops, but they sell well... So AA is still as important today as it was years ago, to compensate for low resolutions.

What I would have preferred to see is a performance comparison of low resolutions with MSAA vs. high resolutions without MSAA! That would have made sense, and if there were some screenshots comparing low resolution + MSAA against high resolution in both image quality and performance, it would be even better. But alas, maybe we ask too much.

Clearly, if running at a higher resolution both looked better and performed better, we would see that MSAA still has a long way to go. The benchmarks in the article, though, really don't say much, as most gamers already know that turning on AA at a high resolution is a good way to make a nice slide-show on old or low-end cards.


Note that in my PPA I added a patch to build properly with optimizations (disabling the recently changed behavior, which is still being discussed on mesa-dev). I wasn't able to notice any measurable performance difference when enabling --enable-debug (tested some months ago), so I prefer to leave it enabled, since it can be useful for debugging Mesa as well as game problems. Also, I'd rather have a game assert and crash than possibly lock up the system later on.

The tests I did with Mesa from my PPA are here and are completely different from what was obtained in this article:

To me, there is a big difference in performance with --enable-debug, so big that I can't even use the flag for development. I supply my own gcc flags through the CFLAGS environment variable when I want some level of debugging with all gcc optimizations enabled.
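A minimal sketch of that CFLAGS approach, assuming an autoconf-based Mesa tree from that era (the driver selection is illustrative, not a full build recipe): debug symbols stay on via -g, but -O2 overrides the -O0 that --enable-debug would otherwise impose.

```shell
# Hypothetical sketch: keep debug symbols while forcing gcc
# optimizations back on, instead of relying on --enable-debug
# (which now implies -O0 in Mesa).
export CFLAGS="-g -O2"
export CXXFLAGS="-g -O2"
./autogen.sh --with-gallium-drivers=r300   # illustrative options
make -j4
```

Because configure picks up CFLAGS from the environment, this gives optimized binaries that still carry enough symbol information for useful backtraces.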