
Radeon 3D Performance: Gallium3D vs. Classic Mesa

03-22-2010, 01:00 AM

Phoronix: Radeon 3D Performance: Gallium3D vs. Classic Mesa

Gallium3D, the graphics driver architecture started by Tungsten Graphics to overhaul the hardware driver support in Mesa, has been around for a few years but it is finally getting close to appearing on more desktop systems. Now that the Nouveau DRM code is in the mainline Linux kernel and its main 3D driver is Gallium3D-based, we will hopefully be seeing that adopted by more distributions soon -- it's already being flipped on with Fedora 13. On the ATI side the "r300g" Gallium3D driver that provides Gallium3D support for the R300-R500 (up through the Radeon X1000 series) is also being battered into surprisingly good shape. To see where the Radeon Gallium3D support is at for these older ATI graphics cards we have run a set of tests comparing the OpenGL performance under the latest Mesa 7.9-devel code with the Gallium3D driver to running the classic Mesa DRI driver.

Probably better to be at a constant fps than chasing the highest possible number, as that would be best for power saving on laptops and for smoothness. What's the point of rendering frames that miss the screen's refresh? 60fps should be the cap in most graphics driver settings.



Anyone think otherwise?

Well, there was that one guy I got into an argument with some weeks back on this very topic... special people are a fact of life, I guess.

When it comes to benchmarking, you usually have vsync off to see how fast the engine/driver/level/whatever can actually run. When you're actually playing a game, though, you almost certainly want it on for a variety of reasons, including the ones you listed.

The problem with having vsync on, from a benchmarking perspective, is that you can't see how far past the monitor refresh rate (be it 60Hz, 75Hz, or whatever) the engine can actually go. There's an important distinction between just barely managing 60Hz and being able to manage 100Hz. If the game is only just hitting 60Hz, then a more complex scene or some other load on the system will cause it to drop below 60Hz, which is problematic for a number of reasons. If the game is capable of running at 100Hz, however, then we know that unexpected spikes in scene complexity are far less likely to drop the frame rate below the desired level.

Remember that with vsync, barely missing 60Hz does not mean that you run at 59Hz; it means that you run at 30Hz. Each frame takes just long enough to miss the first vsync and ends up waiting for the second, halving the rate of frame updates. While it's possible a game could turn off vsync dynamically when the frame rate drops low enough, the result may end up even more jarring in that case, since the game would then periodically have tearing and an uneven frame rate.
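The halving effect above follows from simple arithmetic: with vsync, a frame can only be displayed on a refresh boundary, so its cost is rounded up to a whole number of refresh intervals. A minimal sketch (the function name and numbers are illustrative, not from the article):

```python
import math

def vsync_fps(render_time_s, refresh_hz=60.0):
    """Effective frame rate with vsync enabled: each frame occupies
    ceil(render_time / refresh_period) whole refresh intervals, so
    the displayed rate is refresh_hz divided by that integer."""
    intervals = math.ceil(render_time_s * refresh_hz)
    return refresh_hz / intervals

# A 16.0 ms frame fits inside one 16.7 ms interval  -> 60 fps.
# A 16.9 ms frame just misses it and waits a whole
# extra interval                                    -> 30 fps, not 59.
```

This is why the vsynced rates cluster at 60, 30, 20, and so on, rather than falling off smoothly.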

There are other problems with missing the 60Hz mark, separate from the monitor refresh itself. Many game physics simulations rely on fixed intervals for their updates (especially games where complex environmental physics actually matter to gameplay, which is more and more of them these days). Many games use a 1/60th-second interval to match the lowest common denominator of monitor refresh rates, so dropping below 60fps requires two simulation updates per rendered frame, which just makes things even slower.
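The fixed-interval scheme described above is commonly implemented with an accumulator loop; here is a generic sketch (names and structure are illustrative, not from any particular engine) showing why a slow frame forces multiple simulation steps:

```python
DT = 1.0 / 60.0  # fixed simulation step, matched to a 60Hz refresh

def advance(sim_time, accumulator, frame_time, step):
    """Classic fixed-timestep accumulator: however long the last frame
    took to render, the simulation only ever advances in exact DT
    slices. A frame that took longer than DT forces two (or more)
    steps, costing extra CPU exactly when the game is already slow."""
    accumulator += frame_time
    while accumulator >= DT:
        step(DT)
        sim_time += DT
        accumulator -= DT
    return sim_time, accumulator
```

A frame rendered in 1/60th of a second triggers exactly one `step`; a 34 ms frame triggers two, which is the feedback loop the comment describes.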

I'm not sure whether the seemingly fixed frame rate on the Gallium3D driver is intentional (the numbers do look like vsync behavior on a 60Hz display), but if it is, there should be a way to disable it for benchmarking purposes.
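For the classic Mesa DRI drivers there is a documented knob for this: the `vblank_mode` driconf option/environment variable, where 0 means never sync to vblank. Whether r300g honors it is exactly the open question here. A hedged sketch of running a benchmark with it set (the benchmark command is a placeholder):

```python
import os
import subprocess

# vblank_mode=0 asks Mesa DRI drivers never to wait for vertical
# retrace; 3 forces syncing. Whether the Gallium3D r300g driver
# respects this setting is what would need checking.
env = dict(os.environ, vblank_mode="0")

# Illustrative invocation only -- substitute the actual benchmark:
# subprocess.run(["glxgears"], env=env)
```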


Michael, thanks for talking to the developers about the tests and for supplying some analysis as to what may have been causing the observed results. It makes for a much more interesting benchmark!

+1
Excellent article! Hats off to Michael -- this time you really made interesting tests and interpreted them perfectly. Congratulations.

As for the results themselves, they are pretty good. With this I think it's possible for r300g to become the default driver in Mesa 7.9.
I hope it will be "finished" in the next month or so, so that development effort can go into r600g!



Agreed.

It's very interesting that those flat lines occur right at the 30fps and 60fps thresholds. It screams out to me that there's some kind of refresh-rate-related issue going on, but I suppose it could just be random.

Have you checked whether the OpenGL 2.1 support it advertises is working well? Do these tests exercise it, or are they strictly using 1.5?