Just wondering why you are doing this from X at all?
I know pretty much any app that can be run from the command line takes a bit of a performance hit when run under X.
Maybe I'm missing something, but is it not possible to run it from the command line, or is it a driver issue?
SuSE is not a distro I would ever use; I had it installed but didn't really like it.
I wouldn't mind seeing this done: best-case Linux scenario vs. best-case Windows on the same hardware.

Is there any chance of seeing a 2D performance comparison? Yes, you read that right: 2D.
I ask because the 2D acceleration of some proprietary drivers is ridiculously slow, and people should know about it _before_ buying a graphics card.

From a different angle than some people suggested, it would be very helpful if you'd use the *same scale* on all graphs on a page. That way we can compare the FPS numbers between tests, instead of having to compute them mentally.

One graph ends at 80fps, another at 150fps... the bars sit at the same points, so how are you supposed to notice at a glance that one test is so much faster than the other?

This is a standard graphing concept: use the same scale for all charts that are meant to be compared.

Another thing affecting this review is that Doom 3 provides its own libstdc++.so and libgcc.so libraries; replacing them with symlinks to the system's own built-in libs results in a decent performance boost. This is probably a bit more noticeable on a distro like Gentoo, which has optimized these libs specifically for the architecture. The kernel being used can also have a big impact: the vanilla 2.6 kernel has a rather (IMHO) ugly scheduler that allows programs like Doom 3 to get run over a bit more than they should, and the ck patchset improves this a good amount. As for how well gcc has optimized the Doom 3 code, an interesting comparison would be the performance of the Windows version of Doom 3 running under Wine. Granted, the overhead of Wine might affect some things, but it would help you figure out where exactly the performance losses are coming into play.
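The symlink swap this commenter describes is easy to script. A minimal sketch, assuming a typical install path and system library locations of the era; `DOOM3_DIR`, `/usr/lib/libstdc++.so.5`, and `/lib/libgcc_s.so.1` are all assumptions to adjust for your own system:

```shell
#!/bin/sh
# Sketch of the library-symlink trick described above. DOOM3_DIR and the
# system library paths are assumptions; adjust them for your install.
# Run as root, and keep the bundled copies around as a fallback.
DOOM3_DIR=${DOOM3_DIR:-/usr/local/games/doom3}

if [ -d "$DOOM3_DIR" ]; then
    cd "$DOOM3_DIR" || exit 1
    # set the bundled copies aside instead of deleting them
    for lib in libstdc++.so.5 libgcc_s.so.1; do
        [ -f "$lib" ] && mv "$lib" "$lib.bundled"
    done
    # point the game at the distro-built (possibly arch-tuned) copies
    ln -sf /usr/lib/libstdc++.so.5 libstdc++.so.5
    ln -sf /lib/libgcc_s.so.1      libgcc_s.so.1
fi
```

If the system libraries turn out to be incompatible, moving the `.bundled` files back restores the original setup.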

You compared Windows vs Linux performance on a 6800 with the 5950 driver for Linux (only)?

As far as I'm concerned, that makes this benchmark pretty much useless. The latest driver, as of release of this article supports the 6800 perfectly fine here. Since it doesn't seem to be the case on your system, did you ever consider that there might be other issues with the system?

Furthermore, the whole point is pretty much moot seeing that this test was performed on kernel "2.6.8-14-default". A Linux system is NOT meant to be used with a default kernel. You can use one to boot from a live CD, but that's about it. They are supposed to give you something to start with, not something to live with. Even SuSE ships with automated tools to generate a kernel for your system.

If you are serious about comparing Linux vs Windows, use a distribution or system that supports your hardware 100% and don't stick with a non-optimized kernel.

If you were really serious, you would also talk about the various possible optimizations on an open source system (compiling code for your specific system *will* make it *a lot* faster), mention high performance distributions like Gentoo and use an optimized glibc2 which already gives you another 9-11% performance over the one provided with doom3 (which seems to be compiled without any optimization flags for whatever reason).

You left a lot of Linux potential untapped and then wonder why it turns out to be slower. Hopefully the next article will be more representative. Sadly, this one is just pretty pictures without any meaning. :/

My own results are similar to poster #18's: on an AMD64 3200+, FX6800, Gentoo (-O3 -march=athlon-xp), it's 25% faster than on WinXP; about 30-35% after replacing the provided (crappy) glibc with a symlink to the one my Gentoo system normally uses.

This review may have been written by someone who spent decades in Windows and applied all the latest tweaks and tuning tools to it, while on the Linux side things probably don't look that bright yet.

Unfortunately the article doesn't mention whether it was the 32bit or 64bit SuSE distro that it tested.
64bit has been reported to make a difference of up to 30% or 40% CPU-performance-wise *sometimes*, so in exchange for the much worse Linux graphics driver optimization, why not bring the much worse 64bit CPU "optimization" of Windows into the game? ;-)
(64bit Linux drivers for Nvidia have been available for a few weeks/months, right?)

As has been said by #28, there are also enormous speed differences between various distros, e.g. Yoper or Gentoo should be blindingly fast compared to a rather slow "standard" Red Hat, and I'd guess that SuSE isn't totally on the fast side either.

I'd be willing to bet that with the fastest Linux distro and the fastest currently available kernel (e.g. a Con Kolivas kernel), Linux could actually beat Windows hands down, especially with a rather "slight" 25% performance difference as it stands now, despite less optimized graphics drivers...
And then let them enable SSE2 in the Linux binary and use gcc 3.4.x for compilation and let them have a well-optimized Nvidia driver etc., and the speed advantage should be very obvious.

Main point: a bog-standard off-the-shelf distro isn't necessarily what you'd want to use for very fast gaming.
(but OTOH you perhaps didn't tune Windows in a special way for this review either, so the comparison of currently available "standard" components should have been fair after all, I guess...)

Perhaps you guys should learn a little more about linux before you continue with these comparisons. Don't get me wrong, I think it's _great_ that you're doing linux comparisons, but a little more knowledge would be helpful.

First, you're using the 2.6.8 kernel for this test, which is known to be buggy. The major problem: the scheduler. How does this affect game and graphics performance? Greatly. This is why I'm using the 2.6.7 kernel, compiled from source from www.kernel.org.

Secondly, since the port has been under development for a while, it was probably done with a 2.4.x series kernel. That series of kernel would have been my first choice for testing.
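For readers who have never done it, compiling a kernel from source as this commenter suggests looks roughly like the following. A sketch only: the 2.6.7 version matches the commenter's choice, the URL layout is kernel.org's of that era, and the final bootloader step (lilo/grub) varies by distro.

```shell
#!/bin/sh
# Rough outline of building a vanilla 2.6.7 kernel from kernel.org.
# Run as root; this is a sketch, not a complete recipe.
if [ -w /usr/src ]; then
    cd /usr/src
    wget http://www.kernel.org/pub/linux/kernel/v2.6/linux-2.6.7.tar.bz2
    tar xjf linux-2.6.7.tar.bz2
    cd linux-2.6.7
    make menuconfig            # enable only what your hardware needs
    make && make modules_install && make install
    # then add the new kernel to lilo/grub and reboot into it
fi
```

Trimming the config down to just your hardware is where most of the benefit over a "default" kernel comes from.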

Third, why have a full desktop environment loaded up to play a fullscreen game (KDE)? You _want_ overhead introduced into the testing?

Fourth, you should always test multiple distros. As is, this is _not_ a Windows versus Linux comparison; it is a SUSE versus Windows comparison. At a minimum, try SUSE, Mandrake, Fedora, Gentoo, and possibly Slackware (my distro of choice). Each one has its own quirks. Slackware is truly linux, because the vendor does not modify any of the source; however, it is much more difficult to configure correctly. Gentoo is a great distribution because it is designed to be compiled specifically for the machine it is installed on. Using the straight source packages rather than vendor-modified packages also goes a long way towards creating a fair test.

Finally, learn to do the configuration from a shell rather than from a GUI. This is the way it is done by the vast majority of the linux community. It will let you see exactly what's going on, rather than scratching your heads over questions such as, "We're not sure why the reboot was required...". This is linux, not windows; if you know what you're doing, you should never have to scratch your heads over these types of questions. If you had done it from a shell, a reboot would not have been necessary. Most likely, the reboot was required because you use a graphical login and the X server had to be restarted, _not_ the whole operating system.
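Restarting X without rebooting, as this commenter describes, is just a runlevel change on a SysV-init system of that era. A sketch under assumptions: `rcxdm` is SuSE's display-manager init script, and the guard keeps this a no-op on systems without SysV init.

```shell
#!/bin/sh
# Sketch: restart the X server, not the machine. Run as root from a
# text console (Ctrl-Alt-F1). rcxdm is a SuSE-specific assumption.
if runlevel >/dev/null 2>&1; then
    init 3    # leave the graphical runlevel; X and the login manager stop
    # ...edit /etc/X11/XF86Config, swap NVIDIA driver versions, etc...
    init 5    # return to the graphical login
    # alternatively, just bounce the display manager:
    # rcxdm restart
fi
```

Either way, only the X server and its clients restart; the kernel and everything else keep running.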

Otherwise, thanks for finally taking linux seriously. Just please do the comparisons right. As I'm sure it took some time to learn the ins and outs of windows, it will also take some time to learn the ins and outs of linux, as it is a very different operating system. Hiring a linux guru would give you a good head start.

#20... If it's not doing 16x, how do you explain the performance drop? Just because the images are exactly the same does not mean it's not doing the work. There's a limit to antialiasing, meaning there's a limit to how smooth an edge can be. I think in many cases 8x hits that limit, and 16x is just overkill. But just because there's no difference in this particular screenshot doesn't mean there's never a difference; I'm sure it makes some kind of difference in most cases, even if only a few pixels. Just imagine if 32xAA existed: compare 16x to 32x and I doubt you would ever be able to find any difference, but that doesn't mean the card isn't doing extra work. With 8x and 16x you are starting to get into the territory where anything above 8x makes little or no difference.

xp:
fresh copy with nothing more than a few games installed
NV driver: 66.72

At the same settings, the game feels noticeably smoother on Linux. Thanks to ReiserFS, the loading time is also much faster. Sorry for no benchmarks, but I got rid of the Windows install after my first time playing on Linux. I had problems with the 6111 driver crashing, but 6106 works flawlessly (although I can switch drivers in less than a minute without even rebooting). I can't wait until nvclock supports overclocking on the 6800 series.

I'm a little disappointed that all the testing was done on SuSE. The beauty of Linux is being able to customize and optimize just about anything. I realize that SuSE is a distro the average Joe is likely to use, but I think you should also include some scores from a simple, fast, optimized distro/kernel such as Slack or Gentoo to show what Linux is really capable of.

#14 - bunk. All you can take away from this review is that Windows "owns" Linux for D3 1.1 performance. You can't generalize this to game performance in general. Most games show less than a 5% difference.

Interesting article. Was it just me, or are the difference maps just pure black? Also, I don't think things like 16xAA are really needed; after 2x or 4x, the difference between anything higher and the original image is negligible, and definitely not noticeable when playing. Maybe if you really take the time to admire the scenery, but who does that when fighting monsters??

According to an article over at LinuxHardware.org, the Linux build of Doom 3 is currently less optimized than the Windows build.
Most notably, SSE2 code is missing (but will be added later), and GCC does not optimize as well as VC.net.
Details here:
http://www.linuxhardware.org/article.pl?sid=04/10/...

I guess we can expect the gap to get smaller in later Doom 3 builds, but not to disappear entirely.

Neither NV nor id has much of a reason to optimize heavily for linux, and it's entirely unsurprising that the windows version performs better. Nevertheless, I expect that the gap will close considerably as the Doom 3 engine matures.

The AA pictures won't work for me. When I click on them, I get a new window with the alert 'hold your mouse over the picture' and then the page doesn't load. When I hold my mouse over the original image, nothing happens. Also, the black pictures to the right of those are way too black to tell a thing about them. Interesting article though. I wonder if 16x AA would be playable at a lower resolution.
Jason

That bit about the 16x AA you can enable under Linux is certainly interesting. Maybe you can try it on a less performance-demanding game than Doom 3.
I'd love to see UT 2004 and Enemy Territory benchmarks for this mode... :-)

From reading your article, I'm guessing that you ran Doom 3 from within KDE on SUSE. Can you run some Doom 3 benchmarks from a super-lightweight window manager like blackbox, or better yet just the failsafe xterm? I'd be interested in how much running a bloated desktop environment like KDE or GNOME slows down gaming performance.
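One way to run the experiment this commenter asks for is to start a bare X server on a spare display with nothing but the game (or a minimal window manager). A sketch: the game's install path, the free display number `:1`, and the use of the `com_showFPS` console variable are all assumptions, not details from the article.

```shell
#!/bin/sh
# Sketch of benchmarking Doom 3 outside KDE: a bare X session with
# nothing but the game. Adjust DOOM3 and the display number as needed.
DOOM3=${DOOM3:-/usr/local/games/doom3/doom3}

if [ -x "$DOOM3" ] && command -v xinit >/dev/null 2>&1; then
    # no desktop environment at all: just the X server and the game
    xinit "$DOOM3" +set com_showFPS 1 -- :1
    # for the blackbox comparison, use a throwaway xinitrc instead:
    # printf 'blackbox &\nexec %s\n' "$DOOM3" > /tmp/xinitrc.bench
    # xinit /tmp/xinitrc.bench -- :1
fi
```

Comparing timedemo scores from this bare session against the same run inside KDE would show exactly how much overhead the desktop environment adds.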

Putting the Linux and Windows FPS numbers on the first couple of pages on the same chart would have been very helpful (even more helpful if they were color-coded or something). As it is, it was pretty tough to compare the two.