
Linux Can Deliver A Faster Gaming Experience Than Mac OS X

Phoronix: Linux Can Deliver A Faster Gaming Experience Than Mac OS X

Earlier this week on Phoronix were new benchmarks of Ubuntu Linux vs. Mac OS X using a new Apple MacBook Pro with an Intel Core i5 CPU and an NVIDIA GeForce GT 330M graphics processor. Looking at the test results overall, it ended up being a competitive race between these two Microsoft Windows competitors. In some areas, like OpenCL computational performance, Apple's operating system commanded a sizable lead. In other areas, like OpenGL graphics performance, Ubuntu Linux backed by NVIDIA's official but proprietary driver was in control. Here's an additional set of tests showing the measurable leads of NVIDIA on Linux over Mac OS X with Apple's NVIDIA driver.

I wonder if NVIDIA actively helps with the development of Apple's drivers, or if Apple is just given documentation and does the rest.

NVIDIA develops its own drivers, but Apple controls the graphics stack. This is like Direct3D on Windows, where Microsoft provides the stack and IHVs implement a specific interface to communicate with the actual hardware.
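To make that split concrete, here's a toy Python sketch (all the names are invented for illustration; this is not Apple's or Microsoft's actual interface): the platform vendor owns the stack and the API the game sees, while the IHV only implements a narrow hardware-facing interface underneath it.

```python
from abc import ABC, abstractmethod

# The platform vendor (Microsoft, Apple) defines the driver-facing
# interface; each IHV (NVIDIA, AMD, ...) ships an implementation of it.
class GpuDriverInterface(ABC):
    @abstractmethod
    def create_context(self) -> int: ...

    @abstractmethod
    def submit_commands(self, ctx: int, commands: list[bytes]) -> None: ...

class VendorDriver(GpuDriverInterface):
    """The IHV's part: it only knows how to talk to its own hardware."""
    def create_context(self) -> int:
        return 1  # a real driver would allocate GPU state here

    def submit_commands(self, ctx: int, commands: list[bytes]) -> None:
        pass  # a real driver would translate and dispatch to hardware

# The vendor-controlled stack (think the Direct3D runtime, or Apple's
# OpenGL framework) owns what the application sees and calls down into
# whichever driver is installed.
def render_frame(driver: GpuDriverInterface) -> None:
    ctx = driver.create_context()
    driver.submit_commands(ctx, [b"draw"])

render_frame(VendorDriver())
```

So on Windows and OS X the IHV can only optimize its half, whereas NVIDIA's proprietary Linux driver ships essentially the whole OpenGL stack itself, which is part of why it can pull ahead.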

While Ubuntu clearly outperforms OS X in all these benchmarks, it's also worth pointing out that in most of them the better results are probably not going to be visible, because the frame rates are far higher than the monitor's refresh rate. The most demanding game seems to be Nexuiz, where OS X doesn't reach the 30fps threshold at any resolution. All the other games stay above 60fps, sometimes by such a huge margin it's almost a joke. A lot of LCD monitors are locked to a 60Hz refresh rate.
That's for average fps, but something that can hurt perceived performance is the minimum fps; if a game frequently dips into sub-30fps unplayable territory under load, it can be more frustrating than a constant 45-50fps experience. Even the lowest of the low (non-Nexuiz) fps here is ~70, so with these games it'd be hard to argue that Ubuntu gives a noticeably smoother experience. Actually, that makes me curious about quality/AA settings used in these benchmarks. Automatically maxed out?
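Just to put toy numbers on that, here are two made-up frame-time traces with almost the same average fps but very different minimums, which is exactly what an averages-only graph would hide:

```python
# Two hypothetical frame-time traces (milliseconds per frame) with
# similar averages but very different worst cases.
steady = [22.0] * 100               # ~45 fps on every single frame
spiky = [12.0] * 90 + [120.0] * 10  # mostly fast, with ten big hitches

def avg_fps(frame_times_ms):
    # Average fps = total frames / total time, not the mean of per-frame fps.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    return 1000.0 / max(frame_times_ms)

for name, trace in [("steady", steady), ("spiky", spiky)]:
    print(f"{name}: avg {avg_fps(trace):.0f} fps, min {min_fps(trace):.0f} fps")
# steady: avg 45 fps, min 45 fps
# spiky: avg 44 fps, min 8 fps  <- same ballpark average, far worse feel
```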

Actually, that makes me curious about quality/AA settings used in these benchmarks. Automatically maxed out?

I don't think any of these tests make use of AA by default.

BTW, the minimum fps that you see in the graphs isn't the absolute minimum framerate the game hit, but rather the lowest of the average framerates (since the graphs only show averages) across all the tested resolutions. The current PTS doesn't record minimum and maximum framerates for each test run, only the average. It would be cool if it did, though; as you said, that's a much better measure of playability than average fps in cases where the average is above 60fps.
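PTS itself is written in PHP, but just to sketch the idea: if a test profile could hand back per-frame times, reducing them to min/avg/max per run is only a few lines (the function name and input format here are made up).

```python
def summarize_run(frame_times_ms: list[float]) -> dict[str, float]:
    """Reduce one run's per-frame times (ms) to the three numbers a
    results graph would want, instead of the average alone."""
    per_frame_fps = [1000.0 / t for t in frame_times_ms]
    return {
        "min_fps": min(per_frame_fps),
        "avg_fps": 1000.0 * len(frame_times_ms) / sum(frame_times_ms),
        "max_fps": max(per_frame_fps),
    }

print(summarize_run([16.7, 16.7, 33.3, 16.7]))
# -> {'min_fps': ~30, 'avg_fps': ~48, 'max_fps': ~60}
```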

If you've ever played online FPS games, you'll know that anything below 80fps is not acceptable. The 60Hz figure is what doctors once settled on, but it turned out to still strain the eyes (on CRTs), so it was later raised to 70Hz; the lowest refresh rates that genuinely don't strain the eyes start at 85Hz. Same for fps: 60 is acceptable, 45 and lower is unplayable. And of course you remember the original 24fps recommendation, which is absolute horror; cinema handles fast action by blending several fast frames into one 24fps frame so it "looks" fast (if you pause, such a frame looks totally unsharp).
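That blending trick is easy to demonstrate, for what it's worth. Here's a rough NumPy sketch (a 96fps source rate is picked arbitrarily, and frames-as-arrays is an assumption) that averages groups of fast frames into 24fps ones, which is where the motion-blur smear on paused frames comes from.

```python
import numpy as np

def blend_to_24fps(frames_96fps: np.ndarray) -> np.ndarray:
    """Average every 4 consecutive high-rate frames into one 24fps frame.
    Crisp individual frames become one smeared frame, so fast motion
    'reads' as fast -- and a paused frame looks unsharp."""
    n, h, w, c = frames_96fps.shape
    assert n % 4 == 0, "expects a multiple of 4 source frames"
    groups = frames_96fps.reshape(n // 4, 4, h, w, c).astype(np.float32)
    return groups.mean(axis=1).astype(frames_96fps.dtype)

# e.g. 8 dummy 'frames' at 96fps -> 2 blended frames at 24fps
frames = np.random.randint(0, 256, (8, 4, 4, 3), dtype=np.uint8)
print(blend_to_24fps(frames).shape)  # (2, 4, 4, 3)
```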

In fact, my HD 4770 system with an Athlon II X4 630 reaches only 60fps with the open-source Radeon driver (at full HD, though) in OpenArena, and it is much less playable than the NVIDIA 8300 chipset system with the proprietary driver that I'm typing on right now (not at home), which gets 120fps+.

We could talk for 100 pages about how LCD vertical retrace is limited to 60Hz anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems side by side to be able to compare. Of course, some people are so slow that they cannot distinguish 30 from 60fps. It's highly personal and reaction-based.
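To be fair to the 60Hz side of that argument, a vsync-style cap really does flatten any raw fps lead above the refresh rate. A toy Python frame limiter shows the effect (real drivers sync to the display's retrace rather than sleeping; this is just the idea):

```python
import time

def run_capped(render_one_frame, cap_hz=60.0, seconds=1.0):
    """Crude frame limiter: sleep out the rest of each frame slot, the
    way vsync pins a game to the display's refresh rate."""
    frame_budget = 1.0 / cap_hz
    deadline = time.perf_counter() + seconds
    frames = 0
    while time.perf_counter() < deadline:
        start = time.perf_counter()
        render_one_frame()
        left = frame_budget - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)  # wait for the next 'refresh'
        frames += 1
    return frames

# Whether rendering takes 1 ms or 10 ms, the capped result is ~60 fps.
print(run_capped(lambda: None))  # ~60
```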