A blog by the Valve Linux Team

One factor in creating a good gaming experience is throughput. This post discusses some of what we’ve learned about the performance of our games running on Linux.
As any software developer can tell you, performance is a complicated issue. In the interests of simplicity, we’ll concern ourselves with the following high-end configuration:

Hardware

Intel Core i7 3930k

NVIDIA GeForce GTX 680

32 GB RAM

Software

Windows 7 Service Pack 1 64-bit

Left 4 Dead 2

Ubuntu 12.04 32-bit

We are using a 32-bit version of Linux temporarily and will run on 64-bit Linux later.
Running Left 4 Dead 2 on Windows 7 with Direct3D drivers, we get 270.6 FPS as a baseline. The data is generated from an internal test case.

When we started with Linux, the initial version we got up and running was at 6 FPS. This is typical of an initial successful port to a new platform.
Performance improvements fall into several categories:

Modifying our game to work better with the kernel

Modifying our game to work better with OpenGL

Optimizing the graphics driver

An example of the first category would be changing our memory allocator to use more appropriate Linux functions. This was achieved by implementing the Source engine small block heap to work under Linux. The second category would include reducing overhead in calling OpenGL, and extending our renderer with new interfaces for better encapsulation of OpenGL and Direct3D.
The third category is especially interesting because it involves working with hardware manufacturers to identify issues in their drivers and, as a result, improving the public driver which benefits all games. Identifying driver stalls and adding multithreading support in the driver are two examples of changes that were the result of this teamwork.
After this work, Left 4 Dead 2 is running at 315 FPS on Linux. That the Linux version runs faster than the Windows version (270.6) seems a little counter-intuitive, given the greater amount of time we have spent on the Windows version. However, it does speak to the underlying efficiency of the kernel and OpenGL. Interestingly, in the process of working with hardware vendors we also sped up the OpenGL implementation on Windows. Left 4 Dead 2 is now running at 303.4 FPS with that configuration.

OpenGL versus Direct3D on Windows 7

This experience led to the question: why does an OpenGL version of our game run faster than Direct3D on Windows 7? It appears that it’s not related to multitasking overhead. We have been doing some fairly close analysis, and it comes down to a few additional microseconds of overhead per batch in Direct3D which does not affect OpenGL on Windows. Now that we know the hardware is capable of more performance, we will go back and figure out how to mitigate this effect under Direct3D.

Working with hardware vendors

We’ve been working with NVIDIA, AMD, and Intel to improve graphic driver performance on Linux. They have all been great to work with and have been very committed to having engineers on-site working with our engineers, carefully analyzing the data we see. We have had very rapid turnaround on any bugs we find and it has been invaluable to have people who understand the game, the renderer, the driver, and the hardware working alongside us when attacking these performance issues.
This is a great example of the benefits that are the result of close coordination between software and hardware developers and should provide value to the Linux community at large.

New Comment Policy

Starting with this post and in the future, only comments that contribute in a substantive way to the current discussion will be approved. We appreciate the excitement and support but we want the comments to be readable and informative, so please keep that in mind when you write a comment.

394 Responses to Faster Zombies!

In terms of the driver to be used for gaming, the AMD and Nvidia drivers are closed source (there are open drivers, but they’re not in a usable state for gaming yet), and the Intel one is open source, but Intel had never made a “gaming” GPU until their recent generations of CPUs.

So basically, I’d love to see any discussion on whether Intel is a viable option for gaming on Linux, so that you don’t need to mess around with binary-only display drivers.

Also, if you have any feedback for the AMD open-source team or the community that does the Nouveau driver, I’m sure they’d love to hear from you.

>So basically, I’d love to see any discussion on whether Intel is a viable option for gaming on Linux, so that you don’t need to mess around with binary-only display drivers.
Ivy Bridge’s HD4000 is almost usable in terms of performance if it’s not thermal/power constrained to 17 W. It has 16 execution units.
Next year’s core, Haswell, will have 40 execution units in its best bin. That’s 2.5× faster even if they don’t change anything in the shaders, which is unlikely. Plus 64 MB of on-package memory to deal with the bandwidth problem.
With that performance and an official open-source driver, Intel will be the best choice for gaming on Linux next year, at least in notebooks. For desktops, though, a discrete card plus the binary driver will of course give better performance. Reverse-engineered open-source drivers are no good for gaming (or power management and many other things), so if you want an OSS driver, Intel is your best bet.

>> “…With that performance and official open-source driver Intel will be the best choice for gaming in Linux next year, at least in notebooks…”

Open-source drivers for Intel may make the game playable, but would anyone really take the gratification of using an open-source driver if it meant awful performance, compared to just installing the proprietary driver from NVIDIA or AMD?

Also, I know what you meant about the open-source driver being great in notebooks and whatnot, but you shouldn’t overlook the A8 and A10 APUs. Those things are little powerhouses and, more often than not, outperform anything Intel can bring to the “integrated GPU” table. I think once this all happens, if I find myself wanting a modest, power-efficient laptop that can play games at minimal or decent settings, I’ll be getting one with a nice A10 Trinity APU.

Yes, the Intel drivers have excellent support for XRandR and other Linux features that are typically “afterthought” hacks in the proprietary driver world. Also, things like display brightness and HDMI audio work much better with the Intel drivers.

HDMI audio is certainly a big letdown in the AMD drivers, although not very gaming related perhaps, at least not in a notebook. If you’d want to play a movie connected to more than two speakers, though, AMD is certainly not what you want in a Linux-powered machine today.

I also generally find their binary drivers lacking compared to the Nvidia ones, although that is supposed to have improved in recent years. Seeing as AMD at least publishes the specs and other details you need to make a proper driver, I still kind of hope we’ll get a decent open-source AMD GPU driver someday.

>>”Open-source drivers for Intel may make the game playable, but would anyone really take the gratification of using an open-source driver even if it meant awful performance compared to just installing the proprietary driver from NVIDIA or AMD?”

The Intel driver as such has a pretty good reputation. The difference is in the hardware, a HD4000 is significantly weaker than a discrete GPU from NVIDIA or AMD.

But depending on the game, it may be enough. In a review two years ago, its predecessor, the HD3000, was compared to an aging NVidia 8600 GT and came out with similar performance. And I can confirm that the 8600 GT will do fine for games from a few years ago.

I think that once games are optimized to run on Intel HD 2000/3000 (Sandy Bridge) and on HD 2500/4000, a discrete GPU won’t truly be needed. Sure, discrete GPUs have more horsepower and run many games at higher quality settings, etc.

IMO, it’s a question of getting game devs to optimize their games as well as possible for Intel HD GPUs.

Once you reach “good enough” performance, what more do you want? There’s a point at which more horsepower just isn’t necessary for a given game. 60fps on decent quality settings is all anyone really needs. Beyond that, the only purpose of bigger numbers is to give geeks a hard-on.

I agree about the nouveau situation. But radeon is not a reverse-engineered driver: AMD releases specifications for their graphics cards. Unfortunately they sometimes seem to have a hard time with legal review, and so it’s not always enough.

I wouldn’t bet on next year’s “best for gaming” yet, not even for notebooks. radeonsi is not in a very good state yet, but much can happen in a year. I think I will be buying a notebook with an HD 7970M in the next few weeks/months, and I don’t see Intel speeding up their graphics so much that it can come anywhere near that card in only one year. The strongest reason is that they probably don’t want to burn that much power in their integrated graphics, for thermal reasons alone.

Yes agreed. On microdesktops and notebooks alike, the APUs face the same problem: Support hardware.

Thermal control hardware and voltage supply hardware are key to getting maximum performance out of them. The desktop models can be drastically overclocked and run incredibly if they have good on board power hardware, but if you get a cheap motherboard, you might just get errors running them at stock speed.

My mother’s A-8 laptop runs far faster now that I switched on active cooling for her. I can’t believe they set it to reduce CPU power BEFORE attempting to turn on active cooling. (I just wish she’d switch to Linux, but that would be an uphill battle. I’m happy enough that I got my wife to use Linux from the first computer I got her.)

The FOSS AMD driver as of yet is usable for most things but generally not desirable for gaming: you’ll get playable framerates in some less graphically intense games, but it’s stuck at a little over half the performance of the binary blob. If AMD were to take it more seriously, it would be much better off.

Also, the FOSS driver doesn’t yet have stable support for Radeon HD 7000 cards, because the architecture change was quite drastic and the AMD FOSS driver team is understaffed.

I’d like to see AMD take it more seriously, but that seems to be a tad wishful; for now I choose Intel graphics for everything I do.

The AMD OSS driver is mostly usable, but it lacks in some areas, namely power management (or rather, dynamic reclocking doesn’t work flawlessly yet) and video acceleration (using the special-purpose hardware bits; I think people are working on getting things accelerated using the 3D/compute engine).

Performance is getting better but isn’t quite there yet; I can barely play Minecraft on my Radeon 5850, for example. I’m currently suffering from X crashing occasionally when playing videos, and DPMS doesn’t work with my monitor (it’ll just endlessly wake up, go to sleep, wake up again, and so on). Gnome 3 feels a bit sluggish with the OSS driver, and it gets slower over time for some reason, so I tend to have to restart X a lot. Enabling my second monitor or playing a video also makes Gnome 3 even more sluggish.

These are anecdotal experiences of course, but I seem to have mostly the same issues with my 4850 in a different PC currently.

Overall though the driver has made solid progress over the past few years, and I think OpenGL 3.x support is almost done. So I’m confident they’ll get there; I just wish they were able to work faster, but I’m sure it’s hard enough as it is with just a handful of people working on it. The main reason I don’t use the proprietary driver is that I’ve tired of dealing with all the X/kernel incompatibilities it has, since I like livin’ on the edge.

The newest Open Source Radeon drivers are all written from documentation provided by AMD.

AMD is providing a surprising amount of support to the project, including complete developer’s documentation and even some direct development support. Intel still has done more, but not by much.

The Radeon Open Source driver is currently faster than the proprietary one on everything before the Radeon HD 4000 series and is on par with the proprietary driver on the Radeon HD 4000 series.

As for later chips, the 5000 series is currently still a little slower with the open source drivers, the 6000 series is significantly slower, and a small handful of features aren’t implemented in hardware yet. The 7xxx series is practically non-functional, as about a quarter of the features are still handled in software, so I would not recommend even trying the Open Source drivers on that series.

As it is, the AMD APUs are still much faster with the Open Source drivers than the Intel integrated graphics. However, this is mainly due to much higher-performance GPUs being integrated into them. (The AMD A8 contains the equivalent of an HD 6550, and it can easily be overclocked to the equivalent of an HD 6670 if you focus on the GPU instead of the CPU.)

Well, they surely have to support the FOSS drivers as well since, at the least, Intel has only FOSS drivers. And recently ATI dropped support for all chipsets below Evergreen (<HD5600, maybe) in their proprietary driver (fglrx), so there is a big slice of not-so-old hardware which is supported only by the FOSS drivers.

But, on the other hand, AMD and Intel have made great improvements to the Mesa code, especially to r600g. As they improve Gallium3D, all FOSS drivers benefit, including nouveau (but I'd rather disbelieve that nouveau will ever be rated by anyone as a "supported" driver, due to its unofficial nature).

I was able to run Diablo 3 under Wine on r600g with only slightly lower FPS than on fglrx (and, hopefully, without the many artifacts the game was suffering from before), but still, there is much work to do, as even fglrx performance was harsh in comparison with the proprietary Nvidia drivers on comparable hardware.

And still, the FOSS drivers for Radeon are unable to provide full software support for the hardware features. And it's hard to speak about any modern graphics until OpenGL 3.3 (core profile) is supported. Even considering OpenGL's much more agile extension system (in comparison with DirectX's strict feature set per version), the biggest problem in supporting a new version of OpenGL in Mesa (the FOSS drivers) is the new GLSL shading language.

ATI made a new breakthrough with their new r600 LLVM compiler, which may eventually make its way into the generic Gallium3D infrastructure (at least parts of it), so everyone will benefit. But again, there is still much work to do: it's not yet capable of doing the same amount of work as the current TGSI compiler, and there are many things to implement before it is "commercially usable", like optimizations, error handling and so on.

In short, I guess that with Valve's release of Steam and Left 4 Dead, things will only begin to improve, but I doubt the FOSS drivers will be ready upon release. But surely the existence of a solid platform and the rise in interest will give a huge boost to development efforts in Mesa, so as more games emerge from Valve, the FOSS drivers will catch up eventually.

The reason they dropped the support is that on those chipsets the Open Source driver now performs better, and with the addition of the new open-source S3TC library, those chips are now feature complete.

They are still maintaining a “legacy driver” branch that supports the old chips, but it’s mainly just fixes for critical bugs, and keeping it compatible with newer software. I wouldn’t expect to see any new features or improved performance in updated releases of the legacy branch.

I usually prefer open-source drivers when I have the option, but I would be willing to use a proprietary driver as long as it doesn’t create issues and integrates well with the system.

I have experience with nVidia and Intel (never tried AMD on Linux), and I tried both the nVidia proprietary drivers and Nouveau compiled from source.

When you choose a driver, performance is not always the first thing you look at. With the proprietary drivers I sometimes had stability issues, and upgrading the kernel is a little annoying: you have to recompile the nvidia driver and you can’t just boot a previous kernel; the nVidia driver apparently puts some files outside the kernel module sandbox.

Furthermore, I experienced issues switching to the console (CTRL+ALT+F1) and attaching external displays.

Overall I found the recent Nouveau driver better at integrating with the system: lacking in performance and not supporting some features (example: SLI), but less annoying.

Maybe the nVidia driver is better on power consumption, I don’t know; I almost never run on battery on my home laptop (the nVidia one).

The best option on Linux if you do not want to experience issues is Intel, which does a great job of supporting Linux (even for non-GPU hardware).

In the end I think nVidia and AMD are wasting their effort creating proprietary drivers when they could just work with the kernel developers and the many skilled developers around to build a well-integrated, performant and stable open-source driver, simply by providing specifications and documentation for their hardware.

There is a difference between a graphics driver and a game being closed source.

While personally I would prefer to have the game engine be open source and the game data and script package be closed source, that is idealism. The case for a graphics driver being open source is far stronger. The advantage of using an open-source graphics driver for game development is incredible: you can see exactly how and why you are running into bottlenecks, and how to avoid them.

On the note of development, the AMD HD 6000 series is already faster at OpenGL ES and OpenVG on the Open Source drivers. This is a good sign.

Great writeup! I think it might be useful to add milliseconds per frame next to the FPS numbers, e.g. “315 FPS (3.17 ms/frame),” to make it easier to understand the relative differences in performance.

Agreed. This is a cool article, but at framerates this high the differences in frame time are very small, since FPS is the inverse of the frame time.

I would be really interested in seeing stats from rendering a more complex scene that was around ~60FPS and also comparing CPU and GPU limited scenes since that would minimize overhead/cosmic rays/etc.

I’ve seen people ask for ms/frame before, but I don’t get why. How does that make it easier to understand? If they want to stress the relative performance, wouldn’t it be easier to just state percentages?

270 to 315 FPS -> 16.7% more frames per second -> 16.7% increase in performance

3.70 to 3.17 ms/frame -> 14.3% less time to render a frame -> 16.7% increase in performance

As Wolfos and others mention, differencing FPS numbers is meaningless, whereas differencing ms is a great way to measure the cost or improvement of an algorithm.

Percentage/ratio comparisons are “correct”, but misleading. If a game is running at 5000fps for instance and I can get it to 10000fps, that’s a 2x increase, so a big deal right? In reality I’ve just shaved off 0.1ms… once the game is running at a more reasonable rate, that’s just noise. At 60fps that optimization doesn’t even get me to 61fps…

Measuring with time/milliseconds is just the most useful and least misleading way to report performance.

Mmm, that’s very interesting. Getting that kind of performance natively in 3D games should have a good impact on the Linux gaming platform.
By the way, it would be interesting to see the results with an AMD video card as well. The official Radeon drivers are known to not be as effective as Nvidia’s on Linux.

That used to be extremely true, but since AMD bought ATI the drivers have made leaps and bounds and have pretty much caught up to the Nvidia drivers. I’ve been running straight AMD since the 5870 came out and it’s been pretty solid.

I, too, have been using AMD’s proprietary blobs over the open-source ones. Don’t get me wrong, I love a lot of aspects about open-source software and all, but the proprietary drivers compared to the others are better by outrageous amounts. And my experience with AMD on Linux has been less frustrating than in the past, as well. I currently have a Radeon HD 6850 that runs better than my old Nvidia card did. That may just be my example, but the differences between the two on Linux are getting smaller. AMD’s been doing a decent job as of late.

Well, I personally prefer the NVidia binary blobs over Nouveau. The rig I run Linux on has an SLI setup of two linked GTX260s plus an onboard 8200GT. I can’t use anything but the first GTX260 with Nouveau. With the binary driver, SLI is actually working and rendering everything. That said, I once ran Steam on that box using Wine, and Team Fortress 2 was actually quite playable at 1280×720 with SLI, averaging 30 fps. I’d love to see how well TF2 would run natively on my setup.

Speaking of which, I’m still not seeing the same gains in overall performance with SLI or Crossfire on Linux as I did on Windows. I wonder if this is Nvidia missing features they had access to on Windows, something lacking in the kernel, or something in the framework, i.e. the X server or the compositing (I’m aware of the Unity/Compiz bugs). I wish someone with better knowledge could shed some light on this, instead of just Phoronix articles showing me in plain numbers that I wasn’t imagining it. On the other hand, this could have changed during the time I’ve been away from Windows, since Linux and the drivers change from day to day. Moi!

Recent versions of the nVidia driver have XRandR support. I have no qualms with TwinView. Both are easy to use, with XRandR barely edging TwinView out for being “native” to the various Linux desktop environments.

If they could give us KMS that would be good, but unfortunately the DRI/DRM stack in the Linux kernel prohibits binary blobs despite them not really being in violation of the GPL in using the stack. An unfortunate example of FOSS politics getting in the way of functional software.

I have noticed this as well with my 5770. The drivers are now heaps better than two years ago when I wound up buying this card. Native performance though has never been an issue, although WINE seems to be optimized for nVidia hardware due to ATi’s pre-AMD performance on the development platforms. Oh, and the AMD blob has supported Xrandr better than the nVidia one.

The problem here is that many older (but capable) cards are now unsupported. And while the drivers have improved considerably, my spare X300 still has some problems compared to equivalent nVidia cards from around that time. The drivers are far from good, but much better than before.

I hope that Valve’s work does lead to improvements all around, since the problems with proprietary drivers on Linux are well known. If this leads to nVidia finally being at least slightly interested in fixing many known bugs, no one could possibly complain.

In the same vein, I run an Alienware M11x laptop for gaming on the go. It has an Nvidia Optimus setup, and Nvidia refuses to support Optimus graphics card switching on Linux. (That’s what got them cursed out by Torvalds.)

There is 3rd-party software out there that might work, but it would be nice to just have some actual driver support from Nvidia, which is sadly dropping the ball here.

Just to clarify… It’s my understanding that muxless Optimus requires some changes in Xorg and probably the kernel (for memory sharing). These changes appear to be in progress, so don’t throw in the towel yet.

Nvidia is refusing to implement Optimus just because they can’t do it without actually rewriting their drivers from scratch. The problem is that they use their own technology stack for all driver components, i.e. they are not using the “standard” GLX or DRI infrastructure (while ATI’s drivers do).

And the Intel drivers, which would be used to power the internal card, are strictly FOSS and strictly based on the “standard” infrastructure. So the two just can’t live together in the same process.

Well, some time ago they couldn’t even exist in the same kernel at the same time (the Nvidia drivers were conflicting over resources with the FOSS DRM modules), but nowadays that’s fixed (on Nvidia’s side too, BTW). So one problem is gone for good, and the other just can’t be “resolved”.

So there is the bumblebee project, which is essentially just a second X server for the nvidia card, existing specifically for OpenGL rendering purposes. That mimics the way Windows implements Optimus (it’s also software-only) and, I dare say, is much better around some corners.

So, actually, I don’t think anyone was expecting Nvidia to fork the Xorg server (as the bumblebee project did) and do all of this themselves.

They did the necessary work on their side (fixed the kernel modules) and let the community do the rest. And it works pretty well.

You shouldn’t expect any other “official” implementation.

And about Trollwards: well, he is right to some degree; NVidia doesn’t support FOSS as much as ATI and Intel do, but I guess we will see whose path was correct at the end of the day (hint: ATI’s ;)).

In the latest builds of Chrome you have to bypass their blacklist; they blacklisted all Optimus cards. But if you have it set up right through bumblebee, you will have full support for everything.

If anything goes wrong, the recommendation is to apt-get install --reinstall libgl1-mesa-glx to make Intel work properly, and then just use bumblebee.conf to point to the nvidia drivers (which it does by default).

If the Intel card still tries to load the nvidia modules (check /var/log/Xorg.0.log), then you have a symlink somewhere still pointing to nvidia. For Ubuntu, I had to:
sudo rm /usr/lib/xorg/modules/extensions
sudo ln -s /usr/lib/x86_64-linux-gnu/xorg/extra-modules /usr/lib/xorg/modules/extensions

IMHO bumblebee is already a decent project. I was able to play TF2 on my ASUS laptop with an Optimus card (GT 555M) on Linux via Wine (POL) + Bumblebee + the proprietary driver. Quality was a little worse than on the same machine under Win7, but it had no crashes (namely the “failed to lock vertex index buffer” error).
(Hopefully TF2 will be next after L4D2 :))

ATi seems to have dropped support for Linux on GPUs older than the HD 2400 (i.e. 9500/9600/x300 through x1900/x2100). However, if you have a Radeon 2400 or newer, you’re good to go (unless of course you’re an open-source purist, which, as an aside, there are practical, not just ideological, reasons for). nVidia on the other hand… yes, their drivers on Linux were the shit 5 to 10 years ago, but why does everyone still repeat that meme? Did they finally stop wrapping their Windows binary and packaging it for Linux?? Assholes…

You said you’re working with NVIDIA, AMD and Intel to improve their graphics driver’s performance on Linux. Is there any collaboration of this kind with the teams that make the open-source drivers too?

I think the best choice is to help the Nvidia/AMD open-source drivers, so the vendors will be more inclined to participate in the development of those drivers. I say this because a release of the Wayland graphics server will soon be available, which will probably substitute for X.org in some important Linux distros. As far as I know, the binary drivers from AMD and Nvidia don’t work under this new graphics server.

Please do give time to the open source Radeon ati driver. It’s been coming along in leaps and bounds, but needs attention! It has many advantages over the AMD fglrx driver in playing better with Linux (KMS for a start), but is still more flakey than is altogether comfortable.

They have been closing the performance gap to some extent, but the move to Gallium paid off for both Radeon and Nouveau drivers some time ago, as all of the Gallium drivers for that hardware (r300g, r600g, nv30, nv50, nvc0) support a wider feature set and have better performance than their classic driver counterparts ever did.

Gallium is just a common framework for common stuff, so the move to it is simply a good organizational and structural thing in terms of code. And yes, as ATI has at least 5 people working full-time on their open-source drivers (which are almost exclusively Gallium-based by now), everyone benefits, including nouveau.

Intel fell back to the classic infrastructure in their rush for OpenGL 3.0 support, but in the process they lost some feature support for older hardware, so I guess they will return to Gallium3D over time.

Anyway, Gallium is a very nice thing to have, as it's a good and increasingly mature infrastructure for building graphics drivers, which will surely attract graphics developers as more vendors (if any) decide to support Linux as well (and if Valve's plan of making Linux a gaming platform succeeds, that will surely happen).

With Gallium3D at hand, those new vendors will see more pros than cons in making FOSS drivers on the existing infrastructure rather than implementing everything from scratch. And everyone loves FOSS drivers, especially game developers!

Notice how Nouveau is not in that list you gave. This is due in no small part to Nouveau’s developers not giving two shits about hardware-accelerated OpenGL or Gallium support, something far more important than KMS, which basically only has the benefit of flicker-free console-to-X transitions and good framebuffers: two things the average Ubuntu user and gamer care nothing about. Whereas OpenGL is PRECISELY what Nouveau needs good official support for if it really wants to replace nVidia’s driver.

As for Wayland, it won’t be a serious alternative to X11 implementations until either the KMS developers pull their heads out of their asses and let binary blobs use it (there are zero licensing issues with it; they’re just being petty about binary blobs) or Wayland stops forcing the user to have KMS support. Until that happens, Wayland will never replace Xorg, because the hardware support in Wayland will always be about 5-10 years behind Xorg (or whatever the next reference implementation of X11 is), since GPU manufacturers can’t officially support it even if they wanted to. nVidia has announced at least twice that they won’t give Wayland any support, and that is thanks in no small part to the shortsightedness of KMS’s development and the requirement for KMS support.

The state of Linux video drivers is basically that you can have KMS, or you can have hardware OpenGL support that DOESN’T suck, but not both. If a gamer had to choose, they’d choose hardware OpenGL support that doesn’t suck, which Wayland will struggle with until the open-source drivers get loads better at OpenGL. Because Wayland requires both KMS support and good OpenGL support, something drivers rarely, if ever, have on Linux for a single video card.

We’ve been working with NVIDIA, AMD, and Intel to improve graphic driver performance on Linux. They have all been great to work with and have been very committed to having engineers on-site working with our engineers, carefully analyzing the data we see.

and

The work we did with the Intel engineers was on their open source driver. We have yet to collaborate with NVIDIA or AMD.

contradict each other…

– have you been working with NVIDIA and/or AMD on site?
– have NVIDIA and/or AMD been upgrading their closed source linux drivers?
– have NVIDIA and/or AMD been upgrading their open source linux drivers?
– since your test machine was NVIDIA and not AMD/Intel, how could the AMD/Intel guys have helped?
– probably a bit early, but: will the ported games automatically become available for people that already have them (on steam)? or will this be a separate purchase? will this be up to the vendor?

NVIDIA, AMD, and Intel have all worked with us on site and we expect further collaborative benefits from all vendors going forward. The work we did with NVIDIA and AMD concerned their proprietary drivers while the work with Intel concerned their open source drivers.
Regarding your test machine question, the test machine discussed in the post is just one example of the many we use for testing.

While I’d certainly appreciate you guys working with the open-source driver vendors for ATI/Nvidia, I can understand it taking a back burner for you. It’s important that you squeeze performance out of the best-performing candidate that is most likely to respond, which is probably the manufacturer’s own driver team.

Do Valve and/or the partner engineers have any best practices or tips you can share? The open source community may raise Cain over not releasing source, but what would be really useful for a lot of us developers is the general strategies and philosophies that go into making games more performant on Linux.

Information like what tools you’re using (or the general idea of the tools) and what to look for/where to look for it would help. Bonus points if it also helps us developers work on our other favorite linux gaming platform (android)!

You should really try getting S3TC texture compression support enabled on open source drivers.
My idea would be to talk with the patent holders, and if needed getting them onboard so the patent could be dropped.

force_s3tc_enable=true run_the_program
works for me when running StarCraft 2 on Linux, when it would otherwise not run. You could try it yourself to see if it makes stuff work. I suspect it’s disabled on my system (Intel HD3000 graphics) by default because it’s a tad glitchy.

“Apple caring.” That’s a laugh. Apple doesn’t really care about platforms other than their own. If you’re not Darwin or OS X, you’re getting very very little “caring” from Apple. Don’t count on Apple improving OpenGL/OpenCL driver support for Linux even for their own benefit.

I experienced a similar trend using the Unigine benchmarks. Strangely enough the fastest was OpenGL under Linux (I think it was Debian testing), then Direct3D under Windows 7, and then OpenGL under Windows 7. I cannot remember the exact numbers anymore, but I believe it was around a 10-20% speedup going from Windows 7 Direct3D to Linux OpenGL. I would have thought that most work would have gone into optimizing Direct3D, thus making it the fastest implementation of the engine, but I was mistaken.

I really like that you will improve Linux for everybody with better drivers, so have fun in Linux land and hurry up, I wanna shoot Zombies.

That is not surprising. All benchmarks I have ever seen show that DirectX outperforms OpenGL.

Also I note that this test was against an old version of Windows, but the latest version of Linux. Linux is always playing catch up to the Windows kernel and trying to implement features that Microsoft pioneers (for instance per processor cache lists) – so it is not surprising that a much newer Linux kernel might have closed the performance gap a bit. A fairer comparison would be against the current version of Windows.

> Also I note that this test was against an old version of Windows, but the latest version of Linux.
I could swear they’re saying the test was Ubuntu 12.04 32-bit vs. Windows 7 SP1 64-bit. That’s the newest version of Windows being compared to the newest version of Ubuntu.

(And if you were calling Windows 8 the “newest version” then that’s still not a fair statement because they didn’t use Ubuntu 12.10 either.)

> (for instance per processor cache lists)
I don’t know what you’re referring to, and it seems that neither does Google. Maybe this isn’t what it’s actually called?

> Linux is always playing catch up to the Windows kernel and trying to implement features that Microsoft pioneers
A) Nearly ALL products have things they do better/worse than the competition and so by definition are always “playing catch up.”
B) Your logic makes no sense. Microsoft contributes quite a bit to Linux – because they want you to have a good experience with Windows, even when it’s running as a virtual machine on a Linux host.

“Linux is always playing catch up to the Windows kernel and trying to implement features that Microsoft pioneers”

Oh, that’s a laugh. I could make a long, long LIST of things that Linux supported for YEARS before Microsoft even made a half-assed attempt at supporting them in Windows.

USB 2
USB 3
SMP
NUMA
Itanium
x86_64
EFI
UEFI
OpenFirmware

And OpenGL has had pretty much the same thing going on vs. DirectX, having implementations for things, even if only in extensions, for years before DirectX. Perhaps the most notable example was hardware tessellation: OpenGL had it 5 years before DirectX 11. DX 10 and earlier, for the record, did NOT have it.

Sorry, but the “Microsoft has been a pioneer” argument has been so thoroughly disproved. Someone else has ALWAYS done it first, at least 2 years before they did.

“Your logic makes no sense. Microsoft contributes quite a bit to Linux – because they want you to have a good experience with Windows, even when it’s running as a virtual machine on a Linux host.”

Microsoft contributes ONE THING to Linux (Hyper-V drivers, so people will use Linux under Windows), and only because they got busted violating the GPL, not because they think Linux is worth supporting. It’s about as far from supporting Linux for Linux’s sake as you can get.

This is great news! Would you be willing to share how well the Mac version performed in comparison to the Windows and Linux builds? I’ve heard Apple writes their own drivers, or writes them with heavy involvement from Nvidia/ATI… curious how they stack up and how much the OpenGL optimizations confer to all platforms.

This question might have PR implications, but do you get the impression that Nvidia et al are interested in working with other, smaller developers about driver-related issues?

Those are very interesting results. I recently did some testing with an amateurish benchmark I put together, and I got twice the performance on Windows 7 64-bit/OpenGL vs Ubuntu 12.04 64-bit using Nvidia binary drivers. The test uses SDL2 with OpenGL as the backend in both cases, and I was ready to bet the house that the difference in performance was in Nvidia’s blob. Your results indicate otherwise, so I think it’s time to delve deeper! Can you share which version of Nvidia’s drivers you are currently using?

An example of the first category would be changing our memory allocator to use more appropriate Linux functions. This was achieved by implementing the Source engine small block heap to work under Linux. The second category would include reducing overhead in calling OpenGL, and extending our renderer with new interfaces for better encapsulation of OpenGL and Direct3D.

Is there a possibility of going into some more technical specifics about these two categories? I imagine it’s too much to list all of it, but I would definitely find some exact examples quite interesting.

I’m interested in that as well. Especially since it seems everybody and their grandmother likes to implement a “faster” version of memory allocators in their applications instead of utilizing the thousands of man-hours spent on tuning the kernels VM. (See for instance https://queue.acm.org/detail.cfm?id=1814327)

The short answer to this is that system memory allocators are tuned for the general case. They have to handle both frequent and infrequent allocation/deallocation of large and small blocks without any serious degenerate cases and without seriously fragmenting the heap. They are tuned for performance yes, but reasonable performance for all kinds of workloads would be a better description.

An application-specific allocator (or allocators) knows the patterns it will be dealing with and so it can be tuned with those patterns in mind. This can lead to vast improvements and is why so many applications have their own special memory subsystems.

Custom allocators can also do things like track memory usage to report errors and provide performance metrics for the application.

That’s really great, I’m very excited. If I knew how to develop games, I would volunteer on this team. Keep it up! One thing I’d like to mention that I think might help; when I run my source games in wine, they are unplayable unless in windowed mode. I have intel mobile HD graphics 3000 and it and wine have been fighting for years. Glad to help out any way possible

In the past some of the better known Linux game ports ran slightly faster with OpenGL but sacrificed some features like reflection and bloom that their D3D renderers had. Is the Source port renderer on par feature wise?

The Linux version of Left 4 Dead 2 has all graphical features enabled. Obviously, there are still some bugs we are working on but overall, the stability and quality of the rendering is on par with the Windows version.

The 7850 is not unsupported. Download the binary blob from AMD Catalyst website and install it manually according to the instructions on the Ubuntu help page. If you have problems with underscanning/overscanning, the help page covers that as well.

Fine out of the box? If you’re running fglrx+compiz/gnome-shell and aren’t running into this, you need to tell the world your secret!

But on the whole I entirely agree. This is my second stint using AMD GPUs with Linux, and it is many times better than it was a couple of years ago. I’d definitely say AMD has caught up, and with some extra effort will overtake the competition on Linux if the likes of Nvidia aren’t careful.

I think you mean that little warning in the bottom right corner?
Well, if you get 3D acceleration: Just ignore it.

Some driver version 2 or 3 years ago added that warning for unsupported devices. At least in my case (HD3870X2), unsupported seems to mean “might work, but we do not guarantee it”, and the GPU produces some minor graphical glitches in Gnome3. I am not even sure if this is Gnome’s fault, as everything else works as nicely as with my integrated Intel HD3000.

BTW: I would also love to see better drivers, as Linux might get some serious gaming business in the future, which is the main audience for AMD’s and nVidia’s GPUs. But that might take a long while, I am afraid.

What desktop environment were your tests utilizing on the Ubuntu 12.04 32-bit test? The default Unity or Unity 2d created by Canonical, or perhaps gnome-shell, gnome fallback or KDE?

I’ve read recently that there have been many improvements added to trunk versions of Compiz (what Unity runs under) so I’m curious if that’s what you were running on your tests. Historically, Compiz has been a very poor window manager when you need to run OpenGL applications such as Blender, AutoDesk or 3d accelerated video games. Have you guys run into issues with this?

Personally, I love Ubuntu and its Unity desktop, but I’m concerned about Valve’s video game performance, as Unity 2d will be going away next release when llvmpipe lands (which lets you use a compositor without actually having a GPU available).

We are developing and testing on a variety of desktop environments with Unity being the most common. In addition, we are working with others to make sure that the Steam client and L4D2 run well in all environments. So far, we haven’t noticed any specific issues.

With these kinds of performance differences, one wonders if Valve will change its future development scheme to include a Direct3D/OpenGL drop-down (such as was available in the first Half-Life title) on Windows systems. Also coming to mind is: does Valve intend to do any work with D3D10/11 or OpenGL 4.x in the near future?

This also raises the question of a potential switch to OpenGL for foreseeable future development, because of its wider support (e.g. Windows XP support for OpenGL 4.2), whereas Direct3D sheds its support for “legacy” systems.

I’m curious as to why they’re continuing to support Direct3D if OpenGL provides better performance consistently across all of their platforms. Do they see better performance happening if they optimize Direct3D for Windows?

I suppose the Xbox is basically the only reason to support D3D (that I can see); perhaps a wrapper like GLDirect would be more economical?

It would definitely be a performance hit, but judging by the comparative performance of wine’s D3D wrapper, probably not that much of one, and it would certainly be easier to translate the open source calls into the closed API than to try to convert D3D into OpenGL without good documentation of the D3D calls.

Ubuntu has had an “updated” alternative package for the Nvidia driver for a while now. The best approach would be to suggest switching to that and run the apt-get command when the user agrees. Not sure off the top of my head if the AMD driver has a package like that.

It could integrate through the package manager while operating through Steam. Valve (with support from driver maintainers) could have their own repository of .deb driver packages which it could install into sources.list.d upon running the client. It could then manage upgrades this way or allow you to upgrade through your distro’s package manager.

That would be nice. You have the maintenance issue of providing repositories for all the major versions of the major distributions (don’t forget, it’s not just Ubuntu; you’ve got Fedora too), but I guess there are tools to repackage packages between apt, yum, etc.
nVidia are quite good with their binary drivers; I haven’t had an issue installing their newest drivers for years. IIRC they’re quite backwards-compatible with older versions of X. Hopefully ATI has followed/will follow suit, and Intel are being nice and open source already.

Additionally, from my experience I’ve noticed that a lot of binary blob drivers require a specific set of xorg versions and may not work nicely (or at all) with newer/older versions of xorg. So if you decide to upgrade the drivers outside the package manager, it might break Xorg.

Very ugly in my opinion. I’m just wondering if Wayland fixes this problem.

Not only that, some drivers require you to manually suspend X and install the driver from a shell; I know I had to with my GTX 570. It is not that any given step is “hard”, it’s just that learning the process is more trouble than most gamers are willing to go through, and having to do it each time you try a new distro, ew.

Is valve in a position to twist Nvidia’s arm into providing decent open source drivers? Who knows, but if they did, it would be a massive blow against windows.

Wayland will fix this. Once the ABI is out, it will remain stable and dependable between minor versions, as will KMS support.

Revving versions of Wayland is much easier than Xorg currently. But the problem is academic, since Nvidia has said it’s not going to release a Wayland driver, leaving us stuck with either the OSS or Intel/ATI drivers.

Yeah, I had that happen to me with the ATI proprietary drivers. As I have an HD4770, its support was dropped with Catalyst 12.6; if I update xorg it will break X, leaving me with either reverting to an older version of xorg or going the open source driver route, which isn’t remotely up to par in terms of 3D performance.

However, this can be resolved like on Windows, by checking the xorg version and installing the appropriate driver if necessary.

In the Linux case I still think this should be handled by the distro’s package manager. Some of them do it already, like Ubuntu: if you choose to install the binary drivers, it also keeps them updated, if I’m not mistaken.

I wish AMD would step up in delivering fixes and the missing features to their Linux drivers, both the proprietary and the open source one.

Actually competent package management fixes the problem. I run Debian unstable with Nvidia proprietary drivers from Debian’s own repositories. And I have the latest and greatest kernel, drivers and xorg pretty much all the time.

Not that I’m suggesting you should all run Debian unstable, but my point is more that you don’t have to throw away everything to get something that works.

Wayland “fixes” it in the sense that open source drivers won’t need it. But since no proprietary video drivers can use KMS, for political/imaginary licensing reasons, Wayland will never see the sort of solid support or stability of Xorg.

Ever.

Sorry, but Wayland is just the latest of a long series of systems that tried to replace Xorg but died horrible deaths because no hardware manufacturers would touch support for them. Bringing up Wayland as an alternative to Xorg would be taken a lot more seriously if KMS support was simply a FEATURE and not a REQUIREMENT of the system.

It would be moderately straightforward for Valve to maintain at least Ubuntu packages for up to date Nvidia and AMD drivers and to have Steam install them. Or they could try to get the vendors to provide package repositories themselves.

Indeed. The package management software is really important because of the dependencies; they don’t work the same way as in Microsoft Windows. I don’t think that using the same approach as the Windows client will work.

I suppose the best way would be for Canonical to provide more frequent updates of the different graphics drivers via the package manager. Steam, on the other hand, could check the running version and warn if it doesn’t match the requirements, saying that the drivers need to be updated, with a link to a Canonical help webpage on how to use the package manager (for the ones who really don’t know how to do it).

You can add a PPA repository to get more up-to-date Nvidia and ATI drivers. Packages from PPAs are managed by the package manager. The first one below contains stable upstream drivers, and the second fresh drivers compiled from source and updated regularly. The latter should be used with caution as in some cases it updates the x.org server as well as other system libraries.

This is because you often have to upgrade your whole Xorg stack in order to update the graphics drivers. And you really don’t want to do that all that often on a stable release, since it often causes systems that were running just fine yesterday to stop working for what appears to the user to be no reason.

Actually, this was mentioned in the last UOA. They will be shifting focus towards providing newer kernels and drivers in LTS versions (so, for example, 12.04 will get newer kernels; 12.10 will not, the suggestion being that you just move to 13.04 once it’s released).

Awesome stuff! How easily will you be able to apply the work you did to other games that use the source engine?

Hopefully you guys pull an Apple and make everybody else follow in your footsteps, there are some upcoming games that I can’t wait to play but at the same time I’m getting very sick of using Windows…lol.

Will any testing be done on the notebook side of things? I know it’s not as common, but with the powerful laptops out there, as well as laptops with AMD’s Llano and Trinity APUs, some people have been opting for them over a traditional desktop — my girlfriend is one of them, for that matter.

Both the Steam client and L4D2 have been tested on a few laptops we have around the office but the current focus is on desktop configurations. As we get further along, we will develop a more comprehensive hardware test matrix.

As a company whose products literally drive sales and development of graphics technologies, you can attract engineering support from GPU companies, and I think it’s awesome that you have decided to use your influence to drive effort towards gaming on Linux… But it raises the question of what a smaller, less-influential developer can do if they want to ship a product on Linux. Linux as a gaming platform will substantially benefit from native support for AAA games, but for it to be a viable gaming platform long-term, indies need to be able to easily ship content for it as well, and those are the guys with limited resources to be able to manage separate code paths for different platforms. Being able to offer more than just Valve games on Steam For Linux is gonna be a huge business factor, as I am sure you are incredibly well aware.

Indie devs are actually surprisingly good about handling multi-platform releases. Take every game that’s been released from the Humble Indie Bundles, for example. Plus, with a heavy-hitter like Valve encouraging indie studios to be multi-platform, and possibly providing some support or assistance (even in terms of just advancing the available tools to do so), things will certainly improve.

Very interesting; thank you for sharing your progress with us. These are very exciting times. Some of us have been waiting for the rise of Linux gaming for a decade or so.

I was surprised by your statement about working with Nvidia because I was under the impression that they were not particularly interested in high quality driver support for Linux considering recent events regarding Linus, open source, and Optimus support on Linux. That actually brings me to my question:

Do you plan to do much testing on mobile GPUs? I’m sure many people would be interested in seeing how the Intel HD4000 and the Nvidia 680M compare to their desktop counterparts.

I don’t find the GPU maker’s response surprising at all. Valve products drive GPU sales, something that all the open source music players and text editors in the world will never do.

I think this is the first time there has really been a major player specifically pushing graphics performance in Linux, and the numbers they’ve achieved make it clear the platform is first class for gaming when given the chance and attention. Anyone who doubted Linux can hack it in graphics performance has been proven emphatically wrong, and this in the process of overcoming one of the largest barriers to adoption.

This is not the first time; there have also always been the 3D/VFX industry (most major companies in that field run on Linux) and some engineering and science fields, otherwise there might have been no nvidia/fglrx drivers at all. But of course Valve’s requirements are probably somewhat different.

Awesome! We recently did some performance tests between Windows 8 (DirectX) and Arch Linux (OpenGL) with similar results. My ATi HD 4870 is a different story, sadly. However, I hear things are looking up for AMD and Linux with their newer cards.

I’ve been a long-time player of CS, and can’t wait for CS:GO *hint hint*. It’s great to see such success with a large game project, and it’s encouraging to me as an upstart game developer with an eye on Linux.

I’m really hoping this results in improvements to the open source drivers on Linux. As an important client, you should encourage AMD and especially Nvidia to collaborate more with the open source driver developers to benefit the maximum number of potential customers. Considering that open drivers are used by default in many Linux distros and tend to be more compatible with the latest Linux packages, focusing on the open drivers will lead to a better experience on Linux, which would pull in potential customers.

Ideally, there should be no closed drivers, and GPU manufacturers should work on the open drivers full-time. Not because closed software is evil, but because it would be easier to maintain and provide a better experience for the end users. Instead of splitting people’s efforts, they could be pooled together, and interested vendors like yourselves would find it easier to contribute.

Well, hardware manufacturers often don’t want to provide too much direct assistance to driver development, as a driver stack that is closely written to the hardware often can give away things about the logic of the hardware itself. Hardware companies aren’t avoiding working on open-source drivers because they don’t like open-source software (they probably love it as much as everyone else!), they avoid open-source drivers because they don’t want to accidentally give away any of their core business secrets to their competitors.

I know the reasoning, but I feel like it’s naive. Nvidia’s policies haven’t prevented people from reverse-engineering the hardware to develop open drivers. What it has done is hurt their image, their business, and their customers.

Thing is, AMD and NVIDIA regard their drivers as just as much a product and an object of competition as their graphics cards.
If company A makes a faster graphics card than company B, software can still run faster on company B’s hardware if company B’s drivers are better than company A’s.
This makes drivers business secrets.

It’s been a well established fact for at least a decade now that any well written native linux application (including games) will outperform the well written windows counterpart, and vice versa. This is no surprise. I am glad to see a name as powerful as Valve confirm this though, and present actual data to back it up.

It doesn’t. Windows is a bloated OS; performance shouldn’t be your reason for using Windows, as proven by this blog: very little time has been spent on this Linux port, yet it’s already faster than the Windows counterpart.

Perhaps because Windows has much more hardware support, so hardware is much more optimized and works across various pieces of hardware. However, this blog post shows that Linux, when more effort and hardware vendor support are added, can be as powerful as Windows, if not more so. In my opinion, it is almost as if Linux has latent power that can be “unleashed” if hardware vendors were to contribute to making their hardware more optimized for Linux and release the appropriate data to make various parts of Linux work better, especially drivers. Even OpenGL, while arguably not as good as Direct3D, is lacking perhaps because it does not enjoy the degree of support from hardware vendors and game developers that Direct3D does.

What Valve is doing in getting nVidia, AMD, and Intel to help out is significant. This is just a start. And Valve has helped get big-name game developers to support Linux.

Great news and I really appreciate all the work you do.
But I highly disagree with you when it comes to the open source (at least ATI) drivers: my opinion is that they are in a shape which at least makes it possible to play games with them. I use the latest stable radeon driver here (6.14) together with Mesa 8.0.4 with Gallium enabled, and I can play the following games:
– Unreal Tournament 2004 + 99 (native)
– Xonotic (native)
– Neverwinter Nights (native)
– Amnesia (native)
– Call of Duty 4 (wine)
– Dragon Age 2 (wine, did not work with the proprietary drivers when I tested it directly after release)

I have even observed that the open source radeon drivers work better with wine than the closed source fglrx drivers. Also, ATI’s support is really, really bad: most of the time they don’t even support the latest stable Xorg version, and bug reports are more or less ignored.

The only drawback I have with the open source drivers is the power management (which only works manually on my 4870, but it works, and my card is silent if I want it to be).

So at least here I’m really happy with the open source drivers even (or especially) when it comes to gaming.

What about image quality? Is it on par with Windows and Direct3D? Open source drivers on Linux are a lot behind when it comes to OpenGL feature support if I recall, and closed source drivers are much better, at least for the Nvidia chips that I tend to buy. Can we expect the same level of detail and eye candy under Linux when running on Intel/Nvidia/AMD chips?

It will be interesting to see how performance compares between Windows and Linux at lower resolutions or on weaker systems. Especially when playing at just around 25 FPS on Windows, getting an extra 5 or so FPS on Linux would greatly improve the feel of the game.

That’s true, Tim.
Gaming has always been the last drawback of Linux, and this could be an important milestone for this operating system, as it could demonstrate that Linux is a better game development platform than Windows, even with DirectX.

By the way, you are doing an excellent job at porting this game. Keep on please!

Leaving aside that X will eventually be dropped completely, it seems crazy that they wouldn’t support a new technology that should both increase performance, and make their lives easier in terms of version updates!

X isn’t going away at any time in the foreseeable future. Wayland was never intended or expected to make X11 be dropped from distributions; it’s only intended to replace X11 as the center of the OS’s display management and make X11 a legacy component.

Just think of all the (proprietary) games that have been released on Linux so far. They all use X11. Dropping X11 completely would break compatibility with all of them.

Why? With compositing off, you care only about OpenGL performance; in games you don’t care about X or Wayland, at least not directly.
Wayland will be better than X if it’s easier to write support for it and drivers come out faster; that’s probably the only thing connecting it with gaming.

This is true for windowing apps, or apps which draw through some sort of GUI-library code (like GDI on Windows). Games utilizing OpenGL usually use direct rendering as well, so they bypass Xorg pretty much completely.

I gotta discuss semantics here. GLX is just there to bring GL to X, hence the name. What makes OpenGL fast even on something as slow as X (which is still pretty fast considering what it can do) is direct rendering, where the program bypasses the whole X protocol and communicates directly with the card.

Now, as to why X is considered ‘slow’ (faster than Windows, even if barely): X is a suite of client libraries and a server. The server holds the actual display and is what gets rendered to. The applications are clients that send render commands either via Unix sockets or TCP. So yes, taking this further, you can start a remote application and have it render to your X server, and if you can do without direct rendering, you could do the same for most games as well.

Wayland should improve maintainability (with a seriously smaller codebase) and reactivity (with fewer context switches). I don’t know of planned improvements to throughput, which as I understand it is in the hands of the graphics drivers. X.org and Wayland have about the same small overhead when compositing, and no overhead with full-screen unredirected windows.

The stability of Wayland will depend on how mature it is. The motto of Wayland is “every frame is perfect”, so the goal is clear. I’m also curious how much getting away from X.Org would boost performance. There is already a working version of Wayland with the Weston compositor, and it runs an open version of Doom 3 rather well.

Another interesting question is what impact the desktop environment has on game performance. You could easily boot straight into Steam by changing your .xinitrc to “exec steam”, so startx launches Steam as the only graphical application, without any resource-hogging desktop whatsoever.

Somewhat strange question.
There is nothing stopping you; you can already do it with any graphical application.
Technically it’s not a question for Valve; Valve has nothing to do with it. But I understand your concern.

Some preloader, for example GDE, can do it (it’s nice and graphical). The machine starts, and it offers a choice of which environment to load. I am guessing it’s a trivial modification.

Xorg isn’t a big bottleneck in the Linux graphics stack (at least not when it comes to OpenGL, which uses direct rendering anyway); the issue is just that it’s really old and has collected a large amount of cruft along the way. That’s not to say Xorg is bad; it’s quite remarkable what they’ve been able to do with a really old protocol, but the signs point to it nearing the end of its life.

No, the X server is not a bottleneck on a (modern) linux system, and Wayland will not improve performance by a meaningful amount in any non-trivial scenarios.

In terms of 3D gaming, neither the X server nor Wayland is involved in the bulk of the work: mainly, the application uses the 3D libraries (Mesa), which talk to the graphics card drivers and render a frame that is then stored in video card memory. Where X/Wayland come in is in getting that frame displayed. Both (modern) X and Wayland do this the same way: they update a pointer that says which video memory the card should output, setting it to the location of the frame. This process takes essentially no time.

(This has assumed that the application you’re running is full screen. If it’s not, X or the X compositor, or the Wayland compositor, has to do one copy of the buffer to place it in the final image.)

With all that said, I am very interested to know how X-specific the code is and whether it will be ported to Wayland in the near future.

The problem with supporting this kind of application on Wayland is that you’re really going to have to wait for AMD and nVidia to port their drivers to the new architecture (if they even do it). OpenGL support is still sub-par in Mesa and 3D performance in the open source drivers still leaves a lot to be desired.

Native OpenGL in Wayland, too, is limited to the GLES variants, since supporting full OpenGL would require bringing in GLX, which is of course tied strongly to Xorg (Wayland currently uses EGL to bring up an OpenGL context).

As far as any porting to Wayland goes, windowing and context creation are a relatively small bit of code, so the port wouldn’t be overly involved. I’m not sure what the input story looks like under Wayland, but I wouldn’t expect that to raise any serious issues either by the time any porting might happen.

But for now, considering all this and that Wayland is still a way off from shipping en masse, we’re likely to be using Xorg for a while yet.

The cries of doom for Xorg are greatly exaggerated. Wayland is a very young project, and will not meaningfully displace Xorg for many years. Additionally, the speed at which it’s adopted will depend totally on the strength of its support for the X11 protocol.

Lastly, as others have explained, the areas in which it improves performance are not relevant to anything we know about video game technology at present.

Wayland doesn’t support the X11 protocol so much as allow you to run an X server seamlessly underneath. This migration path means that support for legacy applications should largely be a non-issue for Wayland.

Otherwise, though, I very much agree, although hopefully “many years” turns out to be only a few.

So am I right in thinking you have a DirectX renderer, where your code catches the specific DirectX calls and converts them to the closest native OpenGL calls?

Now you have worked with the hardware vendors to figure out how to call OpenGL a bit better and you are seeing improvements. Since calling DirectX via OpenGL on Windows is also giving you speed improvements, are you going to back-port some of this into the DirectX renderer to make that faster?

Also, wouldn’t it have been (or be) wise to create an abstract renderer with calls on it, getting the correct renderer from a factory and calling OpenGL directly? Or is Source too coupled and this was the simplest way to do it?

We use a modified abstraction layer (based on the original Mac OS X work) that translates Direct3D calls to the proper OpenGL calls. This layer has received the most work but changes have also been made above this layer that resulted in improved performance.
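The factory-style design the question describes could be sketched roughly like this; all names here are illustrative, not Source engine API, and the real translation layer is of course far more involved:

```cpp
#include <memory>
#include <string>

// Hypothetical backend-neutral renderer interface: the engine issues
// draw calls against this, never against D3D or GL directly.
class IRenderer {
public:
    virtual ~IRenderer() = default;
    virtual std::string Name() const = 0;
    virtual void DrawIndexed(int indexCount) = 0;
};

class D3DRenderer : public IRenderer {
public:
    std::string Name() const override { return "Direct3D"; }
    void DrawIndexed(int indexCount) override {
        (void)indexCount; // d3dDevice->DrawIndexedPrimitive(...) would go here
    }
};

class GLRenderer : public IRenderer {
public:
    std::string Name() const override { return "OpenGL"; }
    void DrawIndexed(int indexCount) override {
        (void)indexCount; // glDrawElements(...) would go here
    }
};

enum class Backend { Direct3D, OpenGL };

// Factory: the game picks a backend once at startup, and the rest of
// the engine stays backend-agnostic.
std::unique_ptr<IRenderer> CreateRenderer(Backend b) {
    if (b == Backend::OpenGL)
        return std::make_unique<GLRenderer>();
    return std::make_unique<D3DRenderer>();
}
```

The alternative Valve describes, translating Direct3D calls to OpenGL underneath the existing interface, avoids rewriting all the call sites above that layer.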

Forgive my ignorance, but isn’t getting device driver teams in-house for a bit usually only done, when coding games, when there is a bug that can’t otherwise be found?

Is there a lot of “this is how it actually works at the hardware level, and if you tweak x, y, z you should see performance gains”, thus helping everyone on the engine team and maybe even helping PS3 development (for the OpenGL)?

How open do you hope to make the work you’ve done? Obviously you are contributing to the drivers by working alongside AMD when doing your testing, but are you hoping to give back more by sharing what you’ve learnt?

Hopefully other developers and development teams of other games can look at what you’ve achieved and take it further still; gaming on Linux is something I’ve dreamt about, and while UT and Quake can keep me happy, I dream of more!

Also: how much of the work on Ubuntu is portable? Please don’t just release Ubuntu packages, but try to aim for pure binaries at least (I know we can’t expect an open source Steam anytime soon :))

They didn’t lose that contract because their Linux drivers were poor, but because the drivers didn’t exist for the architecture.
And Linus dislikes Nvidia because of the way they behave as a company with regard to the Linux kernel team; Linus made no comment about the quality of the proprietary desktop graphics driver.

Why?
The nVidia proprietary driver is not available for MIPS, and they don’t support nouveau officially. The AMD proprietary driver is not available for MIPS either, but AMD has FOSS drivers that work on MIPS, and a developer team working on them, so AMD is able to provide GPUs for MIPS-based PCs.

It wasn’t because their drivers are terrible (even though they are). They lost the deal because their drivers are closed source, and Nvidia is notorious for not prioritizing their development efforts on Linux.

You can’t really compare a 32-bit OS vs. a 64-bit OS when you have 32 GB of RAM.
A 32-bit OS can use less than 4 GB of RAM (unless it runs a PAE kernel), while a 64-bit OS can use much more than 32 GB…
It would be more interesting to see a 64-bit vs. 64-bit comparison in the later stages of development.

Consider, though, that L4D2 is likely a 32-bit executable even on Windows, and thus cannot use the full 32 gigs of RAM on the system anyway. Furthermore, I find it highly doubtful that L4D2 would _EVER_, under any circumstance, want to use more than about 2-3 gigs of RAM. Being the simple shooter that it is, the amount of RAM in the system, as long as it’s over four gigs, is unlikely to affect the tests in any meaningful way.

This piece of misinformation was invented by Microsoft. You can very well compare.

Windows 64-bit is a mixed 32-bit and 64-bit runtime environment. Source games run as 32-bit non-native executables under 64-bit Windows.

Porting software to native 64-bit is often error-prone, and I understand, if Valve is to undertake it, that this effort may not be complete yet. Running 32-bit software under 64-bit Linux is possible, but often difficult because of issues with mixed runtime support, which I think is incomplete in Debian (and Ubuntu) based distros. It’s subject to distro bugs. Development and debugging are even more difficult than merely getting the software to run, again because of distro bugs.

32-bit Linux, as opposed to Windows, can address any amount of RAM that a modern computer can be built with, well beyond 4 gigabytes. I believe configurations up to 64 GB are known to work, and it’s generally believed that the limit is much higher. This is accomplished by utilizing PAE, “Physical Address Extensions”, which requires extra care on the part of drivers to allow them to manage vast amounts of physical memory. Every program is still limited to a 4 GB address space, but hardware mappings and the total sum of all programs, kernel data and cache are not limited.

According to Mark Russinovich from Microsoft, the reason why Windows limits 32-bit versions to 4GB physical memory, in spite of them also supporting PAE, is that legacy drivers need to keep working, which are known to crash when subjected to higher physical addresses than 4GB. This limitation does not exist in Linux.

On Linux, benchmarks have shown 32-bit executables to typically behave at exact same performance between PAE-disabled, PAE-enabled, and true 64-bit configurations, unless there is memory contention, in which case PAE-disabled configuration loses. PAE-disabled configuration is no longer available in recent Ubuntu, and PAE had been enabled by default for a long time now. Native 64-bit executables have been shown to perform often either significantly better or significantly worse than 32-bit executables.

Thus for the purpose of this comparison, it is absolutely sufficient and correct to run it under 32-bit PAE-enabled Linux in comparison with a 32-bit executable on 64-bit Windows, and that may in the meantime be the only way. Further along the road, extra performance gain from 64-bit Linux support is possible – a 64-bit compiler on Linux compiles C++ code more efficiently due to a different ABI standard.

Support for installing & running 32-bit applications in 64-bit Ubuntu 12.04 is actually pretty good. Of course, for applications that are not in the repositories, you will either need applications packaged well so that they refer to the right dependencies, or you will have to install the necessary dependencies yourself, but that’s not any different from installing 64-bit packages.

(And yes, not all applications & libraries have been converted to multiarch, but the most “popular” ones usually are.)

Yes, you can — in fact, the great improvement of the 64-bit extensions to the x86 architecture is not so much the 64-bitness, but the increased number of registers, which means less storing temporary values in memory, which means speedups. (Just as a hard drive is much slower than RAM, RAM is much slower than direct register access.)

Other architectures are a different beast, of course — on PowerPC, for example, the jump to 64 bits didn’t increase the number of registers, so ppc64 applications, unless they work with gigantic numbers, are often slower than their ppc32 counterparts (since there are larger numbers to contend with and nothing to compensate).

Nouveau + Ubuntu 12.04 works totally fine for me. It only fails with games that demand OpenGL 3.1+ features (in which case you’re better off going with the blob). I have played many native Linux games and also Windows games via Wine, and Nouveau performed damn well. Maybe you are using a very old version.

Now I’ll start by saying I know very little about coding, so please do not flame me for asking this:

As Android is – for the most part – a distro of Linux, why is it relatively easy for developers to make great-looking games (take the MadFinger games as an example) on numerous handheld devices without many problems at all, yet this is something that desktop builds of Linux have struggled with for decades? What makes it so different? Can you (Valve) learn something from the developers of Android games that would help you with the process of making Linux a fully working alternative to Windows?

Have Valve considered the idea of perhaps going a Chromium-style route and having a Linux gaming OS – a Steam OS, if you will? Would this not enable you to have full control over the entire kernel etc., ensuring gaming performance would always be optimal? (Long shot, but it would be nice to have an insight into ideas that have been considered.)

Given that a lot of Linux users are willing to reboot to a different OS environment today to run Steam, and that it wasn’t all that long ago that gamers often had a collection of fine-tuned DOS environments they used for specific games, this is not that much to ask from the user. Given the choice between maintaining a Windows installation I use only to run Steam, and a second Linux installation I use only to run Steam, the decision would be easy. Linux is easy for me to configure and maintain, and Windows is a constant hassle.

Of course, maintaining a Linux distribution is no small task, and Valve already have substantial business and technical commitments to their game projects and to the Steam platform. I would hope that shipping an entire distribution is not the path of least resistance to bringing Steam to Linux users, but would not find the situation untenable if that’s how it worked out. It’s also entirely conceivable that such a distribution could be easily maintained as a minor modification to one major distribution and the community could take care of it with no assistance from Valve.

The reason there aren’t great-looking games on Linux, while Android has some, is purely that large professional game companies don’t make games for Linux, it being a small market. Android is a large market and so worth the investment from large companies.
The reason for this is a chicken-and-egg scenario: Linux is made less popular by the lack of games. Valve realises this and has decided to do something about it.
Clearly this is motivated in no small part by the fact that Gabe Newell actively dislikes where Windows is going; Valve doing what it can to make Linux a better place for gamers is a great way to stick it to Microsoft.

1) Android has nothing in common with GNU/Linux distros except the Linux kernel. They are not compatible. Also, there are more OSes that use the Linux kernel. Google it.
2) Android runs everything in a VM, hence the good compatibility and poor performance.
3) It’s a phone OS that has nothing to do with Valve’s target market.

It is not that much of a surprise to me. I have been a gamer on Linux for a long time, and I’m very pleased by the direction that Valve has taken.

This phenomenon has already been experienced by many gamers, even with Windows versions of games. Wine, a Windows API implementation for Linux, is used by many gamers to play Windows games, even if it is sometimes buggy (it is built by reverse engineering) and less performant with DirectX games (DirectX calls have to be translated to OpenGL ones).

But OpenGL games tend to be faster, even the Windows versions of them. That has been the case for Q3, or WoW, to mention some very well known titles. Good to see a game company noticing that Linux is a suitable gaming platform. Good luck!

I am incredibly happy with these numbers. What is more interesting in this discussion than Win vs Linux is that you are working with hardware vendors. This just may wind up being the first of the major nails in the windows coffin.

The reason? The #1 problem I faced EVERY time I installed linux was video card drivers. Pretty much that was the main reason why laptops/desktops were compatible/incompatible with linux. The #2 was sound. And I am pretty sure that this is true for many others.

If you work with Nvidia/ATI/Intel to ensure that linux drivers are on the same quality in all aspects as the windows counterparts, I foresee linux gaining traction rather quickly.

Stallman said that you introducing DRM to a free OS is bad. I think you working with video card drivers is good, and if you were to convince the vendors to open source their drivers so the Linux community can finally get a crack at them, the benefits would go through the roof. I would not be surprised if Linux open source video drivers quickly began to surpass Windows in performance by leaps and bounds.

In any case, congratulations on the port. I wish more companies would spend time ensuring their engine works on all operating systems, THAT is truly the revolution.

I think what I am most interested in is the indirect benefits you are bringing to the Linux platform by porting your games onto it. The post discusses having to optimize the graphics drivers for the game, and having to work with the manufacturers to do it.

Have you found the process easy at all? Are they giving you a hard time? It was not too long ago where there was discussion about nVidia not really giving a hoot about Linux, but here you guys come with your fancy pants games, making everything better again.

Would love to hear about the process on working with these manufacturers!

I have noticed when I run my benchmarks that the default desktop Ubuntu puts me in runs slower than Windows. I have to switch from Unity to Xfce and turn off Compiz to get my benchmarks running faster than Windows. When doing your benchmarks, what desktop environment were you using, and what settings did you change from Ubuntu’s defaults? I’m asking because publishing an article like this may let people down when their system runs slower than Windows just because Ubuntu’s defaults aren’t the best.

Never mind the OSS purists who moan about proprietary binaries, please. They are a small but vocal minority of the penguins.

I think I speak for the large complacent mass of users who don’t give a damn about that when I say that we much approve of your work and anticipate the outcome.

I do not really wish to bash the purists – they are important and the reason why Linux exists in the first place – but when a commercial company develops commercial products using free-as-in-beer drivers, it is a bit retarded to bitch about the licensing. Looking the given horse in the mouth, and such.

Hooooooold on there cowpoke… Just because A can be B doesn’t mean that B is exclusive to A.

SOPA and PIPA were bad policies that the MAJORITY of people disagreed with… It wasn’t stopped simply because a few vocal FOSS purists said something… Hell, the BIGGEST voices against it were for profit companies (see Google)

Given the wide variety of hardware that Linux supports, what OpenGL implementation(s) are you guys looking to target (i.e. OpenGL 2.0 and/or 3.0, or possibly 4.0 when supported by Mesa3D)?

I’ve found that “gDebugger” is a great GL profiler for my projects, have you guys used this tool? Would you recommend any other tools for performance tuning?

Having worked on numerous commercial Direct3D 9 games over the last few years and then returned to OpenGL for a side project, I found compatibility amongst different vendors to be a challenge. Have you guys had to work around differences between hardware vendors (i.e. shader compilers being loose/strict, etc.)?

Will Intel video cards be supported in the Linux release, or is it strictly targeting NVidia/AMD video cards?

A couple of observations from working with OpenGL (not necessary to publish, but they may help everyone):
-Pack Vertex Buffer Object (VBO) data tightly to remove the need for the VBO sub-binds caused by separating each sub-array
-Some vendors (i.e. NVidia) suggest using 32-byte vertex structures for optimal performance
-Batch render all VBO/IBO data
-The most obvious optimization for OpenGL is to minimize state changes (I found using the Observer pattern to control state changes helped immensely)
-Use “gDebugger”; its performance tools are amazing and comparable to PIX (imho)
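Two of the tips above can be sketched in C++. The struct layout and the redundant-state filter are illustrative only, not taken from any particular engine; the packed-normal and color formats are assumptions:

```cpp
#include <cstdint>

// Tip 2: a tightly packed 32-byte vertex, interleaved in a single VBO:
// position, a packed normal, one UV set, a color, and explicit padding.
struct PackedVertex {
    float    pos[3];   // 12 bytes
    uint32_t normal;   //  4 bytes (e.g. 10/10/10/2 packed)
    float    uv[2];    //  8 bytes
    uint32_t color;    //  4 bytes (RGBA8)
    uint32_t pad;      //  4 bytes, rounds the struct up to 32
};
static_assert(sizeof(PackedVertex) == 32, "vertex must stay 32 bytes");

// Tip 4: a minimal redundant-state filter. A real GL call is only
// issued when the cached value actually differs from the request.
class StateCache {
public:
    int applied = 0;   // counts the GL calls we would actually make
    void BindTexture(unsigned tex) {
        if (tex == boundTexture_) return;  // redundant bind, skip it
        boundTexture_ = tex;
        ++applied;                         // glBindTexture(...) would go here
    }
private:
    unsigned boundTexture_ = 0;
};
```

Binding the same texture twice in a row then costs one GL call, not two; the same caching idea extends to shaders, blend state, and so on.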

I’ve used gDebugger some, and while it’s helpful for some things, it’s rather prone to crashing and seems a little outdated. Hopefully with more interest we will get a proper open source GPU debugger on the level of gdb or gprof.

Suggestion: I know Wine is not and won’t be your target platform, but it would be really interesting to try your benchmarks on it for comparison – the L4D2 Windows version running on the latest Wine on Ubuntu 12.04.

For some games it’s pretty close to the Windows version in FPS, and I’ve seen the network latency even better. Would make an interesting addition to your benchmark suite.

It performed better because Wine lacks (or lacked; not sure if it still does) certain graphical enhancements, like anti-aliasing and anisotropic filtering. So yes, in some cases, depending on the engine, a game will run better with Wine.

The reason being that the free (open source) drivers use Mesa (now known as Gallium). Now, as I am sure you are aware, Gallium does not implement a full GL stack, since it cannot due to patents on OpenGL extensions like S3TC and the like.

I think the question is: will Valve perhaps help get these patents invalidated, since really OpenGL should be just that, OPEN.

Well, as long as you’re adding people to lists, I might as well throw my name out there and pray. I’ve got multiple hardware setups, ranging from a Macbook Pro 5,1 to a fairly run-of-the-mill gaming desktop to an old desktop which has components that are about four to six years old. I’d be more than willing to switch my distro from Debian testing to Ubuntu to give you guys a hand in this.

I play a minimum of 4 hours per day (not currently in Steam :D) and some days I reach 6 or even 8 hours per day. I have the will and the time to burn to help as much as possible!!! (Sundays I can play even more hours!)

My player name (not account name) is AJSB (account name is same as before the @ in email provided in this message)

My Avatar is a Skull with a white background….

I have almost 900 hours played online in ETQW, more than 600 hours in BFBC2 online and more than 200 hours in Black Ops (that one is on Steam :D)

Are there plans for an opt-in beta? I’d be keen to be involved, but it feels inappropriate to beg for beta access when it’s not clear whether or not testers are something that’s required at this point.

At this time, we are collecting interested persons for possible inclusion in an external beta in the future. If you have expressed interest in a comment or email, you’re on that list; nothing more is needed.

I’d also want to be part of the beta team!
I’m not a hardcore gamer anymore, since I have a job and a life, but I’m a Linux system and network engineer, so my work is to resolve OS-level problems every day.
Maybe I could be of some help.

I’m pretty sure I read somewhere that Valve’s main reasoning behind programming for OpenGL was that it is just as capable as DirectX, if not more so, that it is open to more platforms, and that it can be tailored to your liking. The problem is finding someone experienced in OpenGL.

I’m a zombie fan by all means, but I only use Linux, so having this port should be a game worth buying.

Also, to those crying foul, I have no issue with either valve not releasing source, or valve’s DRM. Why?

Because the problem with DRM is that it restricts what device you can run things on; in the case of games, this is not an issue, since they have to be ported anyway. There is nothing wrong with stopping piracy, which DRM helps to do (although not very well); it’s only when it interferes with legitimate use that it becomes an issue, which this does not seem to do often.

Further, while I would highly appreciate Valve releasing source (just engines, not game data), I don’t have the same expectation that they do this. Valve releases games, not programs that manage our money or handle our communications. We aren’t required to play these games to earn income; they aren’t part of our lives in the way our OS is. If your OS spies on you, you have no expectation of privacy; if a game spies on you, it knows how well you aim. Games are also not reused in the same way normal programs are; although they benefit from each other’s work, the benefit is less pronounced.

However, I really hate having blobs for drivers, please convince AMD and NVIDIA to open source their drivers, it would be better for users and game developers alike.

In a perfect world, all game engines would be open source, but we can’t always have a utopia.

This is probably a minor thing, but important for me personally: does the Linux version handle multiple monitor setups correctly? I’ve had weird issues with my two-monitor setup and native Linux games, like Warsow and OpenArena (e.g. half of the game window on one monitor, half on the other).
Might be a problem with my configuration, or maybe it’s just tricky to get right.

Just my own experience, but I’ve found that the configuration can be tricky to get right but that once you get it right it tends to work better than windows. Actually, I think that is a common theme in windows vs. linux.

P.S. Wine works freaking awesome with dual monitor btw, setup two virtual desktops and bind one to each monitor and it is nearly perfect. Used to play Eve Online with two simultaneous accounts that way. Had to have a third party tool (Evemon) to do it on windows, and it didn’t work quite as well.

Actually that’s a decent point. Basically every Linux game I’ve tried will launch on whatever monitor it bloody well feels like. I guess there may be some setting somewhere (like in Windows, where I believe it hits the primary monitor), but it’d be nice if you could just tell Steam “launch games on monitor x” as opposed to faffing about with other stuff.

10098: there are usually two modes of operating two monitors on Linux. The first is TwinView, which stretches a virtual desktop across two monitors. To all video games it looks like one big monitor, which is why you are having that “half the game on one, half on the other” problem.
The second mode is “Separate X Screens”, which allows you to use the second monitor with a completely separate instance of your desktop environment; you can even run KDE on one monitor and Gnome on the other. The downside is that you won’t be able to drag windows from one monitor to the other.
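For reference, a separate-X-screens setup is configured in xorg.conf with something like the following. The identifiers are illustrative, and the exact Device/Screen wiring varies by driver and by whether both outputs are on one card:

```
Section "ServerLayout"
    Identifier "DualSeparate"
    Screen     0 "Screen0"
    Screen     1 "Screen1" RightOf "Screen0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    Monitor    "Monitor0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Card1"    # or a second Device section for the same card
    Monitor    "Monitor1"
EndSection
```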

I’m not sure about those specific games, but that sounds like one of several reasons I run my games windowed.

First, are you using nVidia TwinView? I haven’t confirmed, but it’s possible you get that effect because TwinView announces the merged desktop to XRandR as a single monitor.

Second, have you tried setting up an xorg.conf with a MetaModes option? That’d allow you to add “one monitor at resolution X, the other turned off” as an option for fullscreening.

(Last time I used that, games Just Worked™ when you did that because it was closer to their desired aspect ratio than the merged desktop… but it periodically throws my desktop sticky notes into disarray so I don’t use it anymore.)
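As a sketch, the MetaModes idea mentioned above goes in the Screen section of xorg.conf; the syntax below is for the nvidia TwinView driver of that era, varies by driver version, and the resolutions are illustrative:

```
Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    # First metamode spans both monitors; the second drives only one
    # and turns the other off, which fullscreen games can then pick.
    Option     "MetaModes" "1920x1080,1920x1080; 1920x1080,NULL"
EndSection
```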

My approach these days is to set the game to run windowed at the resolution of one of my monitors, then set my window manager to automatically remove the window borders, set always on top, and position it to fill the one monitor exactly.

As a bonus, doing so lets me leave something visible on the other monitor so I can keep an eye on it.

We are already in contact with other game developers and building a list of those with interest in porting existing games in their current catalog to Linux. Some of these companies already have Linux versions available.

Great work on improving performance. Could you comment on image quality?

Early reviews of your ports to Mac OSX showed poor performance as well as poor image quality. If Linux is achieving a higher framerate but image quality is inferior then it isn’t a fair comparison vs Windows+DirectX.

I’m also curious about this. With the last Humble Bundle pack I bought, many (or all) of the games were cross-platform games so I registered the Steam versions but also downloaded the Linux binaries.

It will likely be up to the game developer to submit their Linux version to Valve which I doubt will be a problem; if they already have a Windows and OSX version on Steam then I doubt they’ll be too hesitant to submit their Linux version. As someone who has been spoiled by the ease of using Steam, it will be so much nicer to be able to install and run games via Steam in Linux than by doing everything manually.

Also curious. The team’s efforts to have some linux-compatible games at launch is admirable, and I appreciate it. But to my knowledge, there’s already a decent number of such games–as has already been said, the Humble Indie Bundles are prime candidates, since many of those games already have Steam Keys.

I don’t have a huge issue doing things manually, but my directories do get a bit… cluttered sometimes. I’d love for the team to release at least the client for testing. I’m interested to see how it performs on other distros than Ubuntu.

The difficulty in supporting Xorg is only on the side of the window manager. It’s not particularly difficult to support as far as desktop applications are concerned.

Neither Xorg nor Wayland have very much to do with OpenGL rendering performance.

The issue with nVidia and AMD supporting Wayland is that they’d have to re-architect their driver quite significantly to deal with the new kernel interfaces (not a part of Wayland, but what Wayland is built on top of). There are some tangential benefits in doing this but it’s really just a bunch of work.

So the Linux version runs faster than the Windows version?
That’s good news!
And the research on OpenGL succeeded in speeding up the Windows version too.
It all means something: Linux has a lot of potential!

What do we have to do to have you guys port this stuff to a “desktop” ARM processor and use OpenGL ES instead of OpenGL? Granted, you might lose some graphical quality (I’m not sure ambient occlusion or other advanced Source features added after the initial Half-Life 2 ship would be prudent, but the games themselves did run fine on my low-end Pentium M with a not-so-great Mobility Radeon 9200 graphics card).

Devices like (plug, plug) the Efika MX would make great little gaming boxes. Things like Torchlight or Half-Life 2, or Portal would work fine if you weren’t doing it for the special graphical effects or to stress out a GPU (intentionally). 800MHz and a low-end AMD-based GPU might not be that awesome, but 1.2GHz, quad core and a high-end OpenCL-capable GPU (as in i.MX6 or Tegra3) would certainly be available. After all, since 300fps is faster than we can even perceive visually, a steady 30fps is what we would like to see for a reasonable gaming experience – and modern Mesa and modern GL support on Linux has GLES layered on top anyway so it’d still work on the big PCs..

One of the dangers of the OSS community is that they are used to getting everything everywhere… maybe we should wait until they have all of their code ported and working on x86. No doubt ARM will definitely be a powerhouse in home computers in a few years, but it’s still at least a few years away from coming into its own in that market.

While it’s admirable that Valve is willing to take such a (relatively) big risk investing in Linux at only 5% of the market share, I think it’s asking too much to try to get them to also invest in a much, much smaller market besides that. Though it could be used as a possible future stepping stone into mobile (I’m kind of surprised Valve hasn’t stepped their portable apps up to be stores as well).

I posted a comment around this time yesterday, but it seems to have disappeared (others have since asked most of the stuff I was asking, so it’s a little confusing).

One part hasn’t been covered that I’ve spotted, so I’ll try again.

Will you be talking about how much success you have with increasing the performance of Direct3D? It may sound counterintuitive to talk about Windows stuff on your Linux blog, but with OpenGL being a significant F/OSS project (and a key technology for Linux game developers), it seems relevant.

Have the performance results you’ve cited changed the way that Valve looks at and approaches these technologies? Assuming that future titles (and engines?) will be cross-platform, will OpenGL have a higher priority in general now that you seem to have tamed (or unleashed?) the beast?

I noticed that it was mentioned that you’ve been in touch with existing developers regarding getting their Linux games published – it would be pretty neat to have a future blog post about how that process has been and what kind of launch titles we might be able to expect.

Super excited for the switch, gaming has always been the #1 reason I’m not primarily using Linux.

Also, once the engine is ported to Linux, wouldn’t that mean in theory it should work on Android (scalability and control issues aside)? Would you consider going down the route of creating an android app store for cross-device syncing of purchased games(apps)? Considering you’ll have steam on Linux, it seems like a logical step.

I just wish there was more time in the day to spend on tools, to release something even half as polished/accessible as the Unreal engine, so you could promote/sponsor more stuff running on Source and get more third parties (especially indies) using your engine. As it stands, while Source can now run on Linux, most of my Steam library doesn’t.

Either way, really looking forward to the future, and great work as always!

One thing about open source drivers for NVidia and collaboration on OpenGL and OpenCL on NVidia GPUs: there is their current partnership with the Blender Foundation, sponsoring Project Mango (http://mango.blender.org). The Cycles render engine seems to be a good way to do real-time rendering with OpenGL.
Check it out.

You are welcome. I think the role of NVidia in that was almost just giving them the hardware to test and benchmark, and releasing some specifications; I don’t know the extent for sure yet, as the Blender Foundation website is not very clear about it. It could be useful or not; you had better contact Ton Roosendal, the head of the foundation, because he’s the one managing it all…

Will the released deb file be created under Ubuntu or Debian? If you do it under Debian it can still be installed on Ubuntu machines but you also have the added benefit of being able to install on other Debian based distributions as well.

Better yet would be a .bin file like the installer for ETQW… that would allow installation on many more distros, like Slackware.

However, installing the game will probably be done directly by the Steam client… possibly the game itself will be a static build, which would solve a lot of problems between distros… the Steam client would be in a .bin…

Yeah… and the build should be static, of course… ETQW continues to install and play well on any distro, AFAIK precisely because of that… static builds solve a lot of problems when it comes to Linux cross-distro compatibility.

That might become more and more wasteful as games get ported to Linux… Maybe there should be an option of using native libraries? Or possibly a shared set of libraries used only by Source games on that system. I’m interested in how Valve will deal with this; maybe it could be a separate Steam package. The guys at Google seem to be doing it right. Although this seems unrelated, their Android SDK contains its own package-management system (much like Steam) and functions perfectly across all sorts of distros. I’m not sure how they do this or whether it’s of any use here, though; those tools aren’t as hardware-dependent or resource-taxing as games.

Most of us sensibly put games in /usr/local/games anyhow and have that $PATH’d accordingly.

This would create a Steam dir that would update OTR, as would all games, so the issue with deps / package-management evil is reduced / moot.

Why add the layer in the first place?

A decent logging system for a certain lib / driver or whatever should be able to tell you “hey, your abc library is out of date!” Perhaps the Steam client can do a quick sanity check to make sure it’s got the bare minimum requirements?

Yes, I’m of the same opinion… the Steam client should check the bare minimums, and if not install what’s needed directly by itself, at least inform the user that something is wrong and that they should upgrade package A or B.

Steam integrates with Windows. Why shouldn’t they integrate with Ubuntu? And they should integrate with every other operating system too, including Fedora, OpenSUSE, or any other operating system they may choose to support. Trying to create a one-size-fits-all solution will only end up creating a one-size-fits-nothing problem as it will be crappy everywhere instead of being crappy most places but working really well in some places.

+1 for the .bin/.run installer and statically linked libraries. Valve has absolutely nothing to lose by doing so (in fact, statically linked libraries and .bin installers guarantee maximum compatibility across various distros), and while some people want to install to /usr/bin/games, there are advanced users who wish to install Steam to a location of their choice (i.e. /opt, which may be mapped to a separate hard drive or even an SSD).

If removing overhead is a goal, isn’t that literally one of the main points of DX10 and onwards? I know Valve is seemingly big on backwards compatibility, but a DX11 path for those on Vista or newer seems to fit the slot. Is it something potentially in the cards for the future, or will the DX9 path still likely be the preferred Windows option for the foreseeable future? For example, World of Warcraft, also seemingly CPU-bound much like Source often appears to be, received a DX11 path and a 64-bit version — both of which improved the scenario by quite a considerable chunk.

Second, this is extremely good news, and I’d like to know how the Linux OpenGL renderer is different from the one that already exists for OS X Source engine games. Is it completely rewritten, or just a Linux-optimized copy of the OS X one?

And something completely different: if we know there is a game on Steam whose devs are interested in porting over to Linux but might need the right incentive and/or help, would the Valve Linux Team render assistance, and what would that look like?
I know you are already reaching out to game devs who have existing Linux versions but it might be smart to reach out to interested parties as well.

I’m very interested in beta testing this too. I may not have the raw talent of breaking things to find bugs, but my luck is pretty terrible sometimes, which I’m sure is very useful for identifying and reporting bugs.
Thanks,

I’m interested to read that the rendering is achieved via a D3D > OpenGL abstraction layer, especially as it appears to be faster to do that than native D3D usage, which seems somewhat counter-intuitive!

I’d be interested to know though if your abstraction layer is generic enough that it could be applied with the same success to non-source-engine games, since that would be a nice big door to open, unless that’s something you don’t wish to discuss right now! The prospect of an almost no-effort-needed wrapper that achieves native performance is a good one.

Although the abstraction layer is currently quite powerful don’t think of it as an “almost no-effort-needed” layer. A lot of time and effort was spent to get the layer (and L4D2) to the point it is today and we still aren’t finished.

Sorry, I wasn’t meaning to imply it was easy to create what is there now. I was more suggesting that once the vast majority of the complicated work is done on creating the abstraction layer, would it then be usable almost in a vanilla state with other projects, or does each project need substantial changes in order to use such a layer? Is it overly simplistic to think of it as a library that exposes a DirectX API that converts to OpenGL on the fly?

I find the prospect of Linux/OSX being a viable gaming platform pretty exciting. Would it be valid to compare what you are doing to be similar to what Transgaming’s Cider does?

I don’t think that Wayland would make any difference. The modern X11 protocol is designed with extensibility and compatibility in mind; it took years, but it’s so good that programs written ten years ago can run on modern servers. Wayland currently looks just like a limited version of Xorg, something like the X Window System in its early days. Even those few programs written for Wayland break when the protocol changes in a new version. And some of its architecture decisions also look weird (no external taskbar/dockbar possible, no window managers, no decorators and hence no common desktop theme possible). Currently Wayland’s popularity is based on the “X is bloated, Wayland will be better” words spread all over the internet, but it’s not really clear why Wayland should be any better.

Talking about Xorg alternatives — how about checking out DirectFB? No xorg, no wayland, no intermediate servers, just the hardware and you. Imagine a Steam/L4D2 livecd, something that you can boot from and get directly into the game, no need to install/configure, everything is already done for you, just boot and play.

And I would also love to see some comments about closed vs. open drivers for AMD/nVidia.

In this hypothetical situation it does not have to be a LiveCD (USB is okay), but it doesn’t really matter. If there was a patch you would just burn another disc, like how you would update a Linux LiveCD with new packages anyway.
It’s a good idea, but it would be irritating to have to reboot just to play a different game. That is one of the points of an Operating System. A Steam version of Ubuntu may actually be viable though, just boot directly into the steam client like JoliOS or ChromeOS. You could then tailor updates specifically for gaming and Steam software.

About numbers and proportions: it sounds like there are three areas to improve: Linux kernel behavior, OpenGL coding, and video drivers.
I was surprised that nothing was said about compilers:
– the mainstream compiler used on Windows is, to my knowledge, MS Visual C++
– on Linux it is GCC
As far as my knowledge goes, both are great optimizing compilers but generate different performance profiles; the tightest loops are not always so well optimized, so you have to use other tricks: basically assembly loops or intrinsics.
To me this means one of the following:
– the tightest loops are written in assembly, making the compiler of little relevance, as you know what the final code will be
– you’re using the same compiler vendor on all platforms (Intel compiler!? GCC/MinGW on Windows)
– you’re using the best compiler (and best compiler options) on each platform, and you notice that in the end the performance is similar
– or simply: the compiler is not that relevant (memory behavior aside)

Ubuntu 12.04 was only released a couple of months ago. A more realistic comparison would be to Windows 8, which is now RTM. If the most recent Linux can only beat a now-ancient version of Windows by 3%, then Windows 8 would absolutely kick Linux’s back doors in:

1. When Windows 8 is released to the general public (October 26th), Ubuntu 12.10 will have already been out for 8 days, so that would be an even better comparison.
2. Ubuntu is not and has never been the most recent Linux. For that, there are “bleeding edge” rolling distributions like Arch.
3. The benchmarks in this blog post are not Windows vs Linux. They’re how well L4D2 is optimized for Windows vs how well L4D2 is optimized for Linux.
4. L4D2 on Linux is 3.8% (which rounds to 4%, not 3%) faster than the Windows OpenGL version, and 16.4% faster than the Windows DirectX version. OpenGL generally works about as well on either OS (most of the driver code is the same), so I wouldn’t expect much change there. I expect DirectX performance to be brought to OpenGL levels, but that’d be from Valve further optimizing their DirectX rendering path and drivers being optimized, not a new version of Windows being released.
5. How would Windows 8 improve performance of L4D2? Last I checked, the Source engine only uses DirectX 9 and below, so Windows 8 coming out with DirectX 11.1 (which will probably also be available on Windows 7 anyway) shouldn’t change much.
6. The article you linked to seems to be about adding 2D hardware acceleration to apps using the Metro design language. Not only is Microsoft rather late in adding this feature to their user interface (open source hardware accelerated UI libraries have been around for years), but this 2D acceleration is completely irrelevant to 3D programs, such as L4D2.

While I understand OS market share, I’ve never understood why Linux/OpenGL didn’t become the preeminent PC gaming platform years ago. Perhaps due to your efforts (and the ever-increasing nature of the Microsoft OS) it will be in the near future?

Your conversion of the Steam client and Source engine will directly benefit Valve games by bringing them to another platform – one more popular than many believe (gaming support is the only reason it isn’t my only platform). But in order for Steam to fully benefit from this work, more developers of major titles (at least those selling on Steam) should be taught/encouraged to perform the same work on their games. Do you have any plans to create a “Linux port bible” or any similar device to help spread the adoption of Linux as a major gaming platform?

Thanks for the suggestion Steve. We are documenting as much as we can about our experiences and discoveries. Currently, we hope to post the majority of our findings on this blog and are looking into other alternatives for the future.

I am unsure if the team working on Steam for Linux has worked with gaming-capable laptops with Optimus technology (which enables laptops equipped with NVIDIA cards and Intel HD graphics to switch between one and the other); however, as far as my knowledge goes, NVIDIA has not given any help to those gamers. You might want to check out project Bumblebee, which does the job NVIDIA didn’t do… I hope this won’t be a problem for Steam, because I am one of those gamers (ASUS G74SX with Optimus) and I have to dual boot because only the Intel chipset works properly under Linux (Ubuntu 11.04). Thank you for your great work.

Thanks Valve! You’ve just erected the titanium framework to effect the largest mass exodus to the Free/Libre GNU/Linux operating system (in the Desktop/Notebook role) in the history of Technology!! I mean, who wouldn’t want to gain nearly 20% in performance by switching to something that is licensed as absolutely FREE!! Outstanding!! Here are a few suggestions for success with GNU/Linux: 1.) Figure out how to apply the Android open-source factor for developer frenzy to the Steam platform, 2.) In the same way that your major industry pull gives you obvious command over the graphics card makers, convince every OEM (and also System76, ZaReason, eightvirtues, etc.) to provide a specific Ubuntu GNU/Linux-based open-source Steam-preloaded gaming computer (notebook and desktop versions), branded as such, for the masses to embrace the rocking, newly-available 20% worth of awesomesauce, and 3.) Figure out some very convincing incentive for game producers to port their games to GNU/Linux on the Steam platform ASAP (this, I think, would be easily done after step 2 above is achieved). Way to revolutionize Steam! You Rock! These are exciting times for Technology indeed!! Humans Enabled – That’s what Technology is For!! Get your GNU/Linux on!!

I would also like to be a beta tester. I’ve got a dual-boot PC. I have Left 4 Dead 2 and it runs OK (60-80 FPS) on my Windows 7 computer. What will the Left 4 Dead 2 minimum requirements be on Ubuntu?

With regards to the per-batch overhead in Direct3D 9, isn’t that caused by syscall overhead? IHV drivers for D3D9 live in kernel space only, and the whole user-space portion is provided by Microsoft; vendors have no control there.

In OpenGL, vendors have “half” of the driver in user space, so they can mitigate this by marshalling multiple operations and waiting for the right moment to make the syscall.

Just wanted to ensure I’m added to the “interested in beta” list. I believe my past experience with both Linux and gaming could be of help to your efforts, and I would do my best to provide useful feedback. Thanks for your consideration.

I’d be very happy if Steam on Linux had support for (or at least grudging acceptance of) tiling window managers, such as awesome or xmonad. I understand that dealing with a window manager which takes control of the shape of your window isn’t trivial, so I won’t hold my breath for the initial release, but I kindly request that this be evaluated at some point. Thanks!

Hey it’s me again. I have a question about being allowed to bundle just the steam client with a Linux distro. e.g. it comes preinstalled with say Linux Mint or even a custom made Linux distro for Steam.

I may be a little late to this comment thread, but I was just curious. It seems to me that the one big complaint from Windows developers transitioning to Linux is that the development tools seem to be lacking, that there seems to be no good enough substitute for Visual Studio. I was just wondering if you were using the traditional Linux development tools (e.g. gcc, make, gdb) or something more Visual Studio-like, such as Eclipse. I would understand if it was a trade secret though :).

Well, I highly suggest Qt Creator (with a .pro file rather than a Makefile import). It’s a very good IDE with very nice features, growing very fast; soon it will come with Clang support, which will bring lots of features on par with Xcode and VS IntelliSense. What do you think?

If you still need more beta testers, I can be one of them; I will be glad to help you guys and the community in anything I can. I have Steam for Windows too, with almost 40 games, and I am a long-time user of it. Here are my specs:

This question may not be answerable, but if anyone is at liberty to speak up it’d be an interesting answer. For games that already are available on Steam, and already have stable Linux ports, might Valve solicit the dev/publishers? One prime example being Doom 3 which has had Linux binaries for a while. For Steam that seems like low hanging fruit to have more titles to play with.

Dear Valve: I notice you have some pretty cool Valve-games-related apparel at: http://store.valvesoftware.com/index.php?t=1
Any chance we could get some kind of “Steam’d Penguins (Linux)”, or Steam+Linux, or “Linux Steam For Speed” branded apparel available for purchase? I would gladly wear some awesome T-Shirts that promote both Linux and Steam. Thanks!

Will Valve make use of ASTC in the Linux port of L4D2? I believe the NVIDIA drivers for Windows and Linux supporting the latest OpenGL 4.3 should be up soon… it would be fun to see Valve make use of ASTC instead of S3TC and show whether it helped reduce the size of textures streamed to the GPU under Windows and Linux. I’m not a graphics person, but they seem to imply that switching from other formats like S3TC to ASTC for compressed textures will result in higher visual appeal, though perhaps they mean it enables other effects operating on textures that result in higher visual appeal. Either way, the Linux port of L4D2 and Steam has given us a nice view into development at Valve, and it would be great to see if ASTC further improves the situation.

Something to keep in mind: there is no hardware support for ASTC yet, so using it would not be helpful.

Also, only the proprietary AMD and nVidia drivers (will) support ASTC, and the proprietary drivers already support S3TC, so there is really no reason to switch away from it.

Finally, ASTC is not a completed standard yet. There is a reason it is only an optional part of OpenGL 4.3, and anything made with the current spec may not be compatible with future specs. In all likelihood, it’s going to be another 3-5 years minimum before we see the standard adopted (and most likely not until it’s supported in hardware for DX12-13, for cross-platform support).

Fantastic work! This post said a lot for me. Here’s a test of my machine running Amnesia: The Dark Descent on Wine (on Ubuntu 12.04) with an NVIDIA 310M, and I can max out everything. I did a test for both 64-bit OSes (Windows and Ubuntu):

I have long been interested in Linux but have been put off by (its? my?) inability to install it.
I am technologically competent, yet I cannot seem to get any version of Linux to run completely on any system. I have tried on and off for years: Mandrake, RedHat, Ubuntu, Puppy, DSL, and dozens of other distributions on dozens of media, dozens of versions on many builds. Occasionally I will succeed in booting to a desktop but in those cases there is always the issue of drivers: never does all my hardware work.

I have friends that are ‘into’ Linux, but they have never shown me a fully working installation either. I have heard many and varied claims of its wonders, but every time I ask to see their Linux install there is a convenient excuse for why, at this moment in time, one aspect or another of the system isn’t quite functioning.

The only working Linux I’ve seen is on my android phone, a radical departure from more traditional flavors.

I am excited and glad to know that Valve is working with hardware manufacturers on driver support, but I hope they are also working on the installation and distribution side of things. It would be wondrous if this operating system could operate my system, because as things stand today only Windows can operate a wide variety of systems.*

*(no, recompiling the linux kernel so it runs on your watch does not mean it can run on a wide variety of systems)

Sorry but it isn’t Linux that doesn’t work with a wide variety of hardware but the hardware that doesn’t work with the wide variety of Linux. This is an immensely hard fact for most technically competent users to understand.

Windows doesn’t do drivers for hardware not made by Microsoft, and when end users run into problems on Windows, “update your drivers” (the ones made by developers targeting Windows, who know the hardware best) is the mantra.

You can build an amazing system with cross-operating-system support if you research what you’re buying. If the hardware claims to be supported on Linux, you can almost be certain it’ll be supported on Windows and in the end, a sweet dual boot computer.

It only takes a single piece of hardware (just imagine any piece) to be unsupported for your experience with Linux to be considered an epic fail but when all the pieces come together, it certainly feels like an epic win.

Good luck, I hope you give it another try sometime and I hope it works out for you.

Well man, I don’t know what you mean by “…inability to install it.” I use Ubuntu, and I never had any issue installing it on my system. From the very first time I had a very easy experience: I put in the CD, typed some information, and that was it.

I agree that in the past Linux was very difficult to install, but things have changed a lot. I don’t know when you tried to install it, but today it’s very simple indeed. Even the driver scene is changing: for example, I tried to install my HP 840C Deskjet printer on my sister’s Windows 7 laptop and it simply couldn’t do it, but on my Ubuntu 12.04, all I had to do was connect the USB cable and, to my surprise, the system recognized it immediately; I didn’t have to install any driver.

If there’s someone to blame for lack of driver support, it’s the manufacturers for sure, not Linux. But with actions like this one from Valve, things will improve. Don’t give up, man; you can try newer versions of other Linux distributions. I use Ubuntu 12.04 64-bit and can’t complain.

Having used FreeBSD and various linux over the years, by and large this hasn’t been my experience: drivers are re-packaged into new releases of many distros, and so my typical experience (driver-wise) is actually better than Windows, because usually the only 3rd party drivers I need to download are graphics drivers (and only for 2D/3D acceleration), everything else works ‘out of the box’.
Which isn’t to say that your experience is somehow ‘wrong’, only that such anecdotes can’t be generalized.

The first Linux I ever put in that ran everything fine was Kubuntu. I was coming straight from Windows and was used to the look and feel of Windows, which is something Kubuntu does quite well. I’ve heard similar stories with Mint, but alas it wasn’t the same for me (it couldn’t find my wireless network card). Now I use neither, but the point of what I’m saying is that transitioning is always kind of tough, because it’s not what you’re used to. When you come across a problem you might not want to deal with it, because it’s something you wouldn’t normally have to do. After years of using Linux, the same thing happened to me when I had to fix a problem in Windows! I went to a friend’s house and his computer wasn’t finding their network. He asked me to take a look at it, and my mind drew a blank; I had no clue what I was doing and it just seemed like a hassle. It had been so long since I’d had to put in drivers or anything like that for Windows (in Linux I usually connect the computer to the internet and let the updates roll). When it comes to transitioning, just do it. Don’t think of it as a switch; think of it as an addition, and keep your Windows side by side for the things you do in Windows. It won’t be long before you’ll have “things” you do in Linux that you’ll prefer Linux for.

Unfortunately, your experience is the exception, not the rule. That’s not to say that Linux is compatible with 100% of the hardware it encounters, but the advancements made over just the last 4 years alone have been significant, especially with the *buntu series.

I had to install Ubuntu recently, and after the installation reboot, everything just worked. Display, network, everything. My friends that install Ubuntu, Mandriva, and other flavors have the same results. It just works.

Also, what is your definition of “fully working”? Is it the system time resetting every reboot, or your PC emitting sparks every time you type on the keyboard?

I’m looking forward to hearing more about the optimization effort on the blog: I can see how -fPIC might have been a bottleneck on an engine like Source, which has lots of dynamic linking.

One point that wasn’t clear from the summary: when you talk about multithreaded drivers, are you talking about the drivers being multithreaded internally or accessing OpenGL from multiple client threads?

It’s great that there’s communication between you and the driver devs, since most proprietary drivers on Linux are quite sub-par compared to their Windows counterparts. I don’t think this has happened since Doom was released on Win95, which started the whole DirectX monopoly. I hope this project won’t end up like the 64-bit version on Windows, which did run faster when it was first released, but eventually the only good thing left in it was the soundtrack of HL2 in .wav format hidden in the GCF.
I’ve been dual booting Ubuntu 12.04 and Windows 7 on both a laptop and a desktop system for a while now, so I’d be up for testing.

All this work is great. I love Linux.
The problem is that when this is all done, we need more than a couple of games, and not a cutoff in support once Windows 9 comes out and fixes some issues. So the question that needs to be asked now is how to get games and gamers onto Linux.
Developers: give them a larger cut for games sold on Linux (this will pay for the added cost of development).
Gamers: make the games cheaper (a couple of bucks is all that is needed); also provide a copy of Linux with the Steam store preinstalled, available on the Steam store and website.
Just some ideas, but I am wondering what the plans are for luring in developers and gamers. The “if you build it, they will come” plan really only works in the movies. People are far less computer literate, and too scared to change without a push. Developers won’t put the money into Linux development without a payoff.

I’m also interested to see how they’d play on the new range of AMD APUs. While not ostensibly gaming chips, I’ve been able to play some older(ish) games on one when running Windows 7.
Given that Lenovo and Samsung are shipping machines rocking these chipsets, and they’ve started to represent a small but significant share of Steam’s userbase, I’m curious to see how they’d perform under Linux.

I am not sure why it comes as such a surprise that the game runs faster on Linux. Over a thousand people submit patches and updates for the kernel and drivers. The kernel has been updated, not scrapped and redone as on Windows and Mac. Monolithic kernels by nature outperform microkernels: certain performance hacks on microkernels come automatically on a monolithic one. The main argument for switching to micro was stability, but that’s something Linux has always been top dog in.

I’m going to back dimgel’s comment here. Humble Bundle has shown that Linux users are usually the most willing to give money for products.

Anyway, in regards to the Linux graphics stack, I think things are looking relatively positive. With Intel and AMD both backing open-source efforts, the momentum the open stack has seen in the past two years has nearly doubled. People often say AMD doesn’t contribute, but they release the needed technical documents on their GPUs and have a staff just for open-source graphics, on top of the existing Linux team for their binary drivers. That being said, centralizing the 3D core in the Mesa/Gallium3D projects was probably the best thing to ever happen to the Linux stack, next to moving the graphics drivers out of the X11 tree.

Anyway, rant aside: why Ubuntu of all distros? They are the most hostile towards upstream. They’re one of the few distros who try to “Think Different(tm)”. It would’ve made more sense to collaborate with Debian or Fedora before collaborating with the Ubuntu guys.

But you know what I suppose that’s a “whatever” factor in the end. Valve can probably push Steam to other distros after focusing on one Linux distro. I just feel iffy about them backing Ubuntu as the main distro, though..

Well, Ubuntu is one of the most popular mainstream Linux distros, and Valve isn’t going to waste their time initially porting to a distro with a low market share. Of course, down the road, I’m sure they’ll support as many as possible, but they need a strong start, and the Ubuntu install base will provide just that.

Your findings with regards to OpenGL vs Direct3D are in line with what I’ve seen in Blizzard’s World of Warcraft community postings. That team built support for OpenGL into WoW from the beginning from what I understand, making it easy for Linux gamers to get it working under Wine. It turned out that the Linux players often saw better overall performance than the Windows players. I think the reason it didn’t catch on within the player base, though, was that some graphical features (such as sunbeams and some shader effects) were never implemented in the OpenGL wrapper, or weren’t implemented until much later after Direct3D.

I agree completely. In 10 years, I’m willing to bet that we will look back and point to Steam’s Linux port as a real sea change in the state of the Linux desktop.

It would be interesting to see a comparison between the closed-source drivers and the nouveau and radeon drivers (open-source drivers for NVidia and AMD cards, respectively). It’s more than just a difference in ideology: the open-source drivers are part of the kernel, come with most distros by default, and are updated alongside other system packages. The user experience is so much cleaner with the open-source drivers. I feel like it would be an all-around win if this effort resulted in better open-source drivers as well!

In either case, this is definitely something Valve can be proud of; contributing not just to computer gaming but to the whole desktop computing ecosystem!

I hope this is true; all my boxed Linux games from Loki say otherwise, but that was 10 years ago. Then again, we now live in an age of digital distribution, so the economics of game publishing are totally different, and Valve is porting their own games, not licensing other people’s games.
Either way I look forward to Valve’s efforts.

Blizzard has had an internal WoW client for Linux for years. They developed it for testing and software-quality purposes, but they never released it because the QA and support costs wouldn’t be covered by the income.

Why is this relevant again? How is it itself relevant?
This is not going to happen to any game worth its salt, especially since, from the information available, Linux users tend to make income per player rise steeply with how good the game actually is.

Point is, there is a large enough gaming population (which has increased quite a lot in the last 2 years) to support something like WoW, which would only need around 15,000 subscriptions for basic service at a price break.

First, Steam has already been collecting Wine version info for a long time, so I believe Valve has much more info about the potential Linux market than you and I do.

Second, from my own experience, it’s Windows users who tend to freeload. Those moneyless kids (students, office workers) who care about nothing but leisure would never bother installing even Ubuntu – they just use what comes with the computer, or ask some guy to install pirated Windows for them (they don’t know anything else) for food.

The opposite could be argued.
With decreasing player numbers Blizzard might be more interested in finding niches with new players.
Sure, Linux desktop market share is only a percent or two. But probably with disproportionately high interest in getting a game running, and willingness to pay for it (Linux gamers have consistently paid more for Humble Indie Bundles compared to Win and Mac gamers).
And porting a game involves only a fraction of the cost of developing it in the first place.
Sure, support could be a problem, which is why it makes sense to start off with just Ubuntu: it’s the most popular desktop distro by far, and picking an LTS release like 12.04 simplifies support issues.


Initially formed in 2011, the Valve Linux team is currently 11 people and growing. Our mission is to investigate open source development with a specific focus on supporting Steam and other Valve products on the Linux platform. The Linux background of our team varies from those who have a deep knowledge of Linux development to those who have just scratched the surface. However, one thing we all share is a great passion for supporting all things Valve on Linux. We are currently seeking experienced developers for Mac, Linux (kernel and driver), and OpenGL to fill our team. If you match these qualifications, please let us know.