
Gaming On Wine: The Good & Bad Graphics Drivers

02-08-2013, 02:30 AM

Phoronix: Gaming On Wine: The Good & Bad Graphics Drivers

If you're wondering which graphics hardware / driver is best if you intend to do much gaming within Wine or CrossOver on Linux, a CodeWeavers developer responsible for much of the Direct3D layer spoke about his experiences with gaming on Wine for the best results...

That is a pretty damning assessment of AMD. Between this and their lack of support for the Southern Islands/HD7000 series cards, I do not plan on making any future AMD GPU (or CPU) purchases. I say that sitting between three PCs, two Linux and one Windows, each with AMD cards in them.

The video itself was pretty good. Sometimes those talk videos are dry and boring, but that one was well-paced. I also appreciate the presenter's honesty in acknowledging where Wine, and Linux gaming in general, falls short.

I really appreciate your assessment notes, Michael.

Comment

This coming from an entirely Nvidia-friendly piece of software does not surprise me.

The latest case is Path of Exile. In the Wine discussion thread on the game's forum, we found out that disabling GLSL use in Wine (via a registry key) leaves all AMD cards no longer Direct3D 9 capable, while Nvidia cards remain DX9 capable. Since the two vendors differ mainly in their custom extensions, I bet Wine uses NV extensions for shaders when it sees GLSL disabled, while AMD has to fall back to the ARB shader extensions, which means only GL 2.1-level shading is available.

I do admit AMD has their own shit to clean up (Catalyst reporting different capabilities when it detects Wine running, in some cases), but the special treatment Nvidia gets from the Wine devs is in some cases too much.
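For reference, the registry key being described is, as far as I know, Wine's UseGLSL string value under HKEY_CURRENT_USER\Software\Wine\Direct3D. A minimal .reg file to reproduce the experiment in a default prefix:

```
REGEDIT4

[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"UseGLSL"="disabled"
```

Import it with "wine regedit" on the file; delete the value (or set it to "enabled") to restore the default GLSL backend.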

Comment


Code:

if (!get_config_key(hkey, appkey, "UseGLSL", buffer, size))
{
    if (!strcmp(buffer, "disabled"))
    {
        ERR_(winediag)("The GLSL shader backend has been disabled. You get to keep all the pieces if it breaks.\n");

Comment

If you don't buy AMD cards and get an Nvidia card instead, you stop supporting the company that at least tries to release documentation and even open source code, and instead support an open-source-unfriendly company. Good job.

Comment

Both AMD and NVidia are failing to support Linux developers and users. The entire reason I bought those HD6000 series cards was, in fact, because of AMD's promises to support Linux.

But here we are, years later, and the AMD end-user experience is actually worse than the NVidia end-user experience. That is my hands-on experience with their proprietary drivers. The implementation matters. As an end-user, I don't give a fk why one experience is better than the other; I just recognize what is.

Ultimately, fk em both. Intel may end up with the best Linux graphics experience here soon.

Comment

Personally, I'm impressed with r600g performance under Wine. I was able to play two Tomb Raider series games (Legend and Anniversary) on an HD 5650 in my HP notebook with mesa-git, and they ran really smoothly. (I don't know if that is great in absolute terms, but I was expecting a lot less.)

Additionally, I tested the Windows version of Heroes of Newerth in Wine and got nearly the same fps (around 30) with both Direct3D and OpenGL rendering (and in the native version too).

Comment


You can find lots of NVIDIA preferential treatment in arb_program_shader.c. Examples:

Code:

/* Always enable the NV extension if available. Unlike fragment shaders, there is no
 * measurable performance penalty, and we can always make use of it for clipplanes.
 */
if (gl_info->supported[NV_VERTEX_PROGRAM3])
{
    shader_addline(buffer, "OPTION NV_vertex_program3;\n");
    priv_ctx.target_version = NV3;
    shader_addline(buffer, "ADDRESS aL;\n");

I did not go deep into the code, just a couple of searches, but you either get the plain ARB_vertex_program/ARB_fragment_program path (meaning ATI/AMD) or the NV_* extensions on Nvidia. I guess that if you deleted all the NV_* handling, it would break the same way as on ATI/AMD.

Comment


Same here. Steam for Linux is working without issue: CS:S, X3, and CS:GO under Wine too. I am using r600g with an HD4850 (Mesa 9.0.2). The only problems I have are a slow Trine 2 and an incorrectly rendered Arma 2: OA under Wine (fragment shaders using too many registers).