On Phoronix I always see "zomg, $magicfeaturebuzzword is still not implemented while the next-next-next-generation hardware is out". I don't even know what a "Stippled Primitive", "Tessellation Shader Stages", or "Piglit spec/glsl-1.20/execution/variable-indexing/fs-temp-array-mat4-index-col-row-rd hangs" is; 99% of computer graphics is just technobabble plus magic from beyond my universe. I am a computer scientist, but I majored in a very different field. I do have a deep understanding of operating systems, but to me the VGA card is just the thingy that draws pictures when you need them.

Could you recommend a book or summary paper that explains all the techniques (aka technobabble) and how everything works together, without programming examples and with only the essential math? I do not intend to become a developer; I just want to understand what everyone is talking about, which is why I want the book to spare me the practical details.

By the way: with three developers or so this will obviously take forever. And even if you are a programmer, you cannot do the job, because you also need knowledge of computer graphics. At my university there are at least three master-level lectures on that topic (maybe more) and one chair dedicated solely to it (plus others for embedded devices, hardware, and operating systems), so people with the proper knowledge will be very rare, because they need expertise in all these fields. And even when someone has this knowledge, it does not mean they will ever get involved in driver development: I would not do it, because with a degree like that I would do research; maybe if you are bored enough (haha) you could do it as a hobby.

So we have no option but to be patient, and we should never buy the latest cards on the market, only 5+ year old ones. Just buy a game console and use the Linux PCs for work and basic multimedia (which saves energy and money, because a 50-buck fanless card has enough power for that). It is very simple ;-)

That's one approach...

Originally Posted by mark_

So we have no option but to be patient, and we should never buy the latest cards on the market, only 5+ year old ones. Just buy a game console and use the Linux PCs for work and basic multimedia (which saves energy and money, because a 50-buck fanless card has enough power for that). It is very simple ;-)

Well, actually there is another option: you can buy the cards and raise bugs against them, provided you are also prepared to do some "git bisect"ing and post stack traces.

I've done a fair bit of this, but it did need the card to have at least working 2D support first. Although that might also have been before the days of composited desktops...
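For anyone wondering what the "git bisect"ing involves: here is a runnable sketch of the workflow on a throwaway repo, so the mechanics can be tried anywhere. With a real Mesa regression you would clone the actual tree, mark a known-good release tag, and rebuild/test the driver at each step instead of grepping a file (the toy repo and file names below are made up for illustration).

```shell
# Toy demo of git bisect: build a 5-commit history where the
# "regression" lands in rev 4, then let bisect find it automatically.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name "you"

for i in 1 2 3 4 5; do
    echo "rev $i" > state.txt
    if [ "$i" -ge 4 ]; then echo "BUG" >> state.txt; fi   # regression lands here
    git add state.txt
    git commit -qm "rev $i"
done

first=$(git rev-list --max-parents=0 HEAD)   # root commit = rev 1, known good
git bisect start HEAD "$first"               # bad=HEAD, good=rev 1
# automate the good/bad decision: exit 0 means good, non-zero means bad
git bisect run sh -c '! grep -q BUG state.txt' >/dev/null
first_bad=$(git bisect log | sed -n 's/.*first bad commit: \[[0-9a-f]*\] //p')
git bisect reset -q
echo "regression introduced by: $first_bad"
```

With a driver bug the "test" step is manual (rebuild, run the app, mark `git bisect good` or `git bisect bad` by hand), but bisect still narrows hundreds of commits down to the culprit in about a dozen steps.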

This surely helps the developers find bugs, but I don't see how it speeds up the development of new features. If an application needs OpenGL 3.3 and my card only supports 3.0 yet, and thus shows garbage on the screen, then I could file bugs for all the missing 3.1, 3.2, 3.3 features (assuming I even know about these details - which I don't - and know exactly which features my app needs, which I also don't, especially if the app is commercial closed source, 32-bit, and runs on Wine; there is no chance of providing useful information regarding the graphics driver or Mesa).


You can't do much on the feature front unless you hire developers to do it for you.

Am I the only one who is HAPPY with the Mesa/Gallium/ATI stack? Power management works here (with the echo power_profile stuff), and all of my games run at over 60 fps, mostly 100+ (using an XFX 6670 on a 4 GHz Phenom II X6). No crashes, whereas Crapalyst crashed DAILY. Sure, it's fast, but you have to put up with your system locking up every so often (which never happened here with the free stack).
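For anyone curious, the "echo power_profile stuff" refers to the radeon KMS sysfs interface of that era. A minimal sketch, assuming the usual /sys/class/drm/card0 paths (they may differ per system, and writing them needs root):

```shell
# Select a static radeon power profile via sysfs (root required).
card=/sys/class/drm/card0/device
profile=low   # other accepted values: default, auto, mid, high

# validate before writing, so a typo never reaches the kernel
case "$profile" in
    default|auto|low|mid|high) valid=yes ;;
    *) valid=no ;;
esac

if [ "$valid" = yes ] && [ -w "$card/power_method" ]; then
    echo profile    > "$card/power_method"    # use static profiles (vs. dynpm)
    echo "$profile" > "$card/power_profile"
else
    echo "skipping: invalid profile or sysfs not writable" >&2
fi
```

The `low` profile is what most laptop users want on battery; `auto` picks low on battery and mid on AC.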


No, I'm also quite happy. Gallium runs stably, no more strange crashes, etc. Only my laptop runs a lot hotter with the current power management... But they are definitely on the right track!

Gallium is stable as hell. The last time I tried Catalyst, it caused constant crashes. One of my work machines has the Nvidia blob, and despite what some people say, it's not much better. I'm guaranteed at least one crash a day, usually more.

If the Radeon driver can manage 30 fps in the Valve games, I'll settle for that rather than deal with the shitty blob. I don't have any issues playing 1080p video without acceleration on my machine, although acceleration would be awesome for my media center. At this point the only thing I'd consider lacking for my own needs is better power management.

I was an Arch user and now I'm stuck with Windows:
the open-source drivers don't let me use my laptop: it fries.
The Catalyst drivers are buggy, don't work with the latest X.Org server versions, have no HD decoding, and HDMI doesn't work.
Bah... crapware.

I don't know what you guys do with your machines. I run Catalyst 12.8 on my HD 6870 and Catalyst 12.6 Legacy Beta on my laptop with an HD 3200 without any problems. No crashes, hibernate/suspend working, no problems at all.
If I run the radeon driver on my laptop, I not only get 15-20°C higher temperatures, but the battery also drains really quickly, even on the low power profile. Unusable for me and for other people with similar laptops.

I'm quite happy with Gallium. It's annoying to have to switch power-management profiles manually, but other than that it is stable and great.
I've written a "service script" so that I can simply run "sudo /etc/init.d/ati {low, mid, high}".
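The script itself wasn't posted; here is a minimal sketch of what such a helper might look like, assuming the usual radeon sysfs path (the file name and location are the poster's convention, adjust for your system):

```shell
#!/bin/sh
# Sketch of an /etc/init.d/ati-style helper: "ati {low|mid|high}".
PROFILE_FILE=/sys/class/drm/card0/device/power_profile

ati_profile() {
    case "$1" in
        low|mid|high)
            if [ -w "$PROFILE_FILE" ]; then
                echo "$1" > "$PROFILE_FILE"
            else
                echo "cannot write $PROFILE_FILE (not root, or no radeon card?)" >&2
                return 1
            fi
            ;;
        *)
            echo "Usage: $0 {low|mid|high}" >&2
            return 2
            ;;
    esac
}

# dispatch only when an argument was given
if [ $# -gt 0 ]; then ati_profile "$1"; fi
```

A case statement like this also doubles as input validation, so a typo never gets echoed into the kernel interface.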

Though I'd like to improve performance: does anyone have info on getting the most out of my card, or is there already a thread with such information?
I've also read something about "floating point textures" or something like that - how would I enable those?

I use OpenSUSE.
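On the "floating point textures" question above: as far as I know, Mesa in this era implemented ARB_texture_float but left it disabled in default builds over patent concerns, so distro packages usually shipped without it and you had to rebuild Mesa yourself. A rough sketch, assuming the autoconf-based Mesa build of the time (flag and URL from memory, so double-check against your Mesa version's build docs):

```shell
# Rebuild Mesa with floating-point textures enabled (sketch only;
# --enable-texture-float was off by default for patent reasons).
git clone git://anongit.freedesktop.org/mesa/mesa
cd mesa
./autogen.sh --enable-texture-float
make -j4
sudo make install
# afterwards, check that the extension shows up:
glxinfo | grep -i texture_float
```

On openSUSE there may also be a repackaged Mesa with the flag already enabled in a third-party repo, which would save the rebuild.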

I'd also like to add my thanks to the Gallium/Radeon/Mesa teams, you guys have done a great job!