You know, running the numbers, it seems that with each Mesa release we're getting further away from being up-to-date with the latest OpenGL spec, which is surprising and a bit scary. Comparing the difference between the release times of the OpenGL specs and the Mesa versions that support them, Mesa 4 was 0.2 years late, Mesa 5 was 0.3 years late, Mesa 6 was 0.4 years late, Mesa 7 was 1 year late, Mesa 8 was 3.5 years late, Mesa 9 was also 3.5 years late, and now Mesa 10 is 3.6 years late... Hopefully this trend can be reversed at some point.

A DBUS interface would be perfect, and would let the various local solutions tap into it (like KDE's KScreen, the battery widget, etc.).

Even though it was all done inside the same company, 3dfx had a similar approach for their Windows drivers:
- the low-level driver, when installed via its .INF, uploads a bunch of info into the registry.
- the end-user configuration software reads the registry and draws drop-down menus and sliders to let the user change settings. These graphical edits change the variables defined in the registry by the low-level installer.

So if you want to hack the drivers to create new settings or new mode sets, just edit the .INF file so it creates new entries in the registry, and the configuration application will automagically show your newest creation.

People, it already works that way. Your driver outputs XML to describe all options it supports.

xdriinfo options 0
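For anyone curious what consuming that XML might look like: a minimal Python sketch that pulls the option names and defaults out of a driconf-style document. The sample XML below is an illustrative approximation of the format (the option names and values are placeholders for the example, not captured `xdriinfo` output).

```python
# Sketch: parse the driconf-style XML that `xdriinfo options 0` prints.
# SAMPLE is an illustrative approximation of the format, not real output.
import xml.etree.ElementTree as ET

SAMPLE = """\
<driinfo>
  <section>
    <description lang="en" text="Performance"/>
    <option name="vblank_mode" type="enum" default="3"/>
    <option name="no_rast" type="bool" default="false"/>
  </section>
</driinfo>
"""

def list_options(xml_text):
    """Return {option name: default value} for every <option> element."""
    root = ET.fromstring(xml_text)
    return {opt.get("name"): opt.get("default")
            for opt in root.iter("option")}

print(list_options(SAMPLE))
```

A configuration applet would do exactly this, then render one widget per option according to its `type`.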

(insert obligatory note how dbus deserves a painful death, so much so that even XML is superior)

Well, 3dfx did back on Windows during 199-something. Of course back then it was just limited to one driver talking through the registry to one end-user application. (Well, actually not only one. There was a small cottage industry of people making "optimised" drivers with different settings in the .INF, and other people making "nicer" configuration apps than the default, all talking through the same registry. So in the end you ended up with loads of applications and drivers, except they were all for pieces of hardware from the same manufacturer, more or less using similar driver binaries (or binaries compiled from the same small pool of available codebases (°)).)

Now we are in 2013, so I think an open standard, followed by all open-source Mesa drivers(*) and usable by all compatible applications, would be worthwhile.
The same way that RandR exposes a unified API for mode setting and multi-monitor setups, and can be used by lots of applications, ranging from pure command line (xrandr, useful for remotely retaking control of a machine with a b0rked screen configuration without losing applications) all the way to nicely integrated applets inside desktop environments. (KScreen is nicely integrated inside KDE, providing continuity in the end-user experience.)

(*): ...and after a while probably also by AMD's Catalyst, given their track record of eventually implementing standards too, while Nvidia sticks to their approach of "we only do the same thing as on Windows". Some on-camera cussing by Linus may be required before they move their asses.

(°): - The DirectX driver was kept binary-only for most of the lifetime of the parent company, though a few people managed to get a copy of the code base and provide patched/fixed builds after 3dfx got acquired by Nvidia. For the rest, most of the "custom" drivers simply repackaged the official binaries with only very rare binary patches.
- The proprietary Glide API was always binary-only on Windows. But later in its life, 3dfx open-sourced its Linux implementation, meaning the latest Windows drivers could actually contain Glide drivers recompiled from the Linux code base.
- OpenGL: a huge, complicated mess. At the beginning, 3dfx didn't provide any OpenGL support, only mini drivers: small "extra-light" libraries that implemented only a very small subset of the OpenGL API, the bare minimum to get games running, all done by calling into Glide for the actual rendering. As more games started to use OpenGL, 3dfx developed these "miniGL" drivers further to support newer games. Eventually they released a full ICD, which later gained "hidden surface removal" capabilities (accelerating rendering by only drawing visible portions of polygons, but very buggy, leading to missed frames and glitches at more aggressive settings). All of this was binary-only. Meanwhile, Mesa3D had a nice full OpenGL implementation that also ran on top of Glide for the actual rendering (and was even available on MS-DOS, in addition to Linux and Windows). Although a little slower than the official Windows drivers, it got progressively more popular, especially since later versions got faster and covered more of the OpenGL API. Thus Mesa became the better alternative for playing later game releases, thanks to better OpenGL API coverage (Mesa was capable of running Doom 3 at reduced quality, even though the 3dfx hardware lacked the necessary support for some shading) and fewer bugs (especially compared to HSR).

In a way, the life expectancy of Voodoo hardware was greatly extended thanks to open source: 3dfx's own Glide, and Mesa's OpenGL implementation.

Your experience with 3dfx is impressive. I had never owned a Voodoo card. My first video card was a Rage 3D; since then I've been through dozens of ATI cards. They too had some difficulty with OpenGL at first. Realize that it was Nvidia that pushed OpenGL so hard, at a time when they were having problems with D3D, which was what MS was pushing hard for. OpenGL exists today in the state that it does because of Nvidia.

OpenGL exists because that's what engineering workstations use... Games do use it, but it was not the games industry that pushed its initial development. The workstation industry has also held its development back in recent times, because compatibility was more important for them than progress.

OpenGL has its origins at SGI... Nvidia and AMD have both made large contributions to it through the OpenGL ARB group since 1992. Improvements now come through Khronos... which replaced/absorbed the ARB.

It would be interesting to make a time plot of it.
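Using only the numbers quoted in the thread, a quick-and-dirty text "time plot" is easy to throw together:

```python
# Quick ASCII time plot of the lag figures quoted in the thread:
# Mesa major version -> years behind the matching OpenGL spec.
lag = {4: 0.2, 5: 0.3, 6: 0.4, 7: 1.0, 8: 3.5, 9: 3.5, 10: 3.6}

for version, years in sorted(lag.items()):
    bar = "#" * round(years * 10)   # one '#' per ~0.1 year of lag
    print(f"Mesa {version:2d} | {bar} {years}y")
```

The jump between Mesa 7 and Mesa 8 stands out immediately; after that, the gap is nearly flat rather than still growing.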

I am under the impression that OpenGL has had a slow evolution for a while (to quote another post:

Originally Posted by cb88

The workstation industry has also held its development back in recent times due to the fact that compatibility was more important for them than progress.

OpenGL has its origins at SGI... Nvidia and AMD have both made large contributions to it through the OpenGL ARB group since 1992. Improvements now come through Khronos... which replaced/absorbed the ARB.

)
Especially, there was quite a delay before OpenGL 3.x and massive delays before OpenGL 4.x.

I am also under the impression that overall Mesa development *is* speeding up, with more resources poured in recently.

The "increasing gap" effect comes (in my opinion) more from the fact that Mesa used to be an almost-standing-still target for a while, and only ramped up development very recently after much stagnation.

OpenGL 3.0 to 4.1 only took 2 years. That's 6 versions, because it was catching up with all the DirectX 10 and 11 stuff in the hardware all at once. After 4.1, new versions have been coming out at approximately one per year.

For Mesa, 4.0 and 4.1 are going to fall further behind their original release dates, but starting with 4.2 it might actually start catching up. It would have until Feb/March of 2015 to get 4.2 out and hold the current gap, and that seems reasonable. 4.3 support might then take a while after that, but there'd be a whole year to work just on that and avoid falling further behind.
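A back-of-the-envelope check of that Feb/March 2015 deadline: OpenGL 4.2 was released in August 2011, and holding the current ~3.6-year gap means Mesa would need GL 4.2 support roughly 3.6 years later.

```python
# Sanity-check the "Feb/March 2015" deadline using the GL 4.2 spec
# release date (August 2011) and the thread's 3.6-year figure.
from datetime import date, timedelta

GL42_RELEASE = date(2011, 8, 8)   # OpenGL 4.2 spec release
CURRENT_GAP_YEARS = 3.6           # Mesa 10's lag, per the thread

deadline = GL42_RELEASE + timedelta(days=round(CURRENT_GAP_YEARS * 365.25))
print(deadline)  # 2015-03-15
```

Mid-March 2015, so the "Feb/March" estimate holds up.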

That's actually incredibly disheartening. It probably wasn't so much of an issue 5 or 6 years ago, when most gaming on Linux was designed for and optimized against the then-current Mesa release, but we're starting to get more and more big games (Natural Selection 2 is the one that always comes to mind for me) that require at least nominally up-to-date drivers (NS2 requires at least OGL 3.1).

I'm hoping that the faster release cycle, plus the new developers who have come on board in the past year, plus the urging/motivation from Valve and SteamOS, will all hasten a speedier release schedule, and maybe we can see parity within the next year.

Especially, there was quite a delay before OpenGL 3.x and massive delays before OpenGL 4.x.

Delay before 3.x – yes, two years, due to the whole Longs Peak thing. But not at all with 4.0. As a matter of fact, OGL 3.3 and OGL 4.0 were released at the same time, and OGL 3.2 was released only half a year before that. Because of that, any time spent in Mesa working on OGL 4.0 is only going to widen the gap. As smitty mentioned, though, the later OGL versions might be where Mesa can start bridging the gap. The good part about the current situation, with that many OGL specifications out and unsupported, is that parts of those specs can be implemented in parallel; so, much like the automatic 3.1→3.3 jump in Mesa 10, we can hope for more rapid implementation of several spec versions at once. If both OGL 4.1 and OGL 4.2 get implemented at the same time, for example, we get a jump of a year and a half, as opposed to just half a year.
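For what it's worth, the "year and a half vs. half a year" figures check out against the public spec release dates of OpenGL 4.0, 4.1, and 4.2:

```python
# Rough check of the jump sizes using the OpenGL spec release dates:
# 4.0 (March 2010), 4.1 (July 2010), 4.2 (August 2011).
from datetime import date

GL40 = date(2010, 3, 11)
GL41 = date(2010, 7, 26)
GL42 = date(2011, 8, 8)

def years(delta):
    return delta.days / 365.25

print(f"4.0 -> 4.1 alone:   {years(GL41 - GL40):.2f} years")  # ~0.4
print(f"4.0 -> 4.2 at once: {years(GL42 - GL40):.2f} years")  # ~1.4
```

So landing 4.1 and 4.2 together closes roughly 1.4 years of gap in one go, versus about 0.4 for 4.1 alone.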

Yes, but up to this point OpenGL has driven its mad-rush evolution by emulating DX features. It now has everything DX has, so development of new OpenGL versions (at least major versions with huge features like tessellation and compute shaders) should be much slower now.

I am not so sure about that. Like I said, a lot of stuff in the driver stack got changed in the last year, the biggest being the switch to Gallium3D. And not only the switch itself, but also supporting both driver generations at the same time.

So there are now two things that have happened:

1. At some point, new features will only come to the Gallium3D driver, so no more double work. I think that has already happened, for radeon at least?

2. One of the main features of Gallium3D is the abstraction between a small core of drivers and the state trackers, as we saw with the DirectX/Direct3D state trackers. So in theory it should now be possible to implement feature XY in a Mesa state tracker (is this the name for it?) with few or no changes needed in the driver part. So you do the work once instead of multiple times.
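The layering idea in point 2 can be sketched with a toy example. To be clear: this is NOT the real Gallium3D API (which is C); the class and method names below are made up purely to illustrate the concept of a feature implemented once in a shared layer working across every driver that provides the small core interface.

```python
# Toy illustration of the Gallium3D layering idea -- not the real API.
class PipeDriver:
    """Minimal hardware-facing core a driver must implement."""
    def draw(self, triangles):
        raise NotImplementedError

class RadeonToy(PipeDriver):
    def draw(self, triangles):
        return f"radeon drew {triangles} triangles"

class NouveauToy(PipeDriver):
    def draw(self, triangles):
        return f"nouveau drew {triangles} triangles"

class GLStateTracker:
    """API-facing layer: new GL-level features are implemented here, once."""
    def __init__(self, driver):
        self.driver = driver

    def draw_instanced(self, triangles, instances):
        # Built once on top of the core interface; every driver
        # below automatically gets the feature.
        return [self.driver.draw(triangles) for _ in range(instances)]

for drv in (RadeonToy(), NouveauToy()):
    print(GLStateTracker(drv).draw_instanced(100, 2))
```

The point being: `draw_instanced` was written once, yet works with both toy drivers, which is exactly the "do the work one time instead of multiple times" argument above.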

The driver devs I've heard from said they did something like a Windows 98 to Windows 7 driver architecture update within a few years, and during such a change, infrastructure work will of course take a lot of resources.