Are you serious? What is missing compared to SM4? You need to ask someone what that is, when you could just list the feature sets side by side and cross off the ones GL 3.0 doesn't have?

With people like this on the ARB, it's no wonder GL "3.0" turned out this way.

08-11-2008, 04:49 PM

Brolingstanz

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

In particular I had geometry shaders in mind when I wrote that, Rob.

08-11-2008, 04:57 PM

Rob Barris

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

In reading the direct state access extension, it seems to me that the majority of calls simply act as if there was a private binding point that does not affect drawing state - they look up the object and they set the state, without affecting drawing state or binding points.

As with any paper spec, seeing an implementation running can provide a good existence proof of its efficacy. If one were to become available soon, that would be an important signal that it's doable and low risk.

(that phrasing "The EXT_direct_state_access driver is +2.5% larger." could lead one to believe that it already exists in some form).

Funniest thing I've seen all day.
Every function that relies on a client selector gets a new entry point.
And to think, they didn't want to change the API.

08-11-2008, 05:56 PM

MZ

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

Hello Slashdot!

*waves hand*

08-11-2008, 05:57 PM

Rob Barris

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

The approach in the DSA extension doesn't invalidate any existing code.

The approach in Longs Peak invalidated all of it.

08-11-2008, 06:05 PM

Groovounet

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

I first had a look at GLSL 1.30 and thought: "Great, the SM4 features are there too!" Then I got to the texture functions and thought: "What the heck is going on?!"

There are no words to say how disappointed I am by this OpenGL 2.2 specification. Actually, as an OpenGL 2.2 specification it would have been great; as the OpenGL 3.0 specification, it's just a shame. You made us wait two years with so many great concepts for OpenGL 3, only to end up with this? Who did this? Was it for marketing reasons? I can't see any reason for it; it doesn't make any sense.

Even if OpenGL 3 hadn't matched D3D10 feature-for-feature, I'm sure we would have waited for it and started with what we got.

In some ways I expect nVidia to create its own API, but I just hope they are not behind this mess... I'd like to think that's not the case, but some will pay for this, for sure.

Anyway, I'm fed up with OpenGL now; it's time to move on to something new, I guess.

The greatest things, and that's what we're talking about here, have always been done by people... maybe someone needs to listen to them.

Congratulations!

08-11-2008, 06:13 PM

Khronos_webmaster

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

We have resolved our bottleneck and should be good to go with the rest of slashdot.

08-11-2008, 06:16 PM

Korval

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

Quote:

The approach in the DSA extension doesn't invalidate any existing code.

The approach in Longs Peak invalidated all of it.

Yes, but Longs Peak would have been implemented by everyone. This extension will not be implemented by ATi or Intel until it becomes core.

08-11-2008, 06:30 PM

skynet

Re: The ARB announced OpenGL 3.0 and GLSL 1.30 tod

Quote:

The approach in the DSA extension doesn't invalidate any existing code.

The approach in Longs Peak invalidated all of it.

And this is exactly the wrong way to think about it. If I had a 10-year-old application, of course I'd want it to keep working. That's what the old opengl2.x DLL would have been for: install the new GL driver, and it would still work. NOTHING would be invalidated.

Now, if I decided to switch to GL3, I would NEVER expect to just link against opengl3.lib without any work. I'd know that I'd probably have to rewrite the whole thing, because that old application is not _architected_ to meet the needs of a modern gfx card. What's needed is a clean redesign: I'd have to throw out display lists and switch to a VBO-based design, throw out the fixed-function stuff and write shaders instead, stop using that SELECTION/FEEDBACK stuff... etc. Even those mighty CAD companies WILL have to rewrite their stuff some day. And I _bet_ they'll use DX that day.
It would have been more honest to draw a line and introduce an all-new API instead of what we got.