
OpenGL is not DirectX, meaning OpenGL 4 is not better or faster or cooler or prettier than OpenGL 2; OpenGL 4 only provides extensions and improvements to support new techniques present in more recent hardware, like tessellation for example.

You're wrong all over.
OpenGL 4 _is_ way better in pretty much every sense: it not only provides significantly more core GL & GLSL functionality than GL 2.1, it's also cleaner, deprecates old stuff, and provides new extensions. The difference between GL 4.3 & 4.2 is pretty big; between 4.3 & 2.1 it's simply astounding. It's also (much) easier to program for GL 4.x because of the much higher lowest common denominator.

And of course you get better performance with GL 4 than 2 because of the new ways you can do stuff (instanced drawing, compute shaders, new pipeline stages, etc.), and you do it in a clean way, not having to "hope" that a given extension exists.

Comment


jrch2k8 is right. All GL4 hardware exposes those extensions for GL2 too.

From a dev perspective, writing for pure GL4 is worse, because it requires more boilerplate and has worse syntax. It's much better to do GL2 + extensions.

The hardware requirements box can still say "requires GL4" if you don't want to run without some GL4 feature. The end result is the same in that case, no matter whether you use GL2 + some GL4 extension or pure GL4.

Comment


Remind me not to take your posts seriously ever again. "Worse syntax", haha, it's amazing. I've done 3.3, and compared to 2.1 it's got way better and cleaner syntax; to me, getting rid of long words like "attribute" and "varying" alone is worth the transition, not to mention all the goodies.

I still hear people ranting about how X.org is good and "no need to fix it", so it's no wonder that there are also people with their heads up their asses claiming GL 2 is better than or equal to GL 4.

And yeah, by their logic GLES 3 is not better than GLES 2.

Comment


Nope, you're misinterpreting the point, and syntax is a very subjective issue: some love it and some hate it, like some love C and hate C++, etc., so I won't enter a religious war about syntax.

Back to the point. I'll try to be clearer this time: OpenGL is a collection of extensions per se, not a unique release like DirectX, which needs you to link to each revision's library [libGL vs. d3d9_xx.dll, d3d10_xx.dll, d3d11_xx.dll], among other things, to support various generations of hardware <-- I think up to here we both agree.

Now, OpenGL versions are more like this: "hey, we got X number of new functions that require class-X hardware, among a superset of hardware-agnostic features, so let's name it profile X.y". Thanks to this, you can mix all these new and previous extensions, or just marry a specific revision, or only support a subset of revisions, etc.
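To pick a code path per revision at runtime, a program first has to know which version the context actually gives it. A minimal sketch in C, assuming the usual "major.minor[.release] [vendor info]" shape of the version string; in a real program the string would come from glGetString(GL_VERSION), and the sample string in the comment is made up:

```c
#include <stdio.h>

/* Parse a GL_VERSION-style string ("major.minor[.release] [vendor info]")
 * into its major/minor numbers. Returns 1 on success, 0 on failure.
 * Example input: "4.3.12458 Compatibility Profile Context" -> 4, 3 */
static int parse_gl_version(const char *s, int *major, int *minor)
{
    return s != NULL && sscanf(s, "%d.%d", major, minor) == 2;
}
```

With the numbers in hand you can gate whole feature sets at startup, e.g. enable a GL3-only path only when the reported major version is at least 3, instead of probing every entry point individually.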

About performance, what I said is still true. Sure, if I try to emulate tessellation in shaders via GL 2/3.y instead of using the tessellation silicon present in DX11-class hardware through the respective 4.y profile, it's going to be a hell of a lot slower, but this is not an OpenGL goodie or pitfall; it's a hardware-dependent issue [DX11 hardware is faster than DX10 hardware, the same as DX9 hardware is slower than both]. And the same is true for shading code, since many of those cool functions introduced in GL 4.3, like shader storage objects or compute shaders, require silicon that is only present in DX11/11.1 hardware [some can be partially emulated on previous generations, but will most likely underperform beyond utility].

Now you could say, "but I tried 2.1 vs. 4.3 on the same GPU and 4.3 is still faster!!!", and that makes sense, but not because OpenGL 2 is slower than 4.3; this phenomenon is mostly because the new features are heavily optimized for the current silicon, while the older ones [especially those whose functionality was superseded by extensions from later releases] are emulated and hence much slower <-- again, a hardware-dependent issue.

Now, the milestone of reaching GL 2.1 for this specific case [KWin] is very, very good, because GL 1.x was mostly fixed-function only and really annoying to make GPU-efficient or flexible, but GL 2.1 is widely used and introduced the use of shaders [properly, as used today] <-- my previous point.

Now, the wonderful thing about OpenGL is that at runtime I can detect the hardware and properly decide which shaders to compile, or which fixed functions improve performance or readiness for the installed GPU <--- curaga's point. So you can efficiently support many GPU generations, properly exploiting every generation's strengths without penalty [for example, use FBO/PBO on my 4850 and pbuffers on my GFX 5600, or use SSBOs on my AMD 7700].
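That detect-then-branch approach usually starts with an extension-string check. On GL 2.x contexts, glGetString(GL_EXTENSIONS) returns one space-separated string, and a naive strstr() there is a classic bug, because one extension name can be a prefix of another. A sketch in C; the helper name is ours, only the token-matching logic is the point:

```c
#include <string.h>

/* Check whether `name` appears as a full space-delimited token in `exts`
 * (the format glGetString(GL_EXTENSIONS) returns on GL 2.x contexts).
 * A plain strstr() would wrongly match prefixes, e.g. find
 * "GL_ARB_framebuffer" inside "GL_ARB_framebuffer_object". */
static int has_extension(const char *exts, const char *name)
{
    size_t len = strlen(name);
    const char *p = exts;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == exts) || (p[-1] == ' ');
        int ends   = (p[len] == '\0') || (p[len] == ' ');
        if (starts && ends)
            return 1;       /* found as a whole token */
        p += len;           /* partial match, keep scanning */
    }
    return 0;
}
```

An app can then branch once at startup: render-to-texture via FBOs when GL_EXT_framebuffer_object is present, falling back to pbuffers otherwise, which is exactly the 4850-vs-GFX 5600 split described above.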

Now, it would be stupid to develop an application like KWin that only supports the GL 4.3 profile and rejects everything else, LOL; you'd need DX11-class hardware for that [1% or less of KDE users], so that's not going to happen. What you can do is contribute GL 4.3 profile code to KWin so it gets faster on your DX11 hardware but still works fine on my DX10 hardware.

Comment

GL 4 is (much) better than GL 2. Period. The fact that it is better to target GL 2 is not because GL 2 is better (it's much worse) but because its market share is much larger.

Don't confuse market share with the quality of the technology itself; everything starts with zero market share. And don't give me the "anything is subjective" semantic bullshit.

If GL 2 is better than or equal to the newer GL versions, then AMD, Intel and Nvidia (and other) devs are idiots, because they're putting a lot of work into supporting newer versions. Tell them about this, since you're such a smartass.

Comment

Remind me not to take your posts seriously ever again. "Worse syntax", haha, it's amazing. I've done 3.3, and compared to 2.1 it's got way better and cleaner syntax; to me, getting rid of long words like "attribute" and "varying" alone is worth the transition, not to mention all the goodies.

And it introduced many more. What do you call the word monster that is "layout(location = 4)"? Amazing?

But indeed, syntax is a subjective issue.
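For anyone following the syntax argument from outside, here is an illustrative side-by-side of the same vertex-shader interface declared both ways (a sketch only; the variable names are made up):

```glsl
// GLSL 1.20 (GL 2.1): storage qualifiers spelled out, attribute
// locations bound from the C side via glBindAttribLocation
attribute vec3 position;
varying   vec3 color;

// GLSL 3.30+ (GL 3.3 / 4.x): generic in/out keywords, with the
// location chosen explicitly in the shader itself
layout(location = 0) in vec3 position;
out vec3 color;
```

Whether the explicit layout qualifier is cleaner than an extra glBindAttribLocation call on the C side is precisely the subjective part both posters concede.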

I still hear people ranting about how X.org is good and "no need to fix it" so it's no wonder that there also are people with their heads up their asses claiming GL 2 is better or equal to GL 4.

And yeah, by their logic GLES 3 is not better than GLES 2.

You're not considering their points, are you? Or mine, for that matter.

Let me rephrase: if I can get better syntax and the same features by using GL2 + extension foo, why should I not do so?

Comment

GL 4 is (much) better than GL 2. Period.
Don't confuse market share with the quality of the technology itself, anything starts with a zero market share, and don't give the "anything is subjective" semantic bullshit.

If GL 2 is better than or equal to newer versions, then AMD, Intel and Nvidia (and other) devs are idiots, because they're putting a lot of work into supporting newer versions.

1.) I won't even waste my time here, since you obviously don't understand shit about GL.

2.) GL 2.x is a superset of functions and extensions for DX9-class hardware, forward compatible with DX10/11-class hardware.
GL 3.x is a superset of functions and extensions for DX10-class hardware, forward compatible with DX11-class hardware.
GL 4.x is a superset of functions and extensions for DX11-class hardware.

I don't know how your warped brain twisted the info into this, OMG. Either you're trolling me, or you somehow assume each GL release is a new language, that every GPU in mankind's history can do exactly the same things, and that you need a GL X.y language to do cool 3D "thingies" that is somehow better than GL Y.x because that's an older language and hence does fewer cool 3D "thingies".

Whatever makes you happy...

Comment


No, and as long as Wayland itself is not actually working, there is no point in targeting it. KWin supports OpenGL ES 2, which is the most important prerequisite for Wayland support. Now it's the Wayland devs' turn to get Wayland out of pre-alpha quality.