It appears that your OpenGL implementation doesn't support glXCreateContextAttribsARB(), which is what FreeGLUT uses if you request a specific version or profile.

FreeGLUT doesn't actually check for the extension when using GLX (although it does for WGL). It checks that the function pointer is non-NULL, but that only tests whether the client library provides the function, not whether the server supports the operation.

Originally Posted by LastHorizon

Does anybody know what is causing the error? Or indeed what BadRequest, Major Opcode 153 (GLX) actually refers to?

Major opcodes are dynamically assigned to X extensions. In this case, it tells you that 153 is assigned to GLX. Minor opcodes are determined by the extension and indicate the actual operation. For GLX, 34 is X_GLXCreateContextAttribsARB (you can find the symbolic names in GL/glxproto.h).

In short, you'll need to add an option to control whether to use glutInitContextVersion() and glutInitContextProfile(). Otherwise it will fail in this manner if the X server doesn't support the extension.
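One way to wire up such an option might look like the sketch below. The `--core` flag and the 3.3 version number are illustrative choices, not part of any GLUT API; only when the flag is given does the program request a specific version/profile, so on servers without the extension the plain path still works.

```c
#include <string.h>
#include <GL/freeglut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    /* Hypothetical "--core" flag: only request a specific version/profile
       when explicitly asked to. Without the request, freeglut creates the
       context through the legacy GLX path, avoiding the BadRequest error
       on servers lacking GLX_ARB_create_context. */
    int use_core = (argc > 1 && strcmp(argv[1], "--core") == 0);

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    if (use_core) {
        glutInitContextVersion(3, 3);
        glutInitContextProfile(GLUT_CORE_PROFILE);
    }
    glutCreateWindow("demo");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```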

I tried forcing GLUT to initialise using `glutInitContextProfile(GLUT_COMPATIBILITY_PROFILE);` and I got the same error as before. I think I am misunderstanding your answer. What do I need to do when initialising GLUT to avoid this error? Apologies for my lack of understanding. I suppose the other way to avoid this error is to not use GLUT at all, but that does feel like re-inventing the wheel...

Does the paid-for version of GLUT offer more support in checking what is compatible and what isn't?

************EDIT**************

It appears that OpenGL 2.1 doesn't support glutInitContextVersion() and glutInitContextProfile(). What commands do I need to use instead that are compatible with OpenGL 2.1?

Originally Posted by LastHorizon

I tried forcing GLUT to initialise using `glutInitContextProfile(GLUT_COMPATIBILITY_PROFILE);` and I got the same error as before. I think I am misunderstanding your answer. What do I need to do when initialising GLUT to avoid this error?

Don't call glutInitContextProfile(). You'll just have to accept whichever version and profile you get by default.

Originally Posted by LastHorizon

Does the paid-for version of GLUT offer more support in checking what is compatible and what isn't?

There is no paid-for version of GLUT.

The "Free" in "FreeGLUT" refers to the fact that its licence is less restrictive than that of the original GLUT library (which didn't permit distribution of modified versions).

These are the main differences in GLSL 1.20, the shading language version that goes with OpenGL 2.1:

- Doesn't support layout qualifiers. You need to either set the attribute location in the client code with glBindAttribLocation() or allow the implementation to choose locations and query them with glGetAttribLocation().
- Uses "attribute" for vertex shader inputs rather than "in".
- Uses "varying" for variables which communicate data between the vertex shader and fragment shader, rather than using "out" in the vertex shader and "in" in the fragment shader.
- Doesn't support user-defined fragment shader outputs. For a single colour buffer use gl_FragColor; for multiple colour buffers use gl_FragData[n].

Also, it doesn't support the overloaded texture() function; you need to use a variant whose name depends upon the sampler type, e.g. texture2D() for sampler2D.
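Putting those differences together, a minimal GLSL 1.20-style shader pair might look like this (the attribute, varying, and uniform names are illustrative):

```glsl
// Vertex shader (GLSL 1.20 / OpenGL 2.1)
#version 120
attribute vec3 position;     // "attribute", not "in"
attribute vec2 texcoord;
varying vec2 v_texcoord;     // "varying", not "out"
uniform mat4 mvp;

void main()
{
    v_texcoord = texcoord;
    gl_Position = mvp * vec4(position, 1.0);
}

// Fragment shader (GLSL 1.20 / OpenGL 2.1)
#version 120
varying vec2 v_texcoord;     // "varying", not "in"
uniform sampler2D tex;

void main()
{
    // No user-defined outputs: write to gl_FragColor,
    // and use texture2D() rather than the overloaded texture().
    gl_FragColor = texture2D(tex, v_texcoord);
}
```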

I haven't managed to get it to load a texture onto all faces of my cube yet; I guess that's a question of finding the correct texture-loading command. At the moment, if I use texture2D() it loads a single 2D texture with the incorrect colouring. It should be a crate, but instead I get a brown square.

I'm curious as to whether the co-ordinate system has changed between 2.1 and 4.X. When I ran my code previously, my cube rotated around the centre of the object; now it rotates around the bottom vertex of the object.

The colour gradient is also a lot coarser than before, but I feel that is more to do with technology limitations than anything else. However, if I'm barking up the wrong tree, please let me know!

Originally Posted by LastHorizon

I haven't managed to get it to load a texture onto all faces of my cube yet; I guess that's a question of finding the correct texture-loading command. At the moment, if I use texture2D() it loads a single 2D texture with the incorrect colouring. It should be a crate, but instead I get a brown square.

Are you using a 2D texture for each face, or a cube map?

A cube map is created by calling glBindTexture(GL_TEXTURE_CUBE_MAP) then glTexImage2D(GL_TEXTURE_CUBE_MAP_*) for each of the six faces, and accessed in the shader via a samplerCube uniform and the textureCube() function (which takes the coordinates via a vec3).
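That sequence can be sketched as a small helper. This assumes a current OpenGL context and that the caller already has six square RGBA images in memory; the function name is illustrative, not any library's API.

```c
#include <GL/gl.h>

/* Sketch: create and fill a cube map texture. Assumes a current GL
   context, and that face_pixels[i] points to a size*size RGBA image
   for face i. The six face targets are consecutive enum values
   starting at GL_TEXTURE_CUBE_MAP_POSITIVE_X. */
GLuint create_cube_map(int size, const unsigned char *face_pixels[6])
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_CUBE_MAP, tex);
    for (int i = 0; i < 6; i++) {
        glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, GL_RGBA,
                     size, size, 0, GL_RGBA, GL_UNSIGNED_BYTE,
                     face_pixels[i]);
    }
    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}
```

In the fragment shader, declare `uniform samplerCube cubemap;` and sample it with `textureCube(cubemap, dir)`, where `dir` is a vec3 direction from the cube's centre.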

Originally Posted by LastHorizon

I'm curious as to whether the co-ordinate system has changed between 2.1 and 4.X. When I ran my code previously, my cube rotated around the centre of the object; now it rotates around the bottom vertex of the object.

When using shaders, coordinate systems are entirely up to the program. Vertex attributes will be passed to the vertex shader as-is.