NOTE: This is a continuation of a thread about problems with the glGenVertexArrays() function, but apparently the real problem is that my program is not getting a recent OpenGL context since I upgraded my GPU from a GTX285 to a GTX680 and my nvidia driver from <who-knows> to <v310.19>. I started this new thread because the old title was very misleading.

-----

Yes, I am almost sure the context is the problem. Further down you'll see that various output my program prints implies the program only has an OpenGL v2.1.2 context. OpenGL v2.1 didn't support VAOs yet, hence the behavior.

The program was running with OpenGL v3.30 context before I installed the latest nvidia driver on my 64-bit ubuntu v12.04 computer (nvidia driver 310.19, which is still shown as current as of today). After I installed the new driver, the program stopped working and the first error generated by my program was the VAO issue.

So the question probably is, why is my program not getting an OpenGL v4.30 context?

Since we seem to have narrowed down the problem to the OpenGL context and not the VAO functions, I created this new, appropriately named thread to continue the conversation.

I have always had problems initializing the combination of xlib, GLX, GLEW, OpenGL. I have never been able to sufficiently grasp everything necessary about xlib, GLX, GLEW and OpenGL to fully understand what is necessary, much less appropriate.

To be clear, I suspect I only need to handle the new OpenGL v4.30 context. I assume OpenGL v3.30 programs will also run on this newer context. If not, please say so.

I guess the best way to attempt to resolve this is to explain exactly what I am doing now. Here goes:

#1: Before anything significant happens in my program, it executes the ig_graphics_initialize() function, which only calls old-style GLX functions and creates, then destroys a window.

#2: Later, after everything is initialized, the application creates a default window with the newer GLX functions (which should yield the most recent context). It was creating an OpenGL v3.30 context before I updated my nvidia driver to v310.19 (64-bit version).

Next I'll be more specific and show the code in the ig_graphics_initialize() function, and then the ig_window_create() function. I suspect that something "minor" changed that obsoletes my old approach, and now requires something be done in a different order, or with different functions or arguments. But what do I know? Not much, obviously.

Please use [code]...[/code] or [highlight=cpp]...[/highlight] tags to mark code blocks, especially long ones. Keeps the indentation, uses pretty-printing, and doesn't take up so much space. Fixed that for you.

Glanced over your code, and it looks like you've got some unneeded code in there. I'd simplify it down a bit. Also, there are some things that surprise me, such as passing the GLX_CONTEXT_MAJOR/MINOR_VERSION_ARB symbols to glXChooseFBConfig() rather than glXCreateContextAttribsARB(). Are you sure this is valid?

Anyway, here's a stand-alone, working, test program that creates a new-style context via GLX. Specifically, it requests a 4.3 compatibility context -- though you can tweak that. Might give you some ideas. Give it a shot:

Thanks for the comments and code, and for fixing up my code.

Yes, I do have a lot of debug code to query various values and sometimes print them out. That was to help me find problems in the code by reference to strange looking values. In fact, in this case I noticed that it seems to report my OpenGL version as 2.1.2 instead of something newer (like v3.20 or v4.30).

Yes, I just noticed GLX_CONTEXT_MAJOR_VERSION_ARB and GLX_CONTEXT_MINOR_VERSION_ARB there too. I assume they were allowed in earlier versions of GLX --- or I simply screwed up somewhere along the line. But that code was working until I installed my new driver, though it might have been failing on those lines and then finding one of the later lines without those constants. Not sure.

In the new code I've been trying to get working for the last day or so, I put those constants where they belong. Unfortunately, now glXCreateContextAttribsARB() is blowing up. My application calls that function, but the call never returns --- the process just terminates somehow, and no dialog appears saying "segmentation violation" or anything, which is odd.

When I try creating the context with the older functions, then the context is created, but I'm back in the situation that I was in before... glGenVertexArrays() always returns zero for VAO identifiers (which is invalid, of course). I assume this happens because I have an old OpenGL context (probably v2.10) which doesn't know what VAOs are.

I tried to compile your code, but... I don't have GLU on my computer. I avoid every library that I possibly can (though admittedly I gave in and adopted GLEW a while back to save myself a LOT of pointless work and hassle).

So I hesitate to download and install GLU, because then if that works, I still don't know what is my problem. I'm trying to make my program work without GLU but with GLEW... that's my problem, and that's what I need to get to work. I am very appreciative of the time and effort you put into helping me, but I've been avoiding GLU for years, and wish to continue without GLU (but with GLEW). Not sure what to do now.

The approach I was taking was working before, but I didn't have glXCreateContextAttribsARB() in my code --- I was calling glXCreateNewContext() instead. That was giving me an OpenGL v3.20 or v3.30 context, which is adequate for the current state of my code, though the reason I bought the GTX680 and installed the nvidia v310.19 drivers was to update my engine to v4.30 capabilities. My code ran on the GTX680 before I installed the new drivers, so my working theory has been that something changed in the new drivers. However, the nvidia website says all the old stuff should still work, and I haven't noticed anyone else having the problem I am having.

First, address the presence of GLX_CONTEXT_MAJOR_VERSION_ARB and GLX_CONTEXT_MINOR_VERSION_ARB in the glXChooseFBConfig() args, which I believe is invalid.

Then address the missing glXCreateContextAttribsARB() call in your code. Check the man page links I've provided in this paragraph for details. Note that, according to the GLX_ARB_create_context extension spec, your call to glXCreateNewContext() will only create a context with version <= 2.1.

An update. I decided to take your approach and create a fairly stripped down version of my ig_graphics_initialize() and ig_window_create() functions. I don't know why, but after the initialization code created a context via the older functions and then called glewInit(), queries indicated that the code created an OpenGL v4.30 context.

Also, when the ig_window_create() function executed, the glXCreateContextAttribsARB() function no longer failed.

All I can guess is that I have some bit set or value specified in my non-stripped-down functions that cause the problem. I have no idea what that might be, but I suppose I can find out by disabling bits and values one by one until it produces an OpenGL v4.30 context and the functions don't terminate the application.

For whatever it is worth, and in case someone else has similar problems, I will paste the program that works correctly below:

I'm just now working through the ig_graphics_initialize() function in my main application, making it more exactly like the sample. It is only slightly different from the sample now, but still only gets a v2.1.2 context versus v4.3.0 context in the sample (according to the printout of values). So there has to be something different. I'm beginning to wonder whether somehow the two cases are accidentally including a different header file, or in a different order. Or something even more mysterious. Once I find out, I'll add the information to this record.

Ah, just shoot me! I hate when this kind of nonsense happens. Maybe/Probably my fault.

In my IDE, I had the directory "/usr/lib" specified as a linker search path... probably from before they switched over to /usr/lib/i386-linux-gnu and /usr/lib/x86_64-linux-gnu a year or so ago. I had both specified, but "/usr/lib" was first in the list (which might matter).

So presumably what was happening was that one or another of the libraries my 3D engine was linking to was an older version in the "/usr/lib" directory, while a newer version exists in the "/usr/lib/i386-linux-gnu" and "/usr/lib/x86_64-linux-gnu" directories.

It also seems to me like the codeblocks IDE (or perhaps the GNU tools/linker) has gotten "smarter" (maybe too smart) about what directories to search for shared libraries. I forgot to put any linker library names or search directories in my test program, yet somehow it linked to xlib and GL (neither of which is that standard a library, nothing like the C standard libraries or something).

Anyhow, that was the problem all along. Gads, can't believe how much BS I went through where no problems in my programs existed.

Probably the new nvidia drivers put their shared libraries in the /usr/lib/i386-linux-gnu and /usr/lib/x86_64-linux-gnu directories, while older versions of the shared libraries got put directly in /usr/lib (or something along those lines). Otherwise, who freaking knows.

Nice to have it solved, but I hate when I waste a day or three on something as stupid as this. Sigh.