Shader/glDrawArrays crash only on nVidia/Win32

I'm the author of dim3 (www.klinksoftware.com), an open source, cross-platform freeware 3D engine. It has two components: the engine and the editor. The editor is all fixed-function, and it works everywhere. The engine is all shader code. It works on iOS (ES2), on OS X with Intel, nVidia, or ATI (AMD) GPUs, and on the PC with ATI (AMD) or Intel GPUs.

On PC nVidia, it always crashes on the first shader call (at a glDrawArrays), with pretty much any shader, no matter how simple.

Sadly, I don't have this configuration myself, and sending debug code to my users isn't getting me anywhere. Has anybody encountered this? Is there a way to get further debug information? I'm sure it's something simple the PC nVidia drivers are doing differently; maybe they require things to be set in a certain order?

Further notes: no built-in variables are used; all vertices, UVs, matrices, etc., are passed in by vertex arrays or uniforms. Everything uses VBOs. The shaders compile correctly, and all the locations have proper integers.

You can find the latest PC build at the URL above. It should crash right away if you have a PC/nVidia setup.

As always, the minute I post this, I think I've found the problem, but I'll need to verify with my users, hopefully by tomorrow. It has to do with glEnableVertexAttribArray and glVertexAttribPointer.

I've got some enables leaking into shaders where no glVertexAttribPointer call is made -- because, as always, I was getting a bit too aggressive with the optimizations. But here's the interesting part: the IDs aren't hooked up to any attributes in the shader code. It works everywhere except PC/nVidia. The drivers must be doing some kind of pre-flight check on every enabled array, and that's causing the access violation.

I got away with this for a long time, until I ran into a user with that setup.

For instance, I might have attributes A, B, and C all enabled, but only A and B have offsets into the VBO set, and only A and B are used (or referenced at all) in the shader.

Is what the driver is doing right or wrong? It's certainly checking data it will never use, but then again, I shouldn't be enabling arrays without setting a pointer to them. I'll have more later, once I know if this is the real reason.

Confirmed: on nVidia PC drivers only (not OS X), if you enable a vertex attribute array that doesn't exist in the shader, it'll crash with an access violation when you attempt to draw with that shader. Other drivers ignore this, as it's really a no-op.

Thanks for the "reference". A few years ago, Alfonse said it was nonsense when I said that all unused attributes have to be disabled to prevent an application crash. I've been using NV hardware for years, and this behavior is quite natural to me.