
GL_BYTE vertex attribute giving wrong data

I have a debug text rendering setup which feeds char data to a vertex/geometry/fragment shader.

On my NVIDIA GTX 460, feeding the char* data using this as the attribute description:

Code :

glVertexAttribIPointer(TEXT_LOCATION, 1, GL_BYTE, 1, 0);

and having the shader input as

Code :

layout(location = TEXT_LOCATION) in int inCharacter;

works as expected, but on my ATI card the contents of inCharacter seem to be constantly 0.
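
For reference, the whole byte-attribute setup looks roughly like this (just a sketch; textVAO, textVBO, text and charCount are placeholder names, not my actual code):

Code :

// Upload the raw char data and describe it as one signed byte per vertex,
// exposed to the shader as an integer attribute via glVertexAttribIPointer.
glBindVertexArray(textVAO);
glBindBuffer(GL_ARRAY_BUFFER, textVBO);
glBufferData(GL_ARRAY_BUFFER, charCount, text, GL_DYNAMIC_DRAW);
glEnableVertexAttribArray(TEXT_LOCATION);
glVertexAttribIPointer(TEXT_LOCATION, 1, GL_BYTE, 1, 0);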

As a test I changed the vertex attribute to

Code :

glVertexAttribIPointer(TEXT_LOCATION, 1, GL_INT, 4, 0);

and manually converted the const char* data to const int* data each time I update the VBO, and it works as expected.
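
The conversion step is basically this (sketch only; scratch, text, textVBO and charCount are placeholder names):

Code :

#include <vector>

// Widen each character to a full GLint before uploading, so the attribute
// can be described with GL_INT instead of GL_BYTE.
std::vector<GLint> scratch(charCount);
for (size_t i = 0; i < charCount; ++i)
    scratch[i] = static_cast<GLint>(text[i]);
glBindBuffer(GL_ARRAY_BUFFER, textVBO);
glBufferData(GL_ARRAY_BUFFER, scratch.size() * sizeof(GLint), scratch.data(), GL_DYNAMIC_DRAW);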

Is it OK to feed a vertex shader data in the way I originally intended?
Is there likely to be some small detail I have missed that would make it not give the correct data on an ATI card?
This is with GL+GLSL 4.2 on both machines; the machine with the problem has an AMD A10 APU.

I'm not sure, but according to the spec, GL_BYTE/GL_UNSIGNED_BYTE are valid types for functions like glVertexAttribPointer and glDrawElements. But lately I've encountered a very similar issue: they don't work with AMD cards. I was trying to fill an ELEMENT_ARRAY buffer (for a simple quad of 2 triangles) with GL_UNSIGNED_BYTE indices and draw it with glDrawElements, and it failed with no error. Maybe there's a special trick to make it work, or maybe AMD actually follows the spec in this case and I was doing something wrong... I didn't dig into it much because I didn't NEED to use bytes.
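
What I tried was roughly this (sketch; ibo is a placeholder name for the index buffer):

Code :

// Quad of 2 triangles using 8-bit indices.
const GLubyte indices[] = { 0, 1, 2, 2, 3, 0 };
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, 0);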