I am trying to specify some pixel transfer options for a simple 2D texture. The code below does not apply the transfer settings to the texture data. When I substitute the texture calls with glDrawPixels, everything works, so I am inclined to believe the problem is in the glTexSubImage2D pixel path. Here is the code:

glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512, GL_LUMINANCE, GL_FLOAT, array);

No GL errors are generated. This works fine with the most recent driver for the Radeon 9800 and with the Mesa DRI drivers, but it does not work with my older Radeon 9800 driver, which happens to be the one I need to use. Am I doing something wrong, or is this just a driver problem?

glDrawPixels(0, 0, GL_LUMINANCE, GL_FLOAT, NULL);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512, GL_LUMINANCE, GL_FLOAT, array);

The only change I made was to add the glDrawPixels call, giving it no pixels to draw. I assume this forces the driver to flush some internal state. It feels like a total hack, so if anyone has a proper solution, please let me know.

tamlin

07-31-2006, 11:12 AM

ATI's drivers have many, many bugs. Most of the ones I've experienced are "only" in user mode, which is why you "only" get a SEGV (and how much does it suck to single-step that code in assembly, find the bug, and not have the source code available to really fix it!), but there are bugs in kernel mode they haven't fixed for years - which makes me wonder whether they even have access to a single logic analyzer, whether they care, or whether they have any debugging facilities for their cores at all.

Please note I'm not writing this as a troll, even though it probably looks like one. Should ATI really, seriously, have any interest in finding and fixing the bugs only _I_ have found (not in their shiniest new toys, but in chips ATI hasn't abandoned yet), feel free to contact me - 'cause all my attempts to contact you have been fruitless.