You probably could build your own, scaling down the width and height but not the depth. I'll probably try that whenever I get the chance, since I really don't want to handle interpolating between 6-8 textures. (All I have to do, though, is find an algorithm to downscale an image... )

Edit: never mind, you can't do that; you have to reduce the r dimension as well. So, TomorrowPlusX, how did you do it? Meanwhile, I'll look up ways to get massive multitexture interpolation implemented...
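For the "algorithm to downscale an image" part, a plain 2x2 box filter is the usual quick answer. A minimal sketch, assuming RGBA8 data with even width and height (the helper name and layout are mine, not from this thread):

```c
#include <stdint.h>
#include <stddef.h>

/* Halve an RGBA8 image in both width and height with a 2x2 box filter.
 * Assumes src is w*h*4 bytes, w and h are even, and dst can hold
 * (w/2)*(h/2)*4 bytes. Hypothetical helper, just one way to downscale. */
static void halve_rgba8(const uint8_t *src, size_t w, size_t h, uint8_t *dst)
{
    size_t dw = w / 2, dh = h / 2;
    for (size_t y = 0; y < dh; y++) {
        for (size_t x = 0; x < dw; x++) {
            for (int c = 0; c < 4; c++) {
                /* Average the 2x2 block of source texels per channel. */
                unsigned sum =
                    src[((2*y    ) * w + 2*x    ) * 4 + c] +
                    src[((2*y    ) * w + 2*x + 1) * 4 + c] +
                    src[((2*y + 1) * w + 2*x    ) * 4 + c] +
                    src[((2*y + 1) * w + 2*x + 1) * 4 + c];
                dst[(y * dw + x) * 4 + c] = (uint8_t)(sum / 4);
            }
        }
    }
}
```

Run it once per level until you reach the size you want; for a 3D texture you'd have to do the same kind of averaging along r too.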

Jones Wrote:And it would seem that multitexturing is an extension, whereas 3D textures have been in the core since 1.3 (or was it .4? ).

Whether it's an extension or not is completely irrelevant. What matters is hardware support. Multitexturing is supported by everything Mac OS X supports; 3D textures are not.

In any case, everything we've discussed was once an extension --
ARB_multitexture and ARB_texture_env_combine, promoted to core in 1.3
EXT_texture_3D, promoted to core in 1.2
ARB_shading_language_100 & friends, promoted to core in 2.0

I suggest declaring a static (or global) flag so that you only call glGetError once in your drawing function: start the flag at FALSE, only call glGetError while it's FALSE, and set it to TRUE once you've checked. Then, move the call around until you find the exact place (or places) where the error occurs.
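Something like this, as a sketch; glGetError is stubbed with a counter here so it compiles without a GL context (in your drawing code you'd call the real one):

```c
#include <stdio.h>

/* Stand-in for glGetError() so this sketch builds standalone. */
static unsigned call_count = 0;
static unsigned fake_glGetError(void)
{
    call_count++;
    return 0x0502; /* pretend GL_INVALID_OPERATION */
}

/* The static flag: FALSE until we've checked once. */
static int already_checked = 0;

/* Call this from your drawing function; it only queries the error
 * state the first time through, so you don't flood the log (or stall
 * the pipeline) every single frame. */
static void check_gl_error_once(const char *where)
{
    if (already_checked)
        return;
    already_checked = 1;
    unsigned err = fake_glGetError();
    if (err != 0)
        printf("GL error 0x%04x at %s\n", err, where);
}
```

Then just move the check_gl_error_once() call between your GL calls until the reported spot narrows down.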

Were you sure to call glDisable(GL_TEXTURE_3D) when you're finished with 3D textures, and glDisable(GL_TEXTURE_2D) when you're finished with 2D textures? (That is: do you ever have more than one enabled at once?) You should only have one texture target enabled on a given texture unit at a time.
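This is the pattern I mean, assuming two texture units and GL 1.3-style glActiveTexture (on older setups it'd be glActiveTextureARB; base_tex and volume_tex are hypothetical texture ids):

```c
/* Unit 0 gets the 2D texture, unit 1 gets the 3D texture. Exactly one
 * target enabled per unit, and each one disabled again after drawing. */
glActiveTexture(GL_TEXTURE0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, base_tex);

glActiveTexture(GL_TEXTURE1);
glEnable(GL_TEXTURE_3D);
glBindTexture(GL_TEXTURE_3D, volume_tex);

/* ... draw ... */

glActiveTexture(GL_TEXTURE1);
glDisable(GL_TEXTURE_3D);
glActiveTexture(GL_TEXTURE0);
glDisable(GL_TEXTURE_2D);
```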

OSC: Thanks for the info on OpenGL Profiler, I didn't know that. (Of course, almost all of my debugging is done through print statements coupled with stack backtraces... )