Will having an unused alpha channel in your textures hurt rendering performance? I know it wastes memory.

Cheers,
Robin.

j

04-18-2002, 03:16 PM

It won't affect performance, and on many current graphics cards it doesn't cost extra memory either. For example, many (all?) nVidia cards store 24-bit RGB textures as 32-bit internally.

In fact, uploading 32-bit textures can be faster than uploading 24-bit textures, because the driver doesn't have to pad the 24-bit values out to 32 bits.
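To illustrate the kind of per-pixel padding work being described (this sketch is not from the thread, just a plain-C illustration of what a driver might have to do when the hardware only stores 32-bit texels but the app hands it tightly packed 24-bit data):

```c
#include <stddef.h>
#include <stdint.h>

/* Expand tightly packed 24-bit RGB pixels to 32-bit RGBA,
 * filling the unused alpha channel with 0xFF. If you upload
 * GL_RGBA data in the first place, this pass is unnecessary. */
void pad_rgb_to_rgba(const uint8_t *rgb, uint8_t *rgba, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        rgba[4 * i + 0] = rgb[3 * i + 0]; /* R */
        rgba[4 * i + 1] = rgb[3 * i + 1]; /* G */
        rgba[4 * i + 2] = rgb[3 * i + 2]; /* B */
        rgba[4 * i + 3] = 0xFF;           /* opaque filler alpha */
    }
}
```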

j

dorbie

04-18-2002, 05:41 PM

Hmm... you may want to test this theory.

There are also more complex issues, like hosing your quality depending on the internal texture format that gets chosen, which is often forced by the user.

Robin Forster

04-18-2002, 06:32 PM

Thanks for the help.

One thing I did notice was that choosing GL_UNSIGNED_BYTE rather than GL_UNSIGNED_INT_8_8_8_8 speeded things up for me. This was strange to me, but I guess it has to do with the acceleration hardware... perhaps the driver was a bit dumb.
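One plausible explanation (an assumption, not something confirmed in the thread): GL_UNSIGNED_BYTE describes the pixel as a sequence of bytes in R, G, B, A order, while GL_UNSIGNED_INT_8_8_8_8 describes it as one packed 32-bit integer with R in the most significant byte. On a little-endian x86 those two layouts put the bytes in memory in opposite orders, so the driver may fall back to a software swizzle for one of them. A minimal sketch of the packed form:

```c
#include <stdint.h>

/* Pack four 8-bit components into a single 32-bit value the way
 * GL_UNSIGNED_INT_8_8_8_8 defines it: R in the most significant
 * byte. On a little-endian machine this integer is stored in
 * memory as A, B, G, R -- the reverse of GL_UNSIGNED_BYTE order. */
uint32_t pack_8888(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return ((uint32_t)r << 24) | ((uint32_t)g << 16)
         | ((uint32_t)b << 8)  | (uint32_t)a;
}
```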

zeckensack

04-18-2002, 10:19 PM

Originally posted by Robin Forster:
One thing I did notice was that choosing GL_UNSIGNED_BYTE rather than GL_UNSIGNED_INT_8_8_8_8 speeded things up for me.