I was following NeHe's tutorial on loading a bitmap. He mentioned that was the largest size his loading code would handle, but that you can load larger images, and I'm wondering: how?

From what I can tell, that code loads just one bitmap. How could I change the function(s) so they load any number of bitmaps I specify and bind each one to whatever texture I want?

[This message has been edited by guzba (edited 03-18-2003).]

kehziah

03-18-2003, 10:18 PM

The same way you load any bitmap. Your loading function should be able to handle any bitmap size (if you implement your own bitmap loader, beware: each pixel row in a BMP file is padded to a DWORD, i.e. 4-byte, boundary).

When loading an image to make it a texture, you must be careful that its dimensions are powers of 2 (e.g. 128x512 or 512x256). If not, glTexImage2D will fail.

Regarding limitations as to the size of the texture, it is hardware dependent. You can reasonably assume that any card can handle up to 1024x1024 (with one exception: 3DFX cards, whose limit is 256x256). You can query the limit with:

int maxSize;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);

acerb

03-19-2003, 12:33 AM

Hi, thanks for the tip, but it always returns 0 for me (I'm at work, so it's only an onboard Intel 845G chipset).

You can query the limit with:
int maxSize;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);

Any idea or comment? Thanks

acerb

M/\dm/\n

03-19-2003, 01:12 AM

DON'T USE AUX! It's ancient, and be cautious with NeHe's tutorials, as some of them are a little outdated.
If you want a nice, aux-independent loader, look at www.gametutorials.com. I have no problems loading standard BMPs in the style shown there, although I experienced some problems with BMPs saved by native XP programs. I guess there is a problem with the BITMAPINFOHEADER; I had no time to check this out, but I think the problem is quite simple.

kehziah

03-19-2003, 02:33 AM

Did you create a valid OpenGL rendering context before calling glGetIntegerv?