I'm writing a tile-based game with DirectDraw 7 and the Win32 API.
...and suddenly it popped into my head: "How much video memory does the video card have, and how much is free?"
Why do I bother at all... I mean, a tile game doesn't necessarily waste that much space, does it?
At a resolution of 1600 x 1200 with 32-bit colour, one buffer uses about 7.3 MB. A front buffer, a back buffer, and a buffer holding the different tiles... it should be hard to find a graphics card that couldn't run this silly game...
But I have 128MB on my overpriced video card (I got it cheap, though), and if I want to create buffers in video memory that take advantage of the free memory, how do I measure total and free video memory?
In my game I could take a 150x150 map with 32x32-pixel tiles at 32 bits per pixel and store the whole thing pre-rendered in video memory... that would use almost 90 MB.
I could also see other games taking advantage of free video memory as well.
So how do I do it?
There is a function in DirectX for this, IDirectDraw7::GetAvailableVidMem. But it looks like it can return video memory plus system memory reserved for the video card, so it wasn't entirely straightforward.
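For what it's worth, GetAvailableVidMem takes a DDSCAPS2 structure, and the caps bits control which kind of memory gets counted. A sketch of how I understand it (assuming an already-initialized IDirectDraw7 pointer; Windows/ddraw.h only, so untested here):

```cpp
#include <ddraw.h>

// Query local (on-card) video memory only. DDSCAPS_LOCALVIDMEM is
// meant to exclude the non-local (AGP/system) memory the driver may
// otherwise fold into the total.
bool QueryLocalVidMem(IDirectDraw7* pDD, DWORD* pdwTotal, DWORD* pdwFree)
{
    DDSCAPS2 caps;
    ZeroMemory(&caps, sizeof(caps));
    caps.dwCaps = DDSCAPS_VIDEOMEMORY | DDSCAPS_LOCALVIDMEM;

    // Both out-parameters are byte counts.
    return SUCCEEDED(pDD->GetAvailableVidMem(&caps, pdwTotal, pdwFree));
}
```

Note that "free" is just a snapshot after whatever the primary surface and back buffers have already taken, so it's a hint for sizing caches, not a guarantee.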
So how do the pros do it? Creating huge silly surfaces and seeing which ones return "out of memory" and which don't? Can't say that sounds like a smart thing to do... How could I create a game that takes advantage of graphics cards with 256MB or 512MB of video memory?
Another question on the side here: if I create a ton of surfaces in video memory to the brink of no free memory, and someone then uses alt+tab or whatever, how can I be sure that things stay "correct" when my program receives focus again?
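On the alt-tab question: surfaces in video memory can be "lost" when the display mode switches. The usual pattern, as I understand it, is to check and restore before blitting; a sketch (ReloadTiles is a hypothetical helper standing in for whatever re-blits your tile art from disk or system memory):

```cpp
#include <ddraw.h>

void ReloadTiles(IDirectDrawSurface7* pSurf);  // hypothetical reload helper

// Call on WM_ACTIVATE or when a blit fails. Restore() re-allocates
// the surface memory but NOT its contents, so the pixels must be
// re-uploaded afterwards.
void RestoreIfLost(IDirectDrawSurface7* pSurf)
{
    if (pSurf->IsLost() == DDERR_SURFACELOST) {
        if (SUCCEEDED(pSurf->Restore())) {
            ReloadTiles(pSurf);
        }
    }
}
```

DirectX 7 also has IDirectDraw7::RestoreAllSurfaces(), which restores every surface created from that DirectDraw object in one call; the contents still have to be refilled by you.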
Bring out your dead!
Bring out your dead! [clang]
Bring out your dead! [clang]
Bring out your dead! [clang]
Bring out your dead! [clang]
Bring out your dead!
-= Monty Python: Holy Grail =-
[edited by - DarkSlayer on December 6, 2002 2:13:29 AM]