Icefall uses literally dozens of textures… (and it will probably be ‘hundreds’ by the time the game is completed).

Textures are used for everything: the mouse cursor, menus, fonts, buttons, the game world itself, spell/action icons, monsters, equipment… everything.

Although my own main PC has 512MB of video (texture) RAM, not everyone's does! And following on from my minimum requirements post, I don't really want someone's video card memory size to be a limiting factor if everything else meets the requirements, so Icefall needs a way to manage things when there are more textures than there is video memory.

I don’t want each bit of code that uses textures to worry about whether they’re loaded or not, so the logical choice is to encapsulate all of the texture handling into one place: I call the class that handles this the TContentManager (“content” because it also handles sounds, fonts, music, etc.).

When the game code asks the TContentManager for a texture, the first thing it does is check whether that texture is already available in video memory; if so, it just hands out a reference. Easy! If not, it retrieves the filename for that texture from Icefall's ResourceDatabase and attempts to load the texture from disk. If that fails, it looks at why: if the error is D3DERR_OUTOFVIDEOMEMORY, the next step is to unload some other texture and try again, repeating until the load succeeds or we have no loaded textures left. (If it still doesn't load, or if the texture load fails for some other reason, it's goodbye Icefall.)
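The load-or-evict loop above can be sketched as follows. This is a toy Python model, not Icefall's actual Delphi code: the class and method names mirror the post's description, but the byte-counting "video memory", the OutOfVideoMemory exception (standing in for D3DERR_OUTOFVIDEOMEMORY), and the file table are all illustrative assumptions.

```python
from collections import OrderedDict

class OutOfVideoMemory(Exception):
    """Stands in for D3DERR_OUTOFVIDEOMEMORY."""

class ContentManager:
    """Toy model of the request/load/evict loop described in the post."""
    def __init__(self, capacity_bytes, files):
        self.capacity = capacity_bytes
        self.files = files            # texture id -> size on "disk"
        self.loaded = OrderedDict()   # texture id -> size, kept in LRU order

    def used(self):
        return sum(self.loaded.values())

    def _try_load(self, tex_id):
        size = self.files[tex_id]
        if self.used() + size > self.capacity:
            raise OutOfVideoMemory
        self.loaded[tex_id] = size

    def get_texture(self, tex_id):
        if tex_id in self.loaded:            # already resident: hand out a reference
            self.loaded.move_to_end(tex_id)  # mark as most recently used
            return tex_id
        while True:
            try:
                self._try_load(tex_id)       # load from "disk"
                return tex_id
            except OutOfVideoMemory:
                if not self.loaded:          # nothing left to evict: fatal
                    raise
                self.loaded.popitem(last=False)  # unload least-recently used
```

The OrderedDict plays the role of the linked list mentioned later in the post: moving an entry to the end on each access keeps the least-recently-used texture at the front, ready to evict in constant time.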

UnloadTexture is a method that unloads a texture. It takes a TTextureID as a parameter, but passing 0 tells the method to choose for itself which texture to unload. This is where it gets interesting.

Before I got to this approach, I had originally set up a few states to explicitly call TContentManager.UnloadTexture and release textures when the game left that state (e.g. the game would unload the 'Option-selection' texture when the player closed the Options dialog). However, this turned out to be sub-optimal for several reasons:

Players might well leave and re-enter states (like Options) several times to accomplish whatever it is they’re trying to do.

It doesn’t make use of extra video memory: it keeps loading from disk (or the disk-cache anyway) while the extra video memory stays idle.

I had to explicitly declare which textures I was done with. Not a problem really, but 'just another thing' I had to do when changing states.

So now I no longer explicitly release anything. UnloadTexture always chooses to unload the texture that was least recently used (a linked list makes this a very fast and efficient check – no timers or array scanning involved). In practice, "least recently used" turns out to be about 97% optimal (when simulated against a 100% optimal algorithm which is permitted to see the future when deciding what to unload) so it's virtually perfect – substantially better than my explicit declarations were.
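The "100% optimal algorithm which is permitted to see the future" is the classic clairvoyant policy (Bélády's algorithm): always evict the entry whose next use lies furthest in the future. The ~97% figure comes from the author's own simulations against Icefall's traces; here is a hedged sketch of how such a comparison could be run, counting misses for both policies over an access trace (unit-sized textures for simplicity):

```python
def lru_misses(trace, capacity):
    """Count cache misses for a least-recently-used eviction policy."""
    cache, misses = [], 0
    for t in trace:
        if t in cache:
            cache.remove(t)          # will re-append as most recent
        else:
            misses += 1
            if len(cache) == capacity:
                cache.pop(0)         # evict least-recently used (front)
        cache.append(t)              # most recently used lives at the end
    return misses

def belady_misses(trace, capacity):
    """Count misses for the clairvoyant (future-seeing) optimal policy."""
    cache, misses = set(), 0
    for i, t in enumerate(trace):
        if t in cache:
            continue
        misses += 1
        if len(cache) == capacity:
            def next_use(x):
                try:
                    return trace.index(x, i + 1)
                except ValueError:
                    return float("inf")   # never used again: ideal victim
            # evict the entry whose next use is furthest in the future
            cache.remove(max(cache, key=next_use))
        cache.add(t)
    return misses
```

On a real trace you would compare hit rates, e.g. `1 - lru_misses(trace, n) / len(trace)` against the Bélády figure; LRU can never beat the clairvoyant policy, but on access patterns with strong temporal locality (like UI and game-state textures) it gets very close.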

If for some crazy reason the user doesn’t want Icefall consuming all of their video RAM*, I can chuck in a call to IDirect3DDevice9.GetAvailableTextureMem and trigger UnloadTexture if it’s below some specific amount. Alternatively, I could keep a sum of the amount of texture memory I’m using, and trigger UnloadTexture if that amount threatens to exceed 32MB or whatever (I haven’t decided yet).
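The second option (keeping a running byte total and evicting below a threshold) is simple to express. A minimal sketch, reusing the LRU-ordered mapping idea from earlier – the function name, the OrderedDict representation, and the 32MB-style budget are all illustrative, not Icefall's actual API:

```python
from collections import OrderedDict

def enforce_budget(loaded, used_bytes, budget_bytes):
    """Proactively unload least-recently-used textures until under budget.

    `loaded` is an OrderedDict of texture id -> size in bytes, kept in
    LRU order (least recent first). Returns the new running total.
    """
    while used_bytes > budget_bytes and loaded:
        _, size = loaded.popitem(last=False)  # least-recently used first
        used_bytes -= size
    return used_bytes
```

This would be called just before each load (or after a `GetAvailableTextureMem`-style check reports memory getting tight), rather than waiting for an out-of-memory error.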

The point is, if you find yourself manually balancing resources, it’s probably an excellent idea to profile your resource usage and see if you can find a pattern that will let you just automate the whole thing**. You’ll save yourself much time and your code will be cleaner and more flexible.

*Note: this only actually matters on Vista or Windows 7. Under the Windows XP model, multiple applications can't share texture memory anyway. As soon as an application takes focus, DirectX invalidates all texture resources belonging to every other application (forcing them to reload from system memory or disk when they get restored). The WDDM (Windows Display Driver Model) in Vista "understands" video memory, and can share it amongst applications. This is one of the reasons DirectX 10+ can't be backported to XP: the XP driver model has no concept of managed video card resources.

**If you're creating a game that is frame-rate locked or just frame-rate dependent, you might still need to explicitly acquire and release resources (e.g. between levels), because a 10ms pause at an unexpected point in your game could be noticeable or detrimental. Either that, or move the resource acquisition to another thread and have low-quality 'emergency' textures to display while the real ones load… this is a very common technique for FPS games.