The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

If you've been following Asylum's development, you already know that the game's graphics are based on pre-rendered textures projected onto the faces of a cube with inverted normals. Since Gouraud shading is avoided on these faces, you get the illusion of a panoramic view. Using this technique instead of full 3D rendering lets the game have great-looking graphics without the need for real-time rendering techniques that are costly in both development and computational time. A clever trade-off, if you ask me. The downside is that you lose all the benefits of dynamic lighting and the depth provided by the third dimension, and with them, a lot of realism. I had been thinking that with some shader-level magic we could re-create what we needed for a few visual effects we wanted to implement. This post is about one of those effects...

A horror game without fog? No way!

This was basically the feeling of the whole team. We needed animated fog, and it had to be realistic: it couldn't be just an overlay on top of each cube face, because that wouldn't look good enough. Pablo and I started discussing the idea of using a Z-depth mask exported from 3D Studio Max to simulate depth on each face. To illustrate, Pablo exported an image where white is defined as far, black as near, and all the distances in between as shades of grey. You can see an example here:

The idea is that I could use the depth information inside a shader to control the fog's opacity. To give you an idea, if the fog is represented by a plain white square, the result would look something like this:

This was the first test we did, and it already looked promising! The second step was to use a cloud texture instead of a white square, which ended up looking fine. But if we were going to use cloudy fog, it couldn't be static; it needed to be animated.
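The per-pixel idea behind these tests can be sketched in a few lines of Python (the function and parameter names are mine; in the game this would happen in a fragment shader, per texel):

```python
def apply_fog(scene_px, fog_px, depth, strength=1.0):
    """Blend a fog sample over a scene pixel using the Z-depth mask.

    depth comes from the exported mask, mapped to [0, 1]:
    0 = black = near (no fog), 1 = white = far (full fog).
    Pixels are (r, g, b) tuples with channels in [0, 1].
    """
    a = depth * strength  # fog opacity grows with distance
    return tuple(s * (1.0 - a) + f * a for s, f in zip(scene_px, fog_px))

# Near pixels keep the scene color; far pixels approach the fog color.
near = apply_fog((0.2, 0.4, 0.6), (1.0, 1.0, 1.0), depth=0.0)  # -> (0.2, 0.4, 0.6)
far  = apply_fog((0.2, 0.4, 0.6), (1.0, 1.0, 1.0), depth=1.0)  # -> (1.0, 1.0, 1.0)
```

With a plain white square as `fog_px` this reproduces the first test; sampling a cloud texture instead gives the second.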

Let's move!

I've done some post-processing effects in the past, so I was already familiar with using framebuffers as textures. This approach worked like a charm. In Unity we solved it using Render Textures in a few steps (sorry, but this is a Pro-only feature):

Create a fog particle system

The particles move really slowly, and their alpha changes gradually over their lifespan; they also rotate a little over time. We added a layer called "Fog" and assigned it to the particle system.
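As a rough illustration of the alpha animation (the actual values were tuned by hand in Unity's particle system, so this exact curve is only an assumption):

```python
import math

def alpha_over_lifetime(t):
    """Particle alpha as a function of normalized age t in [0, 1]:
    fade in, peak at mid-life, fade out -- one smooth bell shape."""
    t = min(max(t, 0.0), 1.0)  # clamp age to the particle's lifespan
    return math.sin(math.pi * t)
```

In Unity terms, this is the kind of curve you would draw in the alpha channel of the particle system's "Color over Lifetime" module.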

Create a camera

We created another camera (without the MainCamera tag), positioned it to look at the particle system, and limited its Culling Mask to the 'Fog' layer only. We also set the camera's background to black, so that we could add the fog texture on top of the original one.

Create and assign a render texture

We created a Render Texture (in the project browser: Create -> Render Texture) and assigned it as the camera's target texture. Then we assigned the same texture as the fog texture passed to our custom shader, and...
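Putting the pieces together, the final combine can be sketched like this (again in Python for clarity; the names are mine, and the exact blend in the real shader may differ):

```python
def composite(scene_px, fog_rt_px, depth):
    """Combine a pre-rendered scene pixel with a sample from the
    fog camera's Render Texture.

    The fog camera renders on a black background, so fog_rt_px is
    (0, 0, 0) wherever no particle was drawn; we scale the sample by
    the Z-depth mask (0 = near, 1 = far) and add it on top, clamping
    each channel to 1.
    """
    return tuple(min(1.0, s + f * depth) for s, f in zip(scene_px, fog_rt_px))

# No particle at this texel -> the scene pixel passes through untouched.
empty = composite((0.3, 0.3, 0.3), (0.0, 0.0, 0.0), depth=0.8)  # -> (0.3, 0.3, 0.3)
# A bright fog sample at full depth pushes the result toward white.
dense = composite((0.5, 0.5, 0.5), (0.8, 0.8, 0.8), depth=1.0)  # -> (1.0, 1.0, 1.0)
```

Because the fog texture is only added where the depth mask allows it, the particles appear to drift behind near geometry, which is what sells the depth illusion.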

Conclusion

This was the first experiment in a set of ideas we have for re-creating 3D visual effects on our 2D projected space in Asylum. Using a Z-depth texture as an opacity mask for an animated fog texture worked really well, and the results looked good enough. In the future we're planning to use similar techniques to simulate dust particles and dynamic lights, and we've started evaluating the use of normals to re-create specular highlights and other interesting effects. We'll see how far we can take this! Has this post brought any interesting ideas to mind? What do you think of the technique? Can you think of any optimizations? It would be great to read some comments!