Full Papers 14: GPUs & Hardware

GPU Simulation and Rendering of Volumetric Effects for Computer Games and Virtual Environments

Jens Krueger,
Ruediger Westermann,
TU Muenchen

As simulation and rendering capabilities continue to increase, volumetric effects like smoke, fire, or explosions will be frequently encountered in computer games and virtual environments. In this paper, we present techniques for the visual simulation and rendering of such effects that keep up with the frame rates demanded by such environments. This is achieved by leveraging functionality of recent graphics processing units (GPUs) in combination with a novel approach to modeling non-physics-based, yet realistic, variations in flow fields. We show how to use this mechanism to simulate effects such as those demonstrated in the accompanying figure. Physics-based simulation is performed on 2D proxy geometries, and simulation results are extruded to 3D using particle- or texture-based approaches. Our method allows the animator to model and flexibly control the dynamic behavior of volumetric effects, and it can be used to create plausible animations of a variety of natural phenomena.
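
A minimal sketch of the extrusion step, as an illustration rather than the authors' implementation: it assumes the 2D proxy simulation yields a density slice over (radius, height) coordinates and that the 3D volume is obtained by revolving this slice around the vertical axis, i.e. a texture-based variant of the extrusion; the Slice2D type and resolutions are hypothetical.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Slice2D {                        // hypothetical 2D proxy simulation result
        int nr, ny;                         // resolution in radius and height
        std::vector<float> density;         // nr * ny density samples
        float at(int r, int y) const { return density[y * nr + r]; }
    };

    // Fill an n^3 voxel grid by revolving the 2D slice around the vertical axis.
    std::vector<float> extrudeToVolume(const Slice2D& slice, int n) {
        std::vector<float> volume(static_cast<size_t>(n) * n * n, 0.0f);
        for (int z = 0; z < n; ++z)
            for (int y = 0; y < n; ++y)
                for (int x = 0; x < n; ++x) {
                    // distance of the voxel centre from the vertical axis, mapped to [0,1]
                    float fx = (x + 0.5f) / n - 0.5f;
                    float fz = (z + 0.5f) / n - 0.5f;
                    float radius = 2.0f * std::sqrt(fx * fx + fz * fz);
                    if (radius > 1.0f) continue;        // voxel lies outside the revolved slice
                    int r  = std::min(int(radius * slice.nr), slice.nr - 1);
                    int sy = std::min(int(float(y) / n * slice.ny), slice.ny - 1);
                    volume[(static_cast<size_t>(z) * n + y) * n + x] = slice.at(r, sy);
                }
        return volume;
    }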

Approximate Ray-Tracing on the GPU With Distance Impostors

This paper presents a fast approximation method for obtaining the point hit by a reflection or refraction ray. The calculation is based on the distance values stored in environment-map texels. This approximation is used to localize environment-mapped reflections and refractions, that is, to make them depend on where they occur. Furthermore, by placing the eye at the light source, the method can be used to generate real-time caustics. By computing a map for each refracting surface, we can even evaluate multiple refractions without tracing rays. The method is fast, and it is accurate if the scene consists of larger planar faces, in which case the results are similar to those of ray tracing. The method also suits the GPU architecture very well and can render ray-tracing and global-illumination effects at a few hundred frames per second. The primary application area of the proposed method is the introduction of these effects into games.
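
A minimal sketch of the distance-impostor lookup, under assumptions: the environment map is centred at the origin, envDistance(dir) returns the distance stored for a unit direction (here a trivial constant stand-in), and the loop uses a simple fixed-point refinement that projects the looked-up environment point back onto the ray, which is not necessarily the authors' exact iteration.

    #include <cmath>

    struct Vec3 { float x, y, z; };
    static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3 normalize(Vec3 v) { return (1.0f / std::sqrt(dot(v, v))) * v; }

    // Stand-in for the distance map: here a constant, i.e. a spherical environment.
    static float envDistance(Vec3 /*dir*/) { return 10.0f; }

    // x: reflection point relative to the map centre, R: unit reflection direction.
    Vec3 approximateHit(Vec3 x, Vec3 R, int iterations = 3) {
        float t = envDistance(R);            // first guess: pretend the ray starts at the centre
        for (int i = 0; i < iterations; ++i) {
            Vec3 dir = normalize(x + t * R); // direction from the centre to the current guess
            Vec3 s = envDistance(dir) * dir; // environment point stored for that direction
            t = dot(s - x, R);               // project that point back onto the ray
        }
        return x + t * R;                    // approximate point hit by the reflection ray
    }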

We present a framework for achieving user-defined on-demand displays in setups containing bricks of movable cameras and DLP projectors. A dynamic calibration procedure is introduced which handles cameras and projectors in a unified way and allows continuous, flexible setup changes while seamless projection alignment and blending are performed simultaneously. For interaction, an intuitive laser-pointer-based technique is developed, which can be combined with real-time 3D information acquired from the scene. All these tasks can be performed concurrently with the display of a user-chosen application in a non-disturbing way. This is achieved by using an imperceptible structured-light approach that enables pixel-based surface light control suited to a wide range of computer graphics and vision algorithms. To ensure scalability of light control in the same working space, multiple projectors are multiplexed.
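
A minimal sketch of imperceptible structured-light embedding, under common assumptions rather than the paper's exact scheme: a binary code is added to one DLP subframe and subtracted from the complementary subframe, so the temporal average stays close to the displayed image (up to clamping at the intensity limits) while a camera synchronised to one subframe can recover the code by differencing; embedPattern and delta are illustrative names.

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct SubframePair {
        std::vector<uint8_t> plus;   // image + delta where the pattern is on
        std::vector<uint8_t> minus;  // image - delta where the pattern is on
    };

    // image and pattern are per-pixel arrays of equal size; pattern holds 0 or 1.
    SubframePair embedPattern(const std::vector<uint8_t>& image,
                              const std::vector<uint8_t>& pattern,
                              int delta = 4)              // embedding amplitude
    {
        SubframePair out{image, image};
        for (size_t i = 0; i < image.size(); ++i) {
            if (!pattern[i]) continue;
            int p = image[i];
            // Near 0 or 255 the clamping makes the average only approximately
            // equal to the original pixel value.
            out.plus[i]  = static_cast<uint8_t>(std::min(p + delta, 255));
            out.minus[i] = static_cast<uint8_t>(std::max(p - delta, 0));
        }
        return out;
    }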