Modeling and simulating the wind within a real-time interactive virtual environment has potential for increasing immersion on several levels. From a visual perspective, accessing simulated wind field data can assist in creating a richer dynamic visual experience, with leaves blowing through the environment, trees reacting to the wind, or autonomous pedestrians leaning into strong gusts. Such visual elements create a more dynamic, ambient environment consistent with real environments. With regard to physical interaction, users can feel the wind on their bodies and relate the visual aspects of the wind's effect (leaves, flags, etc.) to their own actions within the virtual environment. Perhaps the best illustration of this combined visual and physical interaction comes from an urban environment simulation in which users standing on a virtual curb feel the wind produced in the wake of a bus driving past them.

To build this system, we are applying graphics hardware (GPUs) to the simulation of wind and particle dispersion in urban areas. We have implemented a Lagrangian particle dispersion model on the GPU, based on the Quick Urban and Industrial Complex (QUIC) Dispersion Modeling System. With our system, the GPU simulations outperformed the CPU simulations by over an order of magnitude in an urbanized domain with buildings. Our primary challenge has been exploring visualization and interaction methods that assist the computational fluid dynamics engineers who work on the project. In particular, these engineers have found that real-time visualization and interaction with their flow models (something they had not been able to do until now) is extremely beneficial for debugging and understanding turbulence models. Our efforts have focused on finding better means to visualize and interact with the components of the flow model.
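To give a sense of what a Lagrangian dispersion step involves, the following is a minimal sketch, not the QUIC model itself: each particle is advected by a sampled mean wind plus a random turbulent displacement (a simple random-walk closure). The function name, the constant wind sample, and the parameters `sigma` and `dt` are all illustrative assumptions; in the real system the wind would come from the simulated urban wind field, and the update would run per-particle on the GPU.

```python
import numpy as np

def advect_particles(pos, mean_wind, sigma, dt, rng):
    """One Lagrangian dispersion step: move each particle by the mean wind
    plus a Gaussian turbulent velocity fluctuation (random-walk closure).
    pos: (N, 3) particle positions; mean_wind: (3,) wind sample in m/s."""
    turb = rng.normal(0.0, sigma, size=pos.shape)  # turbulent fluctuation
    return pos + (mean_wind + turb) * dt

# Release 10,000 particles at a point source and step them forward.
rng = np.random.default_rng(0)
positions = np.zeros((10_000, 3))
wind = np.array([2.0, 0.5, 0.0])  # hypothetical mean wind sample (m/s)
for _ in range(100):
    positions = advect_particles(positions, wind, sigma=0.3, dt=0.1, rng=rng)
```

Because each particle's update is independent, this loop maps naturally onto the GPU, which is what makes the order-of-magnitude speedup over the CPU attainable.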

General Research Summary

My research efforts are highly interdisciplinary and focus on the problems associated with making human interaction with virtual environments more natural and realistic. This work is greatly enhanced by collaboration with colleagues in psychology, computer science, and engineering. I take two approaches in my research. The first is to examine human perception in virtual environments by measuring responses to visual, haptic, or motor tasks. The second investigates the software and algorithms necessary for creating virtual environments in which many dynamic, virtual entities engage and interact with users.

This work is motivated by the recognition that virtual environments can serve as laboratories for the exploration and study of human behavior. Virtual environments afford significant control over experimental variables and provide ideal conditions for studies that cannot be carried out in the physical world, such as understanding the effects of distractions while driving, child bicycle safety, or flight training. Virtual environments also contribute to the study of basic human perception by allowing the manipulation of environmental cues that would be difficult to control in the real world. Thoroughly understanding the various components of virtual environment hardware and software, including their limitations, will increase their utility and applicability for research, training, and education.