The investigation of neuronal accounts of cognition relies on close collaboration between behavioral experimentation, theory, and application, and it supports the transition from purely behaviorist correlation analyses to a genuine understanding of the underlying mechanisms. Cognition builds upon an individual's behavioral history, and the understanding of cognition rests on neuronal principles.
The study of human behavior increasingly involves interactive, dynamically changing scenarios with multiple human participants. The acquisition of behavioral data from human subjects, the modeling of behavior, and the evaluation in interactive scenarios all make it necessary to generate simulated images of reality. Simulations allow the investigator to precisely control the structure of the environment with which the subject interacts. Furthermore, situations that would be too dangerous in the real world (e.g., near-crash driving situations) can be investigated in virtual reality.
By nature, simulated reality frameworks are designed to simulate naturalistic environments. Within these environments, ecologically relevant stimuli can be presented embedded in a meaningful and controlled context. The quality of experimental data acquired within the simulated environment depends not least on the degree of immersion of the human subject.
Driving experiments usually attempt to relate observable driver behavior to cognitive inputs. The precise visual (retinal) input of a driver in a driving simulator also depends on the exact position of the driver's head with respect to the screen (Noth et al., 2010). The principal role of ego-motion feedback can therefore be regarded as a continuous calibration of this relationship.
In a virtual cooperation scenario, consistency matters: if an operator perceives an object at a distance of 1 m, moving 20 cm towards it should decrease the perceived distance to 80 cm, and stepping sideways past an object that occludes another should reveal the latter (Pretto et al., 2009).
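The distance part of this consistency requirement can be expressed as a simple check. The following is a minimal sketch with a hypothetical helper function (the scenario and coordinates are illustrative, not taken from the cited setup):

```python
import math

def perceived_distance(object_pos, observer_pos):
    """Euclidean distance between observer and object
    (hypothetical helper; positions are (x, y) in metres)."""
    return math.dist(object_pos, observer_pos)

# Object 1 m straight ahead of the operator.
obj = (0.0, 1.0)
observer = (0.0, 0.0)

# Moving 20 cm towards the object must reduce the perceived distance to 80 cm.
observer_moved = (0.0, 0.2)

print(perceived_distance(obj, observer))       # 1.0
print(perceived_distance(obj, observer_moved)) # 0.8
```

An immersive simulation has to satisfy such invariants continuously as the tracked head position changes.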
Ego-motion feedback mitigates the cues that remind operators that they are in a virtual rather than the real world. The way the appearance of a virtual object changes in response to a lateral head movement is identical to that of its real counterpart, which means that even spatial relations between real and virtual objects are preserved (Creem-Regehr et al., 2005; Cutting, 1997).
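Under a simple pinhole model, the motion parallax induced by a lateral head movement follows directly from projecting the object through the new head position onto the fixed screen plane. The sketch below (hypothetical function, illustrative geometry) shows that the on-screen displacement of a rendered point depends on its depth, so that points on the screen plane do not shift and distant points follow the head:

```python
def parallax_shift(head_shift, object_depth, screen_depth):
    """On-screen lateral displacement of a point at object_depth when
    the head moves laterally by head_shift, for a screen plane at
    screen_depth (pinhole projection; all distances in metres).

    Geometry: head at (h, 0), object at (0, d), screen at z = s.
    The ray head->object crosses the screen at x = h * (1 - s / d).
    """
    return head_shift * (1.0 - screen_depth / object_depth)

# Head moves 10 cm; screen 0.5 m away; object 2 m away.
print(parallax_shift(0.1, 2.0, 0.5))  # 0.075
# A point lying exactly on the screen plane does not shift.
print(parallax_shift(0.1, 0.5, 0.5))  # 0.0
```

This depth-dependent shift is exactly what head tracking must reproduce for virtual objects to behave like their real counterparts.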
In this contribution we introduce a head tracking system that incorporates human ego motion into simulated environments, improving immersion in the context of a human-robot collaborative task and in an interactive driving simulator.
For both cases, we explain how ego-motion feedback leads to a more precise comprehension of the virtual scene, how immersion influences the feeling of being “really” inside the virtual scene, and how it weakens the awareness of the border between the real and the virtual world.