Downloads of Software Tools

Software Tools of the Department "Human Perception, Cognition and Action"

Face Database
This database contains images of seven views of 200 laser-scanned (Cyberware™) heads without hair. The 200 head models were newly synthesized by morphing real scans, so that no model closely resembles an individual who might not want to appear on your computer screen or in your scientific publications. Currently, five sets of full 3D head models are available. Please understand that we cannot provide more 3D head models, as we are obliged to protect the identities of the scanned individuals.

Face Video Database
This database contains videos of facial action units, recorded from autumn 2003 onwards at the MPI for Biological Cybernetics by the Face and Object Recognition Group (department of Prof. Bülthoff), using the Videolab facilities created by Mario Kleiner and Christian Wallraven.

Psychtoolbox-3
Psychophysics Toolbox Version 3 (PTB-3) is a free set of Matlab and GNU/Octave functions for vision research. It makes it easy to synthesize and show accurately controlled visual and auditory stimuli and to interact with the observer. It has at least fifteen thousand active users (see Overview), an active forum, and is highly cited. PTB-3 is based on the Psychophysics Toolbox Version 2 (PTB-2), but its Matlab extensions (written in C) were rewritten to be more modular and to use OpenGL. This Wiki describes the current version (PTB-3), which runs with Matlab 7.x and Octave 3.2.x on Mac OS X, Linux, and Windows (see System Requirements). The old PTB-2, for Mac OS 9 and Windows, is still used in many labs and remains available, but it is no longer developed or supported.
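As a taste of the style of stimulus presentation PTB-3 supports, here is a minimal sketch that opens a window, fills it with mid-grey, flips it to the display synchronized with the vertical retrace, and waits for a keypress. It uses standard Psychtoolbox calls (`Screen`, `KbWait`); treat it as an illustration rather than a tutorial:

```matlab
% Minimal PTB-3 sketch: open a window, show a grey screen, wait for a key.
AssertOpenGL;                             % require the OpenGL-based PTB-3
screenid = max(Screen('Screens'));        % use the last attached display
win = Screen('OpenWindow', screenid, 0);  % full-screen window, black background
Screen('FillRect', win, [128 128 128]);   % draw mid-grey into the back buffer
Screen('Flip', win);                      % show it at the next vertical retrace
KbWait;                                   % wait for any keypress
Screen('CloseAll');                       % close the window and clean up
```

The `Screen('Flip', ...)` call is the heart of PTB-3's timing model: drawing happens into an off-screen back buffer, and the flip swaps it onto the display at a known, retrace-locked moment.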

LibGaze
We present a mobile system for tracking the gaze of an observer in real time as they move around freely and interact with a wall-sized display. The system combines a head-mounted eye tracker with a motion capture system for tracking markers attached to the eye tracker. Our open-source software library libGaze provides routines for calibrating the system and computing the viewer's position and gaze direction in real time. The modular architecture of our system supports simple replacement of each of the main components with alternative technology. We use the system to perform a psychophysical user study designed to measure how users visually explore large displays. We find that observers use head movements during gaze shifts, even when these are well within the range that can be comfortably reached by eye movements alone. This suggests that free head movement is important in normal gaze behaviour, motivating further applications in which the tracked user is free to move.
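The core geometric step in such a system is combining the head pose delivered by the motion capture system with the eye-in-head gaze direction delivered by the eye tracker to obtain a gaze ray in world coordinates. A minimal sketch of that idea (the function and variable names are illustrative, not libGaze's actual API):

```python
import numpy as np

def gaze_in_world(head_rotation, head_position, gaze_in_head):
    """Combine a mocap head pose with an eye-in-head gaze vector.

    head_rotation : 3x3 rotation matrix of the head in world coordinates
    head_position : 3-vector, head (eye) position in world coordinates
    gaze_in_head  : 3-vector, gaze direction in head coordinates

    Returns the origin and unit direction of the gaze ray in the world.
    """
    direction = head_rotation @ gaze_in_head
    return head_position, direction / np.linalg.norm(direction)

# Example: head turned 90 degrees about the vertical (z) axis,
# eye looking straight ahead in head coordinates (+x).
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
origin, direction = gaze_in_world(Rz,
                                  np.array([0.0, 0.0, 1.7]),
                                  np.array([1.0, 0.0, 0.0]))
# The world gaze direction is now +y, rotated with the head.
```

Intersecting the resulting ray with the plane of the wall-sized display then yields the point of regard on the screen.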

Software Tools of the Research Group Computational Vision and Neuroscience