Navigation in large virtual reality applications is often performed with unnatural input devices such as keyboards, mice, and gamepads. A more natural approach is to let the user walk through the virtual world as if it were a physical place. This requires tracking the position and orientation of the participant over a large area. We propose a purely optical tracking system that uses only off-the-shelf components such as cameras and LED ropes. The construction of the scene requires no off-line calibration or precise positioning, which makes the system easy to build and scalable in both tracking area and number of users.
The proposed algorithms have been implemented and tested in both a simulated environment and a room-sized laboratory set-up. The first results from our tracker are promising and competitive with many (expensive) commercial trackers.