Keywords

Abstract

A desirable mechanism for controlling a computer is a multifaceted, three-dimensional interactive display system that provides varying levels of control over an API via a "visually bounded freespace" containing projected imagery for interaction. This work presents such a system and its proof of concept, which comprises two components: the 3D Projected Imagery Freespace Control Unit (PIFCU), the hardware built to create the 3D imagery and function as the system's control unit, and the Gesture-Controlled 3D Interface Freespace (GCIF), the term used for the "sensorized" volume of space in which the 3D images and gestures reside. The system is ultimately intended to remove the need for keyboards and similar interface hardware. The device's hardware consists of an array of linearly adjoining slices of concave mirrored surfaces with openings at the top and bottom. This combination advances related work in that the imagery is perceived as three-dimensional images in free space and the computer is controlled through hand gestures recognized by sensors. Benefits to other professions include new methods for construction, navigation, and gaming, as well as a solution to certain physical constraints that traditional computing experiences impose on users (e.g., rigid hand angles for typing).