Abstract

Applications for handheld computers have evolved from simple schedulers and note editors to more complex applications that require high-level interaction tasks. Despite this evolution, the input devices for interacting with handhelds remain limited to a few buttons and styluses paired with touch-sensitive screens. In this paper we focus on the visualization of large documents (e.g., maps) that cannot be displayed in their entirety on small screens. We present a new task-adapted and device-adapted interface called TangiMap. TangiMap is a three-degrees-of-freedom camera-based interface in which the user interacts by moving a tangible interface behind the handheld computer. TangiMap benefits from two-handed interaction, providing kinaesthetic feedback and a frame of reference. We conducted an experiment comparing TangiMap with a classical stylus interface on a two-dimensional target-searching task. The results showed that TangiMap was faster and that user preferences were largely in its favor.