Gestures without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes

Jacob O. Wobbrock
The Information School
University of Washington

Andrew D. Wilson
Microsoft Research

Yang Li
Computer Science & Engineering
University of Washington

Although mobile, tablet, large display, and tabletop computers
increasingly present opportunities for using pen, finger, and wand
gestures in user interfaces, implementing gesture recognition has
largely been the privilege of pattern-matching experts, not user
interface prototypers. Although some user interface libraries and
toolkits offer gesture recognizers, such infrastructure is often
unavailable in design-oriented environments like Flash, scripting
environments like JavaScript, or emerging off-desktop prototyping
environments. To enable novice programmers
to incorporate gestures into their UI prototypes, we present a "$1
recognizer" that is easy, cheap, and usable almost anywhere in about
100 lines of code. In a study comparing our $1 recognizer, Dynamic
Time Warping, and the Rubine classifier on user-supplied gestures,
we found that $1 obtains over 97% accuracy with only 1 loaded
template and 99% accuracy with 3+ loaded templates. These results
were nearly identical to DTW and superior to Rubine. In addition, we
found that medium-speed gestures, in which users balanced speed and
accuracy, were recognized better than slow or fast gestures for all
three recognizers. We also discuss the effect that the number of
templates or training examples has on recognition, the score falloff
along recognizers' N-best lists, and results for individual
gestures. We include detailed pseudocode of the $1 recognizer to aid
development, inspection, extension, and testing.
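To give a flavor of the approach, the following is a minimal, hedged sketch of the $1 pipeline in Python: resample each stroke to a fixed number of points, rotate so the "indicative angle" (centroid to first point) is zero, scale to a reference square, translate the centroid to the origin, and score a candidate by average point-to-point path distance against each stored template. This sketch omits the full recognizer's golden-section search over candidate rotations, so it is an illustration of the core idea rather than the paper's exact algorithm; all function names here are our own.

```python
import math

N = 64        # resampled points per stroke; 64 is a typical choice
SIZE = 250.0  # side of the reference scaling square

def path_length(pts):
    return sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))

def resample(pts, n=N):
    """Resample a stroke into n points spaced equidistantly along its path."""
    interval = path_length(pts) / (n - 1)
    pts, out, D = list(pts), [pts[0]], 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if D + d >= interval:
            t = (interval - D) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)   # q becomes the next segment's start point
            D = 0.0
        else:
            D += d
        i += 1
    while len(out) < n:        # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def centroid(pts):
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def rotate_to_zero(pts):
    """Rotate so the angle from the centroid to the first point is zero."""
    cx, cy = centroid(pts)
    theta = math.atan2(pts[0][1] - cy, pts[0][0] - cx)
    c, s = math.cos(-theta), math.sin(-theta)
    return [((x - cx) * c - (y - cy) * s + cx,
             (x - cx) * s + (y - cy) * c + cy) for x, y in pts]

def scale_to_square(pts, size=SIZE):
    """Non-uniformly scale the bounding box to a size x size square."""
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    return [(x * size / w, y * size / h) for x, y in pts]

def translate_to_origin(pts):
    cx, cy = centroid(pts)
    return [(x - cx, y - cy) for x, y in pts]

def normalize(pts):
    return translate_to_origin(scale_to_square(rotate_to_zero(resample(pts))))

def path_distance(a, b):
    """Average distance between corresponding points of two strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(candidate, templates):
    """Return (name, score) of the best template; score is in [0, 1]."""
    cand = normalize(candidate)
    best_name, best_d = None, float("inf")
    for name, tmpl in templates.items():
        d = path_distance(cand, normalize(tmpl))
        if d < best_d:
            best_name, best_d = name, d
    half_diag = 0.5 * math.sqrt(2 * SIZE ** 2)
    return best_name, 1.0 - best_d / half_diag
```

Because every stroke is normalized the same way, a candidate drawn at a different position, size, or starting rotation still lands near its template; recognition is a simple nearest-neighbor comparison, which is what keeps the recognizer to roughly 100 lines.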