This paper presents an American Sign Language (ASL) recognition system based on multi-dimensional
Hidden Markov Models (HMMs). A CyberGlove™
sensory glove and a Flock of Birds® motion tracker are used to extract the features of
ASL gestures. The data obtained from the strain gauges in the glove define the hand
shape, while the data from the motion tracker describe the trajectory of hand movement.
Our objective is to continuously recognize ASL gestures using these input devices in real
time. With the features extracted from the sensory data, we specify multi-dimensional
states for ASL signs in the HMM processor. The system achieves an average recognition
accuracy of 95% on the 26 letters of the alphabet and the 36 basic handshapes of ASL after it has been
trained with 8 samples. New gestures can be accommodated in the system through an interactive
learning processor. The developed system forms a sound foundation for continuous
recognition of full ASL signs.
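The recognition scheme described above, scoring an observed gesture against a bank of per-sign HMMs and picking the most likely model, can be sketched with the standard scaled forward algorithm. The model parameters, gesture names, and observation alphabet below are illustrative assumptions, not the paper's actual trained models:

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Scaled forward algorithm: log P(obs | model) for a discrete HMM.

    pi : (N,)   initial state probabilities
    A  : (N, N) transition matrix, A[i, j] = P(state j | state i)
    B  : (N, M) emission matrix,  B[i, k] = P(symbol k | state i)
    obs: sequence of observation symbol indices
    """
    alpha = pi * B[:, obs[0]]          # initialization step
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()               # scale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # recursion step
        c = alpha.sum()
        log_like += np.log(c)
        alpha /= c
    return log_like

def classify(models, obs):
    """Return the gesture whose HMM assigns obs the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(*models[g], obs))

# Two toy 2-state models over a 2-symbol alphabet (hypothetical gestures):
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
models = {
    "flat_hand": (pi, A, np.array([[0.9, 0.1], [0.9, 0.1]])),  # favors symbol 0
    "fist":      (pi, A, np.array([[0.1, 0.9], [0.1, 0.9]])),  # favors symbol 1
}
print(classify(models, [0, 0, 0, 0]))  # → flat_hand
```

In the real system the observation symbols would come from quantized glove and tracker features, and each sign would have its own trained parameter set.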

Received August 16, 2005; accepted January 17, 2006.
Communicated by Jhing-Fa Wang, Pau-Choo Chung and Mark Billinghurst.
*This research was partially supported by the National Science Foundation award (DMI-0079404) and the
Ford Foundation grant, as well as by the Intelligent Systems Center at the University of Missouri-Rolla.