abstract = "In this paper we present a framework for 3D hand
tracking and dynamic gesture recognition using a single
camera. Hand tracking is performed in a two-step
process: we first generate 3D hand posture hypotheses
using geometric and kinematic inverse transformations,
and then validate the hypotheses by projecting the
postures onto the image plane and comparing the projected
model with the ground truth using a probabilistic
observation model. Dynamic gesture recognition is
performed using a Dynamic Bayesian Network model. The
framework uses elements of soft computing to resolve
the ambiguity inherent in vision-based tracking: the
hand tracking module produces a fuzzy hand posture
output, and the gesture recognition module feeds back
potential posture hypotheses.",