This paper presents results of research on algorithms and computational models for real-time analysis of expressive gesture in full-body human movement. As the main concrete outcome of this work, we present a collection of algorithms and related software modules for the EyesWeb open architecture (freely available from www.eyesweb.org). These software modules, collected in the EyesWeb Expressive Gesture Processing Library, have been used in real scenarios and applications, mainly in the fields of performing arts, therapy and rehabilitation, museum interactive installations, and other immersive augmented reality and cooperative virtual environment applications. The work has been carried out at DIST – InfoMus Lab in the framework of the EU IST Project MEGA (Multisensory Expressive Gesture Applications, www.megaproject.org).