Music is a complex multimodal medium, experienced not only through sound but also through body movement. Musical instruments can be seen as technological objects coupled with a repertoire of gestures. We discuss technical and conceptual issues in the digital representation and mediation of body movement in musical performance. The paper reports on a case study of a musical performance in which motion-sensor technologies tracked the movements of the musicians while they played their instruments. The motion data were used to control the electronic elements of the piece in real time. We suggest that computable motion descriptors and machine-learning techniques are useful tools for interpreting motion data in a meaningful manner. However, qualitative insights into how human body movement is understood and experienced are needed to inform the further development of motion-capture technologies for expressive purposes. Musical performances thus provide an effective test bed for new modalities of human–computer interaction.
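The abstract does not specify which motion descriptors were computed; as an illustrative sketch only, a common low-level descriptor in this literature is a smoothed "quantity of motion" derived from accelerometer magnitude. The function below is a hypothetical minimal example (the function name, sample format, and smoothing constant are assumptions, not the authors' method) of how such a descriptor could be computed frame by frame and mapped to a real-time control signal.

```python
import math

def quantity_of_motion(samples, alpha=0.2):
    """Compute a smoothed 'quantity of motion' descriptor from
    3-axis accelerometer samples (iterable of (ax, ay, az) tuples).

    Applies an exponential moving average to the acceleration
    magnitude; larger values indicate more vigorous movement.
    This is an illustrative sketch, not the descriptor used in
    the case study.
    """
    qom = 0.0
    trace = []
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        # Exponential smoothing keeps the control signal stable
        # enough to drive electronic parameters in real time.
        qom = alpha * magnitude + (1 - alpha) * qom
        trace.append(qom)
    return trace

# Hypothetical sensor frames: stillness followed by a sudden gesture.
frames = [(0.0, 0.0, 0.0)] * 5 + [(1.0, 2.0, 2.0)] * 5
descriptor = quantity_of_motion(frames)
```

The resulting trace rises when the gesture begins, so each value can be scaled and sent to a synthesis or effects parameter as one frame of a real-time control stream.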