Recent advances in sensor technology provide new opportunities for applications that utilize the user's movement as an input source. This thesis focuses on movement analysis, which is a sub-area of a larger field concerned with the interpretation of sensor signals of human movement. Movement analysis can be used to provide feedback for training motor tasks, which is of interest for application areas such as rehabilitation, sports, and ergonomics.

Consciously controllable, goal-directed movements, which we call primary movements, lead to slight movements in other parts of the body that are beyond conscious control (secondary movements). Secondary movements arise from mechanical interaction with the environment and from physiological dependencies within the body. This thesis contributes methods to distinguish between primary and secondary movements, which is necessary to provide high-quality feedback for motor tasks that involve a significant amount of secondary movement. This can be the case when the secondary movements are large because of large reaction forces (e.g., when shooting a ball) or when the primary movements required to execute the task are small (e.g., in various forms of handcraft or musical instrument performance). Furthermore, a precise distinction between primary and secondary movement can be necessary to check whether the user keeps a part of the body still (e.g., as required by a gymnastic exercise). Beyond sensor-based feedback, our results can also improve current gesture recognition methods: by ignoring secondary movement in the sensor signal, a secondary movement is not misinterpreted as the execution of a gesture.

The effectiveness of the proposed methods is shown in the context of pianist arm movements, which are particularly challenging to analyze. The distinction between primary and secondary movement in one joint of the arm is based on the measured movement in that joint, an estimate of the key reaction force derived from MIDI data, and the movement in the other joints of the arm. In order to know on which arm the estimated key reaction force acts, it is necessary to determine which hand played a note. For that purpose, two methods are introduced: one is based on MIDI alone; the other combines data from inertial sensors with MIDI. A third method, based on computer vision and originally developed for sign language recognition, is evaluated here for tracking the pianist's hands.
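To illustrate the kind of inference a purely MIDI-based hand assignment can make, the following is a deliberately naive sketch. The pitch threshold, function name, and split-at-middle-C rule are assumptions made for illustration only, not the method developed in the thesis:

```python
# Hypothetical illustration of MIDI-based hand assignment.
# Assumption: notes below middle C are played by the left hand,
# notes at or above it by the right hand. Real keyboard music
# frequently violates this, which is why the thesis's MIDI-based
# method (and the inertial-sensor variant) goes beyond such a rule.

MIDDLE_C = 60  # MIDI note number of middle C


def assign_hand(pitch: int, threshold: int = MIDDLE_C) -> str:
    """Guess which hand played a note from its MIDI pitch alone."""
    return "left" if pitch < threshold else "right"


# A short example phrase: C3, E3, E4, C5 as MIDI note numbers.
notes = [48, 52, 64, 72]
hands = [assign_hand(p) for p in notes]
```

Even this toy rule shows why hand assignment matters for the force analysis: only after a note is attributed to a hand can the estimated key reaction force be applied to the correct arm.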

Based on the analysis methods, two pedagogical applications were developed. The first supports an existing piano-pedagogical movement notation and checks whether the player's movement conforms to the notated movement. A user study with piano students at a music university shows that potential users consider the system useful for technique training. The second application visualizes the sensor data and allows different performances of the same piece to be synchronized, making it easy to spot differences where closer examination may be beneficial.