This project investigates the relationship between cortical activity (electroencephalography, EEG), eye movements, and mental representation structures (Structural Dimensional Analysis of Mental Representation, SDA-M) as a combined measure of ongoing cognitive processes during human-machine interaction (HMI), an approach that has so far received little attention. We envision an interactive, dynamic, and adaptive system that facilitates and supports HMI in a way that is precisely tuned to a person's individual needs and therefore provides a natural and intuitive interaction experience. To this end, we will introduce a novel multi-modal brain-machine interface (BMI) for detecting and interpreting object- and task-related processing difficulties in interaction situations. This requires developing novel online classifier tools based on the integrated analysis of EEG and eye-movement data. The BMI will provide cues for distinguishing between relevant and irrelevant objects, as well as between objects that are easily processed and those that cause processing difficulties. The SDA-M method serves both as a diagnostic tool for identifying such cognitive processing difficulties and as a measure of learning. This approach allows HMI to be adapted based on these measures, providing feedback optimized in modality, level, and frequency (number of repetitions).
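To illustrate the general idea of an integrated EEG/eye-movement classifier, the following is a minimal sketch, not the project's actual pipeline: it fuses two hypothetical per-object features (an EEG evoked-response amplitude and a fixation duration) and trains a simple logistic-regression model to flag objects that cause processing difficulties. All feature names, distributions, and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: class 0 = easily processed objects,
# class 1 = objects causing processing difficulties. The assumed effect:
# difficult objects evoke larger EEG responses and longer fixations.
n = 200
y = rng.integers(0, 2, n)
eeg_amp = rng.normal(1.0 + 1.5 * y, 0.5, n)           # evoked amplitude (a.u.)
fix_dur = rng.normal(250 + 120 * y, 40, n)            # fixation duration (ms)
X = np.column_stack([eeg_amp, (fix_dur - 250) / 40])  # crude standardization

def train_logreg(X, y, lr=0.1, steps=500):
    """Batch gradient descent on the logistic loss."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)       # average gradient step
    return w

def predict(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)

w = train_logreg(X, y)
acc = (predict(w, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

In a real online system, the fused feature vector would be computed per fixated object from streaming EEG epochs and gaze events, and the classifier's output would drive the adaptive feedback described above; the fusion-by-concatenation shown here is only one of several possible integration strategies.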