Abstract

The objective of this work is to improve the efficacy, acceptance, adaptability, and overall performance of Human-Machine Interaction (HMI) applications using a context-based approach. In HMI, we aim to define a general human model that may lead to principles and algorithms enabling more natural and effective interaction between humans and artificial agents. This is paramount for applications in the field of Active and Assisted Living (AAL), where user acceptance is of vital importance for future solutions and remains one of the major reasons for reluctance to adopt cyber-physical systems. Our hypothesis is that we can overcome the limitations of current interaction functionalities by integrating contextual information, both to improve algorithm accuracy under widely varying conditions and to adapt interfaces and interaction patterns according to users' intentions and emotional states.