Abstract

Facial expressions form one of the most important and powerful communication systems of human social interaction. They express a large range of emotions but also convey more general communicative signals. To date, research has mostly focused on the static, emotional aspect of facial expression processing, using only a limited set of "generic" or "universal" expression photographs, such as a happy or sad face. The fact that facial expressions carry communicative functions beyond emotion, and that they convey meaning in the temporal domain, has so far been largely neglected. To enable a deeper understanding of facial expression processing that encompasses both the emotional and the communicative aspects of facial expressions in a dynamic context, it is essential to first construct a database containing such material, recorded in a well-controlled setup. Here we present the novel MPI facial expression database, which contains 20 native German participants performing 58 expressions based on pre-defined context scenarios, making it the most extensive database of its kind to date. Three experiments were performed to investigate the validity of the scenarios and the recognizability of the expressions. In Experiment 1, 10 participants were asked to freely name the facial expressions that the scenarios would elicit. The scenarios proved effective: 82% of the answers matched the intended expressions. In Experiment 2, 10 participants had to identify 55 expression videos of 10 actors. We found that 34 expressions could be identified reliably without any context. Finally, in Experiment 3, 20 participants had to group the 55 expression videos of 10 actors based on similarity. Of the 55 expressions, 45 formed consistent groups, which highlights the impressive variety of conversational expression categories we use.
Interestingly, none of the experiments found any advantage for the universal expressions, demonstrating the robustness with which we interpret conversational facial expressions.