How to do Multimodal Detection of Affective States?

I am presenting a short talk at the 42nd Conference of Research and Development (CIDTEC) of Tecnologico de Monterrey in Monterrey, Nuevo Leon, Mexico, in January 2012.

Summary

The human element is crucial to designing and implementing interactive intelligent systems, and therefore to instructional design. This document describes the detection of affective states and the devices, methodologies, and tools necessary to automate it. Automatic detection of affective states requires that the computer sense information that is complex and diverse, ranging from brain-wave signals and biofeedback readings to face- and gesture-based emotion recognition to posture and pressure sensing. Obtaining, processing, and understanding that information to create systems that improve learning requires several sensing devices (and their perceiving algorithms) along with supporting software tools. We provide enough information and examples to enable researchers to start their own investigations of the cognitive-affective elements of learning. Although we do not present an exhaustive list of all the methods available for collecting, manipulating, analyzing, and interpreting affective sensor data, this document describes the basis of a multimodal approach that researchers can use to launch their own research efforts.

This research was supported by the Office of Naval Research under Grant N00014-10-1-0143, awarded to Dr. Robert Atkinson, and by the National Science Foundation, Award 0705554 (IIS/HCC Affective Learning Companions: Modeling and Supporting Emotion During Teaching), awarded to Dr. Beverly Woolf and Dr. Winslow Burleson.