IEEE Transactions on Affective Computing
http://www.computer.org
The IEEE Transactions on Affective Computing is intended to be a cross-disciplinary and international archive journal aimed at disseminating results of research on the design of systems that can recognize, interpret, and simulate human emotions and related affective phenomena. The journal will publish original research on the principles and theories explaining why and how affective factors condition interaction between humans and technology, on how affective sensing and simulation techniques can inform our understanding of human affective processes, and on the design, implementation, and evaluation of systems that carefully consider affect among the factors that influence their usability. Surveys of existing work will be considered for publication when they propose a new viewpoint on the history of, and perspectives on, this domain.

IEEE Computer Society: list of recently published journal articles (Mon, 3 Nov 2014 15:35:40 GMT)
http://www.computer.org/tac
PrePrint: Affective Visual Perception using Machine Pareidolia of Facial Expressions
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2347960
This article presents a computer vision approach that can detect and classify abstract face-like patterns, including subliminal faces, within a scene. This can be regarded as a way of simulating the phenomenon of pareidolia, that is, the human tendency to ‘see faces’ in random structures such as clouds or rocks. The paper describes a system consisting of a component-based face detector and an expression classifier. The face detector creates a number of component images from the original image at different resolutions. A component image is a binary edge image whose edges are segmented into components using a labelling method with a border-following technique. The component images are then overlaid to produce a component height map in which large, notable components that persist across all resolutions have high values, while spurious and noisy components have low values. The method retains three shape components, representing two eyes and a mouth, whose height map values exceed a noise cut-off value. Support vector machines using scale-invariant feature vectors are applied to rank these three shape components by their geometry and size, and by their resemblance to human faces in the training data. The outcome is a facial expression analysis system that uses face components and can estimate an emotional expression value for a scene by producing an array of emotion scores corresponding to Ekman’s seven universal facial expressions of emotion. An advantage of this technique over holistic methods is that the face components are explicitly isolated. This supports a process of abstraction that can facilitate the detection of distorted and minimal face-like patterns.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2347960

PrePrint: Don’t Classify Ratings of Affect; Rank them!
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2352268
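As a rough illustration of the component height map idea in the machine-pareidolia abstract above: binary edge maps computed at several resolutions are overlaid, so structure that persists across scales accumulates a high value while one-off noise does not. This is a hypothetical sketch; the function names, the toy data, and the edge-detection and labelling details are assumptions, not the authors' code.

```python
import numpy as np

def component_height_map(edge_maps):
    """Sum per-resolution binary edge maps (all resized to a common
    grid) into a height map; structure stable across scales scores high."""
    return np.stack(edge_maps).astype(int).sum(axis=0)

def stable_mask(height_map, noise_cutoff):
    """Keep only pixels that persisted across enough resolutions to
    clear the noise cut-off."""
    return height_map > noise_cutoff

# Example: a 4x4 patch seen at three "resolutions"; only the top-left
# pixel is an edge at every scale, the bottom-right one is one-off noise.
maps = [np.zeros((4, 4)) for _ in range(3)]
for m in maps:
    m[0, 0] = 1          # persistent structure
maps[0][3, 3] = 1        # one-off noise
hm = component_height_map(maps)
print(hm[0, 0], hm[3, 3])                 # 3 1
print(stable_mask(hm, noise_cutoff=2)[0, 0])  # True
```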
How should affect be appropriately annotated, and how should machine learning best be employed to map manifestations of affect to affect annotations? These are the critical affective computing questions that this paper addresses by comparing the efficacy of affect modelling when fed by a number of different representations of affect annotations. In particular, we compare several binary-class and pairwise preference representations for automatically learning from ratings. The representations are compared and tested on three datasets. The first is a synthetic dataset that generates ratings based on a number of attributes. The other two datasets contain physiological and contextual user attributes, and speech attributes, respectively; these attributes are coupled with ratings of various affective and cognitive states. The main results of the paper suggest that ratings, when used, should be transformed to ordinal representations to obtain more reliable and generalisable models of affect. The findings of this paper have a direct impact on affect modelling but, most importantly, challenge the current state of practice in affective computing and psychometrics at large.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2352268

PrePrint: Intra-class Variation Reduction Using Training Expression Images for Sparse Representation Based Facial Expression Recognition
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2346515
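The ordinal-representation argument in “Don’t Classify Ratings of Affect; Rank them!” above can be illustrated with a minimal sketch that converts absolute ratings into pairwise preference pairs, i.e. ordinal relations instead of class labels. The function name and the margin parameter are assumptions for illustration, not the authors' implementation.

```python
def ratings_to_preferences(ratings, margin=0.0):
    """Return (i, j) index pairs where item i was rated higher than
    item j by more than `margin`; these pairs can train a ranker
    instead of a classifier."""
    pairs = []
    n = len(ratings)
    for i in range(n):
        for j in range(n):
            if i != j and ratings[i] - ratings[j] > margin:
                pairs.append((i, j))
    return pairs

# Example: three clips rated for arousal on a 1-5 scale.
prefs = ratings_to_preferences([4, 2, 5], margin=0.5)
# Each pair (i, j) reads "clip i is more arousing than clip j".
print(prefs)  # [(0, 1), (2, 0), (2, 1)]
```

A margin above zero discards near-ties, which is one simple way to keep only reliable ordinal relations.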
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2346515

PrePrint: Feature Extraction and Selection for Emotion Recognition from EEG
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2339834
Emotion recognition from EEG signals allows direct assessment of the “inner” state of a user, which is considered an important factor in Human-Machine Interaction. Many methods for feature extraction have been studied, and the selection of both appropriate features and electrode locations is usually based on neuroscientific findings. Their suitability for emotion recognition, however, has been tested on only a small number of distinct feature sets and on different, usually small, datasets. A major limitation is that no systematic comparison of features exists. Therefore, we review feature extraction methods for emotion recognition from EEG based on 33 studies. An experiment is conducted comparing these features using machine learning techniques for feature selection on a self-recorded dataset. Results are presented with respect to the performance of different feature selection methods, the usage of selected feature types, and the selection of electrode locations. Features selected by multivariate methods slightly outperform those selected by univariate methods. Advanced feature extraction techniques are found to have advantages over commonly used spectral power bands. Results also suggest a preference for electrode locations over the parietal and centro-parietal lobes.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2339834

PrePrint: More Personality in Personality Computing
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2341252
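For context on the spectral-power-band baseline the EEG survey above compares against, here is a minimal band power feature extractor. The band edges and names follow common EEG conventions and are not taken from the paper; the periodogram estimate is a deliberately simple stand-in for more careful spectral estimation.

```python
import numpy as np

# Conventional EEG frequency bands in Hz (an assumption, not the paper's).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Mean spectral power per EEG band via a plain periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Example: a 10 Hz (alpha-band) sine sampled at 128 Hz for 4 seconds
# should put most of its power into the alpha feature.
fs = 128
t = np.arange(fs * 4) / fs
features = band_powers(np.sin(2 * np.pi * 10 * t), fs)
print(max(features, key=features.get))  # alpha
```

One such dictionary per electrode and time window yields the kind of feature vector that the survey's feature selection methods then prune.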
By explicitly describing what has been done in the past, surveys implicitly outline what can (and sometimes should) be done in the future. The insightful commentary by Wright contributes significantly to this latter aspect, especially when it comes to aligning Personality Computing with the latest developments in Personality Science. This response article tries to make progress in that direction by discussing Wright’s suggestions from a computing science point of view.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2341252

PrePrint: Humans vs. Computers: Impact of Emotion Expressions on People’s Decision Making
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2332471
Recent research in perception and theory of mind reveals that people show different behavior, and lower activation of brain regions associated with mentalizing (i.e., the inference of others’ mental states), when engaged in decision making with computers rather than with humans. These findings are important for affective computing because they suggest that people’s decisions might be influenced differently depending on whether they believe the emotional expressions shown by computers are generated by algorithms or by humans. To test this, we had people engage in a social dilemma (Experiment 1) or a negotiation (Experiment 2) with virtual humans that were perceived to be either agents (i.e., controlled by computers) or avatars (i.e., controlled by humans). The results showed that such perceptions have a deep impact on people’s decisions: in Experiment 1, people cooperated more with virtual humans that showed cooperative facial displays (e.g., joy after mutual cooperation) than competitive displays (e.g., joy when the participant was exploited), but the effect was stronger with avatars (d = .601) than with agents (d = .360); in Experiment 2, people conceded more to angry than to neutral virtual humans, but again the effect was much stronger with avatars (d = 1.162) than with agents (d = .066). Participants also showed less anger towards avatars and formed more positive impressions of avatars than of agents.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2332471

PrePrint: Joint Attention Simulation using Eye-Tracking and Virtual Humans
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2335740
This article analyses the issues pertaining to the simulation of joint attention with virtual humans. Gaze represents a powerful communication channel, as illustrated by the pivotal role of joint attention in social interactions. To our knowledge, there have been only a few attempts to simulate the gazing patterns associated with joint attention as a means of developing empathic virtual agents. Eye-tracking technologies now enable non-invasive gaze-contingent systems that empower the user to lead a virtual human’s focus of attention in real time. Although gaze control can be deliberate, most of our visual behaviors in everyday life are not. This article reports empirical data suggesting that users have only partial awareness of controlling gaze-contingent displays. The technical challenges involved in detecting the user’s focus of attention in virtual reality are reviewed, and several solutions are compared. We designed and tested a platform for creating virtual humans endowed with the ability to follow the user’s attention. The article discusses the advantages of simulating joint attention for improving interpersonal skills and user engagement. Deficits in joint attention also play a major role in autism; the platform we designed is intended for research on and treatment of autism, and our tests included participants with this disorder.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2335740

PrePrint: Correcting Time-Continuous Emotional Labels by Modeling the Reaction Lag of Evaluators
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2334294
An appealing scheme for characterizing expressive behaviors is the use of emotional dimensions such as activation (calm versus active) and valence (negative versus positive). These descriptors offer many advantages for describing the wide spectrum of emotions. Given the continuous nature of fast-changing expressive vocal and gestural behaviors, it is desirable to track these emotional traces continuously, capturing subtle and localized events (e.g., with FEELTRACE). However, time-continuous annotations introduce challenges that affect the reliability of the labels. In particular, an important issue is the evaluators’ reaction lag caused by observing, appraising, and responding to the expressive behaviors. An empirical analysis demonstrates that this delay varies from one to six seconds, depending on the annotator, expressive dimension, and actual behaviors. This paper proposes to compensate for this reaction lag by finding the time-shift that maximizes the mutual information between the expressive behaviors and the time-continuous annotations. The approach is implemented by making different assumptions about the evaluators’ reaction lag; our experiments show accuracy improvements even with fixed delays (1-3 seconds). The benefits of compensating for the delay are demonstrated with emotion classification experiments. On average, classifiers trained with facial and speech features show more than 7% relative improvement over baseline classifiers trained and tested without shifting the time-continuous annotations.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2334294

PrePrint: Ambulatory Assessment of Affect: Survey of Sensor Systems for Monitoring of Autonomic Nervous System’s Activation in Emotion
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2332157
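The lag-compensation idea in “Correcting Time-Continuous Emotional Labels by Modeling the Reaction Lag of Evaluators” above can be sketched as a search over time-shifts that maximizes the mutual information between a behavioral feature stream and the annotation trace. The histogram-based mutual information estimate, the bin count, and the search range are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information between two signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0  # skip empty cells to avoid log(0)
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def best_lag(feature, annotation, max_lag):
    """Shift the annotation back by 0..max_lag frames and return the
    lag whose alignment shares the most information with the feature."""
    scores = [mutual_information(feature[: len(feature) - lag],
                                 annotation[lag:])
              for lag in range(max_lag + 1)]
    return int(np.argmax(scores))

# Example: an annotation that is a 5-frame-delayed copy of the feature.
rng = np.random.default_rng(0)
feat = rng.normal(size=300)
annot = np.concatenate([np.zeros(5), feat[:-5]])
print(best_lag(feat, annot, max_lag=10))  # 5
```

In practice the recovered lag would be used to shift the annotations before training, which is what the abstract's classification experiments evaluate.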
Advances in miniaturized computing, storage, and communication resources for personal wearable electronic devices, as well as the availability of diverse sensors for state assessment, enable the development of a wide variety of wearable Body Area Network (BAN) systems for psychophysiological measurements. These systems pave the way for the acquisition of quality data relevant for research studies on, amongst others, autonomic nervous system (ANS) activation in emotion. We present a high-level overview of BANs and their features, and we review 173 publications that report research studies on emotion activation, and in particular 15 BAN systems employed in these studies. We discuss each BAN in terms of its capacity for ambulatory, i.e., out-of-the-laboratory, assessment of ANS activation in emotion. Finally, we highlight the design challenges to be addressed to make BAN systems effective for a wide range of applications that support users’ wellbeing and overall Quality of Life. This paper provides those interested in (ambulatory) assessment of ANS activation with knowledge of the set of systems currently used in research, and it aims to highlight opportunities for scientists and practitioners in, amongst others, the affective computing domain, enabling them to reflect upon their BAN requirements and study designs.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2332157

PrePrint: A Survey of Personality Computing
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2330816
Personality is a psychological construct aimed at explaining the wide variety of human behaviors in terms of a few stable and measurable individual characteristics. In this respect, any technology involving understanding, prediction, and synthesis of human behavior is likely to benefit from Personality Computing approaches, i.e., from technologies capable of dealing with human personality. This paper surveys such technologies and aims to provide not only a solid knowledge base about the state of the art, but also a conceptual model underlying the three main problems addressed in the literature, namely Automatic Personality Recognition (inference of the true personality of an individual from behavioral evidence), Automatic Personality Perception (inference of the personality that others attribute to an individual based on her observable behavior), and Automatic Personality Synthesis (generation of artificial personalities via embodied agents). Furthermore, the article highlights the issues still open in the field and identifies potential application areas.
http://doi.ieeecomputersociety.org/10.1109/TAFFC.2014.2330816