Abstract
Event-related brain potentials were measured in 7- and 12-month-old infants to examine the development of processing happy and angry facial expressions. In 7-month-olds a larger negativity to happy faces was observed at frontal, central, temporal and parietal sites (Experiment 1), whereas 12-month-olds showed a larger negativity to angry faces at occipital sites (Experiment 2). These data suggest that processing of these facial expressions undergoes development between 7 and 12 months: while 7-month-olds exhibit heightened sensitivity to happy faces, 12-month-olds resemble adults in their heightened sensitivity to angry faces. In Experiment 3 infants’ visual preference was assessed behaviorally, revealing that the differences in ERPs observed at 7 and 12 months do not simply reflect differences in visual preference.

1. Introduction
Facial expressions are an important way to communicate emotions in social interactions (Izard, 1991). Recognizing facial expressions permits us to detect another person’s emotional state and provides cues on how to respond in these social situations. Facial expressions are central to non-verbal social exchange as markers of (a) internal states and (b) intentions (Schupp et al., 2004). For example, across cultures the internal state of anger is externally expressed as frowning brows, staring eyes, and a shut mouth with tense lips (Ekman & Friesen, 1975), which in turn signals readiness for physical or symbolic attack on an observer. Because angry faces signal potential negative consequences to the observer, they are regarded as threatening. This dual role of facial expressions also applies to happy expressions, which are considered friendly faces. In the current study, the developmental course of processing happy (friendly) and angry (threatening) facial expressions was examined.
Previous research with adults has revealed that negative events generally evoke stronger and more rapid physiological, cognitive, emotional, and social responses than do neutral or positive events (Taylor, 1991). Consistent with these findings, adults detect angry schematic faces more readily than happy schematic faces (Öhman, Lundqvist, & Esteves, 2001). Also, fear conditioning to angry faces is more resistant to extinction than fear conditioning to happy faces (Öhman & Mineka, 2001). From an evolutionary perspective, it has been argued that because it is difficult to reverse the consequences of an injurious or fatal assault, the process of natural selection may have resulted in a propensity to react more strongly to negative than to positive stimuli (Cacioppo & Berntson, 1999). This heightened sensitivity to negative information, termed ‘negativity bias,’ is a reliable psychological phenomenon in adults (Cacioppo, Gardner, & Berntson, 1999).
In accordance with the notion of a negativity bias, a recent event-related brain potential (ERP) study (Schupp et al., 2004) provides evidence that adults show an enhanced neural processing of angry facial expressions. In this study adults were shown angry, happy, and neutral faces. Angry faces compared to happy and neutral faces were found to elicit an Early Posterior Negativity (EPN) with a maximum over occipital sites peaking around 300 ms after stimulus onset. The EPN has been shown to appear uniformly to evolutionarily significant stimuli as a negative shift in the waveform, indicating emotional enhancement of sensory processing in the visual cortex (Junghöfer et al., 2001 and Schupp et al., 2003). Similar ERP effects are observed when participants have to direct their attention selectively to particular features (e.g., color or orientation) of a visual stimulus (Michie et al., 1999). This has been taken as support for the hypothesis that sensory encoding of evolutionarily significant affective stimuli is enhanced by ‘natural’ selective attention (Lang, Bradley, & Cuthbert, 1997).
It has been suggested that the amygdala might be one particularly relevant structure of a distributed network that is devoted to determining the significance of external stimuli and regulating the enhancement processes at a sensory cortical level (Davis and Whalen, 2001 and Pessoa et al., 2002). Although amygdala activation has been most commonly reported in response to facial expressions of fear (Davis & Whalen, 2001) there is evidence suggesting that it is also responsive to facial expressions of anger (Adolphs et al., 1995, Morris et al., 1998 and Whalen et al., 2001). The enhancement of sensory processing in the visual cortex by the amygdala might rely on direct projections, connections to anterior attention networks, or ascending neuromodulatory systems (Davis & Whalen, 2001). However, more direct evidence is needed to support any of these hypothesized mechanisms.
From a developmental perspective, it is well established that infants reliably discriminate a range of static facial expressions of emotion (e.g., Caron et al., 1985, LaBarbera et al., 1976, Serrano et al., 1995, Striano et al., 2002 and Young-Browne et al., 1977). For example, behavioral studies have shown that by 3 months of age, infants visually discriminate happy from angry facial expressions (Barrera & Maurer, 1981). Then, at 7 months of age, infants detect similarity of happy faces over changing identities and discriminate this category of happy expressions from angry expressions (Kestenbaum & Nelson, 1990). It is not until 10 months of age (Ludemann, 1991) that infants are able to categorize expressions more generally according to affective tone (positive versus negative). By 12 months of age, human infants use others’ positive and negative facial expressions to disambiguate uncertain situations and to regulate their behavior accordingly (Baldwin and Moses, 1996 and Sorce et al., 1985).
While behavioral measures have revealed much about which emotions infants can discriminate, the processes underlying these abilities are much less understood. Researchers have begun to measure ERPs to provide information about the ongoing neurocognitive processes that occur while an infant is responding to an event rather than measuring only the final behavioral response.
In an ERP study examining infants’ processing of pictures of facial expressions (Nelson & de Haan, 1996), 7-month-olds watched happy versus fearful faces in a first experiment. Results revealed that fearful faces elicited an enhanced negative component (Nc) peaking around 500 ms. The Nc has its maximum at frontal and central sites and has been thought of as an obligatory attentional response sensitive to stimulus familiarity (Courchesne et al., 1981 and Snyder et al., 2002). Dipole modeling has revealed that the cortical sources of the Nc can be localized in the anterior cingulate and other prefrontal regions (Reynolds & Richards, 2005). Therefore, it has been argued that the 7-month-old infants in Nelson and de Haan’s study allocated more attentional processing resources to the unfamiliar fearful than to the familiar happy expression, as indicated by an enhanced Nc. In a second experiment, Nelson and de Haan showed another group of 7-month-olds fearful versus angry faces, and there was no difference in their ERP response. One possibility is that infants in Nelson and de Haan’s (1996) second experiment, although able to discriminate between the angry and fearful expressions, did not display different brain responses to the two expressions because they perceived the signal value of both expressions as “negative.” Another plausible explanation is that infants perceived both expressions as equally unfamiliar and therefore the Nc, which is sensitive to the familiarity of a stimulus, did not differ between conditions.
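For readers unfamiliar with how an ERP component such as the Nc is quantified, the basic computation can be sketched as follows: single-trial epochs are averaged per condition, and the peak negativity is located within a time window of interest. This is a minimal illustration only; the sampling rate, trial counts, and amplitudes below are invented and are not taken from Nelson and de Haan (1996).

```python
import numpy as np

rng = np.random.default_rng(0)
sfreq = 250                                # samples per second (assumed)
times = np.arange(-0.1, 0.9, 1 / sfreq)    # -100 ms to 900 ms around stimulus onset

def simulate_epochs(n_trials, nc_amplitude):
    """Synthetic single-trial EEG: Gaussian noise plus a negative
    deflection peaking near 500 ms (the Nc latency cited in the text)."""
    nc = nc_amplitude * np.exp(-((times - 0.5) ** 2) / (2 * 0.05 ** 2))
    return nc + rng.normal(0.0, 2.0, size=(n_trials, times.size))

# A larger (more negative) Nc to the unfamiliar expression, as reported
# by Nelson and de Haan (1996); microvolt amplitudes here are invented.
erp_fearful = simulate_epochs(30, nc_amplitude=-8.0).mean(axis=0)
erp_happy = simulate_epochs(30, nc_amplitude=-4.0).mean(axis=0)

# Peak Nc amplitude per condition within a 300-700 ms window
window = (times >= 0.3) & (times <= 0.7)
peak_fearful = erp_fearful[window].min()
peak_happy = erp_happy[window].min()
print(peak_fearful < peak_happy)  # prints True: fearful Nc is more negative
```

Averaging across trials suppresses activity that is not time-locked to the stimulus, which is why the condition difference survives despite single-trial noise that is much larger than the component itself.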
In the current research, we attempted to extend Nelson and de Haan’s (1996) findings by comparing infants’ responses to happy and angry faces. While behavioral research with infants points to an early ability to discriminate happy and angry facial expressions, little is known about how the neural processing of these facial expressions, as revealed by ERP, might develop during infancy.
To examine the development of processing of happy and angry facial expressions, 7-month-old infants (Experiment 1) and 12-month-old infants (Experiment 2) were tested using ERP measures. The relation between the ERP response and infants’ looking behavior is yet to be determined. Therefore, in a behavioral experiment (Experiment 3), 7- and 12-month-old infants’ discrimination of happy and angry facial expressions was further examined using a preferential looking paradigm.