Katherine M. Becker¹, Donald C. Rojas¹; ¹Colorado State University

Emotion perception occurs via the simultaneous integration of affective vocal and facial information. However, it is unclear how these modalities interact and influence perception when they are integrated. Humans possess a specialized network of neural structures dedicated to the recognition of facial expressions, distinct from the brain regions devoted to face detection and prosody recognition. This study examined the neural correlates of audiovisual emotion perception by recording electroencephalography (39 electrodes) from 26 participants. Prosodic stimuli consisted of emotional vocalizations of the vowel /a/ produced in neutral, angry, and happy tones. Face stimuli were created by morphing images of an actor portraying happy and angry expressions, yielding a continuum of faces that varied from 100% happy to 100% angry. These stimuli were combined into seven conditions: three bimodal (face and voice), three voice-only (one per prosody), and one face-only. Both the bimodal happy (310–355 ms) and bimodal angry (345–365 ms) conditions exhibited a greater positivity than the neutral condition over right motor areas. The happy condition exhibited a greater negativity than the neutral condition over frontal (480–680 ms) and centro-parietal areas (500–680 ms). Bimodal angry stimuli showed a larger negativity than happy stimuli bilaterally over occipital areas (100–440 ms). Face-only stimuli exhibited a greater negativity than the happy condition over right motor and parietal areas (760–820 ms); conversely, angry stimuli showed a greater positivity than face-only stimuli over left parietal (180–220 ms) and bilateral occipito-parietal areas (220–290 ms). These findings indicate that occipito-parietal, frontal, and motor areas are involved in emotion perception.
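
For readers unfamiliar with morphed-face stimuli, the sketch below illustrates the general idea of a continuum between two expression endpoints as a per-pixel weighted blend. This is a minimal illustration, not the stimulus-generation procedure used in this study (published face morphs typically also warp facial landmarks before blending); the file names and the five-step spacing are assumptions for illustration only.

```python
# Minimal sketch (not the authors' pipeline): build a happy-to-angry
# continuum by linearly blending two endpoint images. Real face morphing
# also warps facial landmarks; this shows only the weighted-blend step.
# File names and the number of morph levels are hypothetical.
import numpy as np
from PIL import Image

happy = np.asarray(Image.open("happy_100.png"), dtype=np.float64)
angry = np.asarray(Image.open("angry_100.png"), dtype=np.float64)

# Morph weights from 100% happy (w = 0.0) to 100% angry (w = 1.0).
for w in np.linspace(0.0, 1.0, num=5):
    morph = (1.0 - w) * happy + w * angry  # per-pixel weighted blend
    out = Image.fromarray(np.uint8(np.clip(morph, 0, 255)))
    out.save(f"morph_angry_{int(w * 100):03d}.png")
```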