Music can be thought of as a form of emotional communication, with which the performer conveys an emotional state to the listener. This “language” is remarkably powerful – it can evoke strong emotions, and make your heart race or send tingles down your spine. And it is universal – the emotional content of a piece of music can be understood by anyone, regardless of cultural background.

Are the emotions evoked by a piece of music similar to, and can they influence, other emotional experiences? The answers to these questions are unclear. But a new study, which has just been published in Neuroscience Letters, provides both behavioural and physiological evidence that the emotions evoked by music can be transferred to the sense of vision, and can influence how the emotions in facial expressions are perceived.

For their study, Nidhya Logeswaran and Joydeep Bhattacharya, of Goldsmiths College in London and the Austrian Academy of Sciences, respectively, performed two separate experiments. In the first, 30 participants were presented with a series of happy or sad musical excerpts, each lasting 15 seconds. After each piece of music, the participants were shown a photograph of a face bearing either a happy, sad, or neutral expression. The photographs were flashed on a screen for 1 second, after which the participants were asked to rate the emotion on a 7-point scale, where 1 denotes extremely sad and 7 extremely happy.

Thus, the visual emotional stimuli – the photos of faces – were “primed” by an emotional state conveyed by a piece of music. All the participants correctly identified the emotions expressed by the faces in the photographs presented to them. However, happy faces primed by a happy piece of music were rated as happier than when primed by sad music. Conversely, sad faces primed by a piece of sad music were rated as sadder than those primed with a happy piece of music. Finally, neutral faces were rated higher when primed by a happy piece of music and lower when primed by a sad piece.

The size of the priming effect for neutral faces was found to be almost twice that of the effect for happy and sad faces. This may be because neutral faces contain less information than those expressing one emotion or the other, and hence are somewhat ambiguous. We know that the brain integrates information from different senses to construct representations of the external and internal worlds; thus, in the absence of relevant visual information, it may become more reliant on information from other senses when generating these representations.

The second experiment, involving 15 different participants, was designed in the same way, but this time, Logeswaran and Bhattacharya used electroencephalography (EEG) to record event-related potentials – the electrical activity in the brain associated with perception of the auditory and visual stimuli presented. A number of earlier EEG studies have shown that our emotional responses to music are associated with distinct patterns of brain activation – a happy piece of music, or one that we like, causes an increase in activity recorded from electrodes lying over the left frontal and temporal lobes, while pieces of music which evoke negative emotions are associated with increased activity in the right frontal and temporal lobes. A similar lateralized effect has also been observed for emotionally significant visual stimuli.

In their analyses, the researchers compared pairs of conditions in which photographs of the same facial emotion were primed differently. In both conditions, a signal associated with the later stages of face processing, called the face-specific N170 component, was measured from electrodes overlying the occipital and parietal lobes; activity appeared initially at frontal and central electrodes, followed by more activity at electrodes located toward the back of the skull. However, a difference was observed between conditions in which neutral faces were primed by a happy piece of music and those in which they were primed by sad music. The former condition, but not the latter, was associated with a significant effect in frontal and central electrodes during the 50-150 millisecond time window.

This study demonstrates that the music played to the participants influenced the way in which they perceived the emotional content of the faces shown immediately afterwards, such that the emotional rating of the faces in the photographs was biased toward the direction of the emotions expressed in the music. It also supports the earlier finding that happy and sad emotions evoked by music produce different physiological responses in the brain, and suggests that “binding” of auditory and visual emotional cues occurs at an early stage of neural processing.

This is one of several recent studies which demonstrate that activity in one sensory system can modulate activity in another. Last month, researchers from MIT showed that an optical illusion called the motion after-effect can be transferred to the sense of touch, and just before that, Italian researchers showed that the sense of touch can be influenced by how we perceive others. It was already known that music can influence the perception of emotions in visual stimuli when presented simultaneously, but this new study is the first to show that the emotions evoked by music can affect the perception of emotional content in visual stimuli presented afterwards. These new findings also suggest that emotional processing takes place outside of conscious awareness, rather than being based on judgements and decisions.

Comments

Hi Moscot, I think emotion researchers would not be surprised by these findings. Mood induction affects the perception of emotional stimuli, whether this be the identification of emotional states in faces, stories, videos, or even music. As you know, this is part of the standard experimental protocol in affect research. What I found intriguing is the authors’ idea that this represents some sort of cross-sensory transfer. I guess it may be an issue of semantics, but I see this as a simple effect of mood induction. Participants experienced a mood induction paradigm (listening to a sad song, for example) and then displayed the expected priming bias (rating sad faces sadder). The path is music -> mood induction -> bias. Thus mood induction is the mediator. It seems from the summary that the authors discussed a more direct pathway of auditory perception directly affecting visual perception, but I’m not sure I see the need for such extrapolation when the data are so consistent with standard affect induction paradigms. Thoughts?

As someone who writes music for film, I can easily say that composers in Hollywood have known this for a long time. The right score can enhance the emotion of a scene or an actor’s performance, while the wrong one can ruin it.

Frequently, I’ll throw up the “wrong” piece of music against a scene just to see how it would change it. A lot of the time it’ll work, but will create a bizarre shift in the tone of the scene. There’re even some great parodies on YouTube that exploit this phenomenon.

When I was a drama student, we were shown two movie clips which included a close-up of a face. In the first, the camera focused on a plate of delicious looking roast dinner, and then panned to a face. In the second, it focused on a salad with a slug in it, then onto the face. We were asked to discuss the way the actor had managed to convey his reactions in each instance. The lecturer had told us that the lecture was about displaying emotion in film as opposed to stage (bearing in mind that a raised eyebrow in a close up might travel more than a metre on the screen).

In fact it was to demonstrate audience suggestibility. The close up on the face was exactly the same one in both cases, spliced seamlessly into the clips.

I don’t think it comes as a surprise that context matters as much as content. Teemu Arina has been saying this since he was about 16 (okay, he’s only about 26 now, but still!)

Interesting research, but this from your introduction seems a strong claim:

“[Music] is universal – the emotional content of a piece of music can be understood by anyone, regardless of cultural background.”

This is evidently false.

Two people can have strong, orthogonal emotional reactions to the same piece of music, and often do. If they didn’t, music reviews would be largely the same. I’ve walked into organ recitals and been completely bamboozled by a very alien sequence of (to me) apparently unrelated notes. Whenever any new music genre is introduced, people line up to point out how devoid of emotion it is, whereas those within the culture are strongly affected by it.

Balkan folk. Indian classical music. Skiffle. Difficult for an outsider to comprehend, full of ‘emotional’ expression.

@Nestor: The findings aren’t especially surprising – nothing about the brain surprises me any more – but I do find them interesting, and that’s why I wrote about them. Of course mood induction occurs, but I don’t see how this precludes cross-modal transfer. The way I see it, the fact that music influences the perception of faces necessitates a link between the auditory and visual systems. Furthermore, a growing number of studies show the importance of multisensory integration in the perception of external and internal stimuli (see the links in the last paragraph).

@WlasticPlanet: Fascinating! I’d love to see some of the YouTube vids you mention.

@Karyn: Are you sure what you describe is called the Forer effect? According to Wikipedia, the Forer effect is “the observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people.” Besides, although the effect you describe is similar, it involves two visual stimuli.

“And it is universal – the emotional content of a piece of music can be understood by anyone, regardless of cultural background.” Western Music, maybe. But this is a false claim if compared to some world musics.

Take the Qur’an, for example. When it is sung, anybody who lacks knowledge of its cultural and religious meaning would consider it music; in fact it is a reading or calling, and in Islam it is never considered music.

I wonder if the accuracy of characterizing an emotion within a facial expression is secondarily influenced by personality; that is, pessimists have greater accuracy when primed through sad music, and optimists have greater accuracy when primed through happy music? Just a thought.

Well, if music induces emotional states that consequently skew perceptions, it follows that aggression and hostility may in part be consequent to aggressive or hostile music. So can we extrapolate these findings to the negative effect of, say, death metal music on relational behaviour?

I would have thought that that was a fairly narrow application of the term Forer effect, but I am not about to argue with wikipedia.

My point is that interpretation is dependent on context. In your example, visual interpretation was affected by auditory context. In mine, it was impacted by visual context. Forer’s was also based on suggestibility due to context. He conducted nonsense personality tests on people and then presented them all with exactly the same results. A high proportion of them declared them to be accurate. Because of the pseudo-scientific context, they read into the results that which was not there to be seen.

In his workshops, Itiel Dror gets delegates to create two graphic representations of a single set of data, one to be used to support each of two diametrically opposing views. It can be done. To an extent, learning designers may do it fairly often… possibly even unwittingly.

Is this not something successfully exploited by spin doctors and propagandists?

These results indicate that it is not music that changes our perception directly.

Why? The main changes that appear in the brain elicit a specific mood. Music and other stimuli (e.g. money) produce changes in the reward system and in other brain structures that govern our cognition (the frontal cortex) and, in turn, our reactions. So the results of these studies are further proof that emotions themselves can change our estimation of the situations around us.

In this way, the evaluation of the photos can easily be explained. We can also suppose that, after listening to optimistic music, people can be more “objective”, as the left frontal cortex (the more activated in this case) is responsible for analytic thinking.

Moreover, some time ago Davidson and Fox discovered that the left hemisphere is responsible for positive emotions and the right for negative ones, so this part was also nothing new.

It is said that “heavy” music (with a strong rhythm) can lower IQ, but I have never read the full article about it, just a note.

So, wybory, does listening to Stravinsky or to Japanese Kyoto drummers make you dumb? Rhythm is the basis of music, yet music has been shown to increase intelligence.

As a musician, I agree that music is a universal language. It is musical “taste” or preferences that differ. I prefer heavy brass pieces in classical music (and yes, I also like hard rock) while my husband would prefer that I played a quieter instrument than the trombone. Can he feel the heartbreak in Shostakovich’s fifth? Most definitely, although he prefers not to.

While I have no doubt that music can trigger an emotional response, I wonder about the assumption that music is inherently emotional that a lot of folks (and a lot of studies) make. As the article you cited to Alex points out, this association of music and emotion is not true across cultures (even though, as the study finds, the emotional content of Western music can be identified, if identification is asked for, by non-Westerners). So, my question is: do you know of any work that investigates the specific processes through which musical stimuli are translated into emotional responses in listeners?

As a composer, I’m particularly interested in how this process might work.

Music is so important in our everyday life, and I totally agree it plays a very important role in how we perceive something. I just think about seeing Jack Nicholson’s smile in a movie. For the exact same smile, a comedy and a horror movie won’t play the same music, and it can really help create the intended feeling for the viewer. It may not be the perfect example, but I really think music helps focus the viewer on what you want to tell him, in movies, games, and theatre. There are a lot of applications in real life too. Very interesting article!

“It was already known that music can influence the perception of emotions in visual stimuli when presented simultaneously”

I am currently in the process of creating my Master’s Thesis and would greatly benefit from reading articles which could be cited supporting the above claim. If you have any information about such research please let me know. Thank you,
Tyler
