Music can be thought of as a form of emotional communication, with which the performer conveys an emotional state to the listener. This “language” is remarkably powerful – it can evoke strong emotions, and make your heart race or send tingles down your spine. And it is universal – the emotional content of a piece of music can be understood by anyone, regardless of cultural background.
Are the emotions evoked by a piece of music similar to, and can they influence, other emotional experiences? The answer to these questions is unclear. But a new study, which has just been published in Neuroscience Letters, provides both behavioural and physiological evidence that the emotions evoked by music can be transferred to the sense of vision, and can influence how the emotions in facial expressions are perceived.
For their study, Nidhya Logeswaran and Joydeep Bhattacharya, of Goldsmiths College in London and the Austrian Academy of Sciences, respectively, performed two separate experiments. In the first, 30 participants were presented with a series of happy or sad musical excerpts, each lasting 15 seconds. After each piece of music, the participants were shown a photograph of a face with a happy, sad, or neutral expression. The photographs were flashed on a screen for 1 second, after which the participants were asked to rate the emotion on a 7-point scale, where 1 denotes extremely sad and 7 extremely happy.
Thus, the visual emotional stimuli – the photos of faces – were “primed” by an emotional state conveyed by a piece of music. All the participants correctly identified the emotions expressed by the faces in the photographs presented to them. However, happy faces primed by a happy piece of music were rated as happier than when primed by sad music. Conversely, sad faces primed by a piece of sad music were rated as sadder than those primed with a happy piece of music. Finally, neutral faces were rated higher when primed by a happy piece of music and lower when primed by a sad piece.
The size of the priming effect for neutral faces was found to be almost twice that of the effect for happy and sad faces. This may be because neutral faces contain less information than those expressing one emotion or the other, and hence are somewhat ambiguous. We know that the brain integrates information from different senses to construct representations of the external and internal worlds; thus, in the absence of relevant visual information, it may become more reliant on information from other senses when generating these representations.
The second experiment, involving 15 different participants, was designed in the same way, but this time, Logeswaran and Bhattacharya used electroencephalography (EEG) to record event-related potentials – the electrical activity in the brain associated with perception of the auditory and visual stimuli presented. A number of earlier EEG studies have shown that our emotional responses to music are associated with distinct patterns of brain activation – a happy piece of music, or one that we like, causes an increase in activity recorded from electrodes lying over the left frontal and temporal lobes, while pieces of music which evoke negative emotions are associated with increased activity in the right frontal and temporal lobes. A similar lateralized effect has also been observed for emotionally significant visual stimuli.
In their analyses, the researchers compared pairs of conditions in which photographs of the same facial emotion were primed differently. In both conditions, the faces evoked activity initially at frontal and central electrodes, followed by more activity at electrodes located toward the back of the skull, including the face-specific N170 component – a signal associated with the later stages of face processing, measured from electrodes overlying the occipital and parietal lobes. However, a difference was observed between conditions in which neutral faces were primed by a happy piece of music and those in which they were primed by sad music. The former condition, but not the latter, was associated with a significant effect in frontal and central electrodes during the 50-150 millisecond time window.
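The windowed comparison above can be sketched in a few lines. This is a synthetic toy example, not the study's actual analysis pipeline: the epoched data, trial counts, and signal shapes are all invented, and only the general logic – average epochs into ERPs, then compare mean amplitude within the 50-150 ms window – reflects the approach described.

```python
import numpy as np

# Assumed setup: epoched EEG as (trials, samples) arrays at 1000 Hz,
# time-locked to face onset. All data below are synthetic.
fs = 1000
t = np.arange(0, 0.5, 1 / fs)  # 0-500 ms after face onset

rng = np.random.default_rng(0)
early_peak = np.exp(-((t - 0.1) ** 2) / 0.001)  # toy response peaking ~100 ms
# Invented conditions: a larger early response when primed by happy music
happy_primed = 2.0 * early_peak + rng.normal(0, 0.5, (30, t.size))
sad_primed = 0.5 * early_peak + rng.normal(0, 0.5, (30, t.size))

# Averaging across trials yields the event-related potential per condition
erp_happy = happy_primed.mean(axis=0)
erp_sad = sad_primed.mean(axis=0)

# Compare mean amplitude within the 50-150 ms window of interest
window = (t >= 0.05) & (t <= 0.15)
diff = erp_happy[window].mean() - erp_sad[window].mean()
print(diff > 0)  # prints True: the happy-primed ERP is larger in this window
```

In a real analysis this amplitude difference would then be tested statistically across participants rather than simply inspected.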
This study demonstrates that the music played to the participants influenced the way in which they perceived the emotional content of the faces shown immediately afterwards, such that the emotional rating of the faces in the photographs was biased toward the direction of the emotions expressed in the music. It also supports the earlier finding that happy and sad emotions evoked by music produce different physiological responses in the brain, and suggests that “binding” of auditory and visual emotional cues occurs at an early stage of neural processing.
This is one of several recent studies which demonstrate that activity in one sensory system can modulate activity in another. Last month, researchers from MIT showed that an optical illusion called the motion after-effect can be transferred to the sense of touch, and just before that, Italian researchers showed that the sense of touch can be influenced by how we perceive others. It was already known that music can influence the perception of emotions in visual stimuli when presented simultaneously, but this new study is the first to show that the emotions evoked by music can affect the perception of emotional content in visual stimuli presented afterwards. These new findings also suggest that emotional processing takes place outside of conscious awareness, rather than being based on judgements and decisions.
- How we perceive others influences our sense of touch
- The waterfall illusion can be transferred between vision and touch
Logeswaran, N. & Bhattacharya, J. (2009). Crossmodal transfer of emotion by music. Neurosci. Lett. 455: 129-133. DOI: 10.1016/j.neulet.2009.03.044.