Events

Past Event

Matthew Sachs - Spatial and Temporal Patterns of Brain Activity Associated With Emotions in Music

October 11, 2019
3:00 PM - 4:00 PM
Columbia University, 622 Dodge Hall, 2960 Broadway, New York, NY 10027

At the colloquium organized by Columbia University's Department of Music, Dr. Matthew Sachs will present the results of three studies that focus on how the brain represents emotions and feelings in response to music. 

Speaker:
Matthew Sachs, Presidential Scholar in Society and Neuroscience, Columbia University

Abstract: The ability to both perceive and experience emotions in response to music underlies its universality and ubiquity across cultures and time. This ability also makes music a useful tool for uncovering how the brain represents affective experiences. In a series of three studies, I employ data-driven, multivariate statistical techniques to capture patterns of neural information in response to musical stimuli. In Study 1, I show that neural patterns in the auditory, somatosensory, and insular cortices represent specific categories of emotions perceived through music and that these representations extend to non-musical stimuli as well. In the next two studies, I shift from perception to experience, focusing on the emergence of enjoyment in response to sad pieces of music, in which the emotion perceived by the listener may not match the emotion that is felt. In Study 2, I show that people who find sad music enjoyable tend to score higher on a specific sub-trait of empathy called Fantasy. In Study 3, I build on these findings with a neuroimaging study in which participants listened to a full-length piece of sad music. Using a data-driven approach to assess synchronization of brain activity across participants, I show that, while listening to sad music, high-Fantasy individuals have greater synchronization in regions of the brain involved in processing emotions and simulating the emotions of others. Furthermore, when evaluating synchronization dynamically, increased enjoyment of the piece of music predicted similar patterns of activity across people in the basal ganglia, orbitofrontal, and auditory cortices. The results presented across these three studies provide a more nuanced understanding of the spatial and temporal neural representations of emotions and feelings, and illuminate the ways in which music is able to co-opt these neural mechanisms.