Colloquium: Friday, Oct. 11, 3:00 pm, in 622 Dodge Hall
Matthew Sachs, PhD (Presidential Scholar in Society and Neuroscience, Center for Science & Society) "Spatial and Temporal Patterns of Brain Activity Associated with Emotions in Music"
Dr. Sachs is a neuroscientist whose research focuses on understanding the neural and behavioral mechanisms involved in emotions and feelings in response to music. He received his PhD from the University of Southern California’s Brain and Creativity Institute, directed by Dr. Antonio Damasio, and his B.A. from Harvard University. Matthew’s projects involve applying data-driven, multivariate models to capture the patterns of neural activity that accompany uniquely human experiences with music, such as feelings of chills, pleasurable sadness, and nostalgia. Matthew is the third Robert A. Burt Presidential Scholar in Society and Neuroscience.
Abstract: The ability to both perceive and experience emotions in response to music underlies its universality and ubiquity across cultures and time. This ability also allows music to be a useful tool for uncovering how the brain represents affective experiences. In a series of three studies, I employ data-driven, multivariate statistical techniques to capture patterns of neural information in response to musical stimuli. In Study 1, I show that neural patterns in the auditory, somatosensory, and insular cortices represent specific categories of emotions perceived through music and that these representations extend to non-musical stimuli as well. In the next two studies, I shift from perception to experience, focusing on the emergence of enjoyment in response to sad pieces of music, in which the emotion that is perceived by the listener may not match the emotion that is felt. In Study 2, I show that people who find sad music enjoyable tend to score higher on a specific sub-trait of empathy called Fantasy. In Study 3, I build on these findings with a neuroimaging study in which participants listened to a full-length piece of sad music. Using a data-driven approach to assess synchronization of brain activity across participants, I show that, while listening to sad music, high-Fantasy individuals have greater synchronization in regions of the brain involved in processing emotions and simulating the emotions of others. Furthermore, when evaluating synchronization dynamically, increased enjoyment of the piece of music predicted similar patterns of activity across people in the basal ganglia, orbitofrontal, and auditory cortices. The results presented across these three studies provide a more nuanced understanding of the spatial and temporal neural representations of emotions and feelings and illuminate the ways in which music is able to co-opt these neural mechanisms.