This article presents a novel methodology to analyze the dynamics of emotional responses to music. It consists of a computational investigation based on spatiotemporal neural networks, which "mimic" human affective responses to music and predict the responses to novel music sequences. The results provide evidence suggesting that spatiotemporal patterns of sound resonate with affective features underlying judgments of subjective feelings (arousal and valence). A significant part of the listener's affective response is predicted from a set of six psychoacoustic features of sound: loudness, tempo, texture, mean pitch, pitch variation, and sharpness. A detailed analysis of the network parameters and dynamics also allows us to identify the role of specific psychoacoustic variables (e.g., tempo and loudness) in the emotional appraisal of music. This work contributes new evidence and insights to the study of musical emotions, with particular relevance to the music perception and cognition research community.
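To make the pipeline described above concrete, the following is a minimal illustrative sketch (not the authors' actual model): a tiny Elman-style recurrent network that maps a sequence of six psychoacoustic feature frames to a trajectory of (arousal, valence) estimates. All layer sizes and weights here are arbitrary placeholders; the real system would be trained on listeners' continuous emotion ratings.

```python
import numpy as np

# Hypothetical sketch of a spatiotemporal (recurrent) affect predictor.
# Inputs per time frame: the six psychoacoustic features named in the
# abstract (loudness, tempo, texture, mean pitch, pitch variation,
# sharpness). Output per frame: a 2-D (arousal, valence) estimate.

rng = np.random.default_rng(0)

N_FEATURES = 6   # psychoacoustic inputs per time frame
N_HIDDEN = 8     # recurrent state size (arbitrary choice)
N_OUTPUTS = 2    # arousal, valence

# Random placeholder weights; a trained model would learn these.
W_in = rng.normal(scale=0.3, size=(N_HIDDEN, N_FEATURES))
W_rec = rng.normal(scale=0.3, size=(N_HIDDEN, N_HIDDEN))
W_out = rng.normal(scale=0.3, size=(N_OUTPUTS, N_HIDDEN))

def predict_affect(frames):
    """Run the recurrent net over a (T, 6) feature sequence and
    return a (T, 2) trajectory of (arousal, valence) estimates."""
    h = np.zeros(N_HIDDEN)
    outputs = []
    for x in frames:
        h = np.tanh(W_in @ x + W_rec @ h)   # spatiotemporal state update
        outputs.append(np.tanh(W_out @ h))  # bounded affect estimate
    return np.array(outputs)

# Example: 100 frames of standardized features for one excerpt.
trajectory = predict_affect(rng.normal(size=(100, N_FEATURES)))
print(trajectory.shape)
```

The recurrent state `h` is what gives the model its temporal character: each affect estimate depends on the whole feature history up to that frame, not just the current sound frame, which is the property the abstract attributes to spatiotemporal networks.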
© 2009 by the Regents of the University of California