RETRIEVING MUSICAL INFORMATION FROM NEURAL DATA: HOW COGNITIVE FEATURES ENRICH ACOUSTIC ONES

Ellie Bean Abrams, Eva Muñoz Vidal, Claire Pelofi, Pablo Ripollés

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Various features – from low-level acoustics, to higher-level statistical regularities, to memory associations – contribute to the experience of musical enjoyment and pleasure. Recent work suggests that musical surprisal, that is, the unexpectedness of a musical event given its context, may directly predict listeners’ experiences of pleasure and enjoyment during music listening. Understanding how surprisal shapes listeners’ preferences for certain musical pieces has implications for music recommender systems, which are typically content-based (whether acoustic or semantic) or metadata-based. Here we test a recently developed computational algorithm, called the Dynamic-Regularity Extraction (D-REX) model, that uses Bayesian inference to predict the surprisal that humans experience while listening to music. We demonstrate that the brain tracks musical surprisal as modeled by D-REX by conducting a decoding analysis on the neural signal (collected through magnetoencephalography) of participants listening to music. Thus, we demonstrate the validity of a computational model of musical surprisal, which may inform the next generation of recommender systems. In addition, we present an open-source neural dataset that will be available for future research, fostering approaches that combine MIR with cognitive neuroscience, which we believe will be a key strategy in characterizing people’s reactions to music.
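
To make the abstract's two ingredients concrete, the sketch below illustrates (i) surprisal as the negative log predictive probability of each musical event under a sequentially updated Bayesian model, and (ii) a simple decoding analysis that recovers that surprisal regressor from multichannel data. This is an illustrative toy only: it uses a conjugate Normal-Normal predictor and simulated MEG-like channels, not the actual D-REX implementation or the paper's MEG pipeline, and all function names, parameters, and data here are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# 1. Toy stimulus: a sequence of MIDI-like pitch values (hypothetical data).
n_notes = 500
pitches = rng.normal(loc=60.0, scale=4.0, size=n_notes)

# 2. Toy surprisal: -log p(x_t | x_1..x_{t-1}) under a conjugate Normal-Normal
#    model with known observation variance (a crude stand-in for a Bayesian
#    sequential predictor such as D-REX, not its actual implementation).
def sequential_gaussian_surprisal(x, prior_mean=60.0, prior_var=25.0, obs_var=16.0):
    surprisal = np.empty_like(x)
    mean, var = prior_mean, prior_var
    for t, xt in enumerate(x):
        pred_var = var + obs_var  # predictive variance of the next observation
        surprisal[t] = 0.5 * (np.log(2.0 * np.pi * pred_var)
                              + (xt - mean) ** 2 / pred_var)
        k = var / pred_var        # Kalman-style gain for the conjugate update
        mean, var = mean + k * (xt - mean), (1.0 - k) * var
    return surprisal

surprisal = sequential_gaussian_surprisal(pitches)

# 3. Simulated "neural" data: 64 channels that weakly track surprisal plus noise.
n_channels = 64
weights = rng.normal(size=n_channels)
meg_like = (surprisal[:, None] * weights[None, :]
            + rng.normal(scale=5.0, size=(n_notes, n_channels)))

# 4. Decoding analysis: predict the surprisal regressor from the multichannel
#    signal with cross-validated ridge regression.
scores = cross_val_score(Ridge(alpha=1.0), meg_like, surprisal, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```

In the paper itself, the surprisal regressor comes from D-REX and the decoded signal is recorded MEG data from listeners, rather than this simulation.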

Original language: English (US)
Title of host publication: Proceedings of the 23rd International Society for Music Information Retrieval Conference, ISMIR 2022
Editors: Preeti Rao, Hema Murthy, Ajay Srinivasamurthy, Rachel Bittner, Rafael Caro Repetto, Masataka Goto, Xavier Serra, Marius Miron
Publisher: International Society for Music Information Retrieval
Pages: 160-168
Number of pages: 9
ISBN (Electronic): 9781732729926
State: Published - 2022
Event: 23rd International Society for Music Information Retrieval Conference, ISMIR 2022 - Hybrid, Bengaluru, India
Duration: Dec 4 2022 - Dec 8 2022

Publication series

Name: Proceedings of the 23rd International Society for Music Information Retrieval Conference, ISMIR 2022

Conference

Conference: 23rd International Society for Music Information Retrieval Conference, ISMIR 2022
Country/Territory: India
City: Hybrid, Bengaluru
Period: 12/4/22 - 12/8/22

ASJC Scopus subject areas

  • Music
  • Information Systems
  • Artificial Intelligence
  • Human-Computer Interaction
  • Signal Processing
