TY - JOUR
T1 - Dynamic models for musical rhythm perception and coordination
AU - Large, Edward W.
AU - Roman, Iran
AU - Kim, Ji Chul
AU - Cannon, Jonathan
AU - Pazdera, Jesse K.
AU - Trainor, Laurel J.
AU - Rinzel, John
AU - Bose, Amitabha
N1 - Publisher Copyright:
Copyright © 2023 Large, Roman, Kim, Cannon, Pazdera, Trainor, Rinzel and Bose.
PY - 2023
Y1 - 2023
N2 - Rhythmicity permeates large parts of human experience. Humans generate various motor and brain rhythms spanning a range of frequencies. We also experience and synchronize to externally imposed rhythmicity, for example from music and song or from the 24-h light-dark cycles of the sun. In the context of music, humans have the ability to perceive, generate, and anticipate rhythmic structures, for example, “the beat.” Experimental and behavioral studies offer clues about the biophysical and neural mechanisms that underlie our rhythmic abilities, and about different brain areas that are involved, but many open questions remain. In this paper, we review several theoretical and computational approaches, each centered at different levels of description, that address specific aspects of musical rhythmic generation, perception, attention, perception-action coordination, and learning. We survey methods and results from applications of dynamical systems theory, neuro-mechanistic modeling, and Bayesian inference. Some frameworks rely on synchronization of intrinsic brain rhythms that span the relevant frequency range; some formulations involve real-time adaptation schemes for error-correction to align the phase and frequency of a dedicated circuit; others involve learning and dynamically adjusting expectations to make rhythm-tracking predictions. Each of the approaches, while initially designed to answer specific questions, offers the possibility of being integrated into a larger framework that provides insights into our ability to perceive and generate rhythmic patterns.
AB - Rhythmicity permeates large parts of human experience. Humans generate various motor and brain rhythms spanning a range of frequencies. We also experience and synchronize to externally imposed rhythmicity, for example from music and song or from the 24-h light-dark cycles of the sun. In the context of music, humans have the ability to perceive, generate, and anticipate rhythmic structures, for example, “the beat.” Experimental and behavioral studies offer clues about the biophysical and neural mechanisms that underlie our rhythmic abilities, and about different brain areas that are involved, but many open questions remain. In this paper, we review several theoretical and computational approaches, each centered at different levels of description, that address specific aspects of musical rhythmic generation, perception, attention, perception-action coordination, and learning. We survey methods and results from applications of dynamical systems theory, neuro-mechanistic modeling, and Bayesian inference. Some frameworks rely on synchronization of intrinsic brain rhythms that span the relevant frequency range; some formulations involve real-time adaptation schemes for error-correction to align the phase and frequency of a dedicated circuit; others involve learning and dynamically adjusting expectations to make rhythm-tracking predictions. Each of the approaches, while initially designed to answer specific questions, offers the possibility of being integrated into a larger framework that provides insights into our ability to perceive and generate rhythmic patterns.
KW - Bayesian modeling
KW - beat perception
KW - dynamical systems
KW - entrainment
KW - music
KW - neuro-mechanistic modeling
KW - synchronization
UR - http://www.scopus.com/inward/record.url?scp=85161026898&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85161026898&partnerID=8YFLogxK
U2 - 10.3389/fncom.2023.1151895
DO - 10.3389/fncom.2023.1151895
M3 - Review article
AN - SCOPUS:85161026898
SN - 1662-5188
VL - 17
JO - Frontiers in Computational Neuroscience
JF - Frontiers in Computational Neuroscience
M1 - 1151895
ER -