Schema formation in a neural population subspace underlies learning-to-learn in flexible sensorimotor problem-solving

Vishwa Goudar, Barbara Peysakhovich, David J. Freedman, Elizabeth A. Buffalo, Xiao-Jing Wang

Research output: Contribution to journal › Article › peer-review

Abstract

Learning-to-learn, a progressive speedup of learning while solving a series of similar problems, represents a core process of knowledge acquisition that has drawn attention in both neuroscience and artificial intelligence. To investigate its underlying brain mechanism, we trained a recurrent neural network model on arbitrary sensorimotor mappings, a task known to depend on the prefrontal cortex. The network displayed an exponential time course of accelerated learning. The neural substrate of a schema emerges within a low-dimensional subspace of population activity; its reuse in new problems facilitates learning by limiting connection weight changes. Our work highlights weight-driven modifications of the vector field that determines a recurrent network's population trajectory and, thereby, its behavior. Such plasticity is especially important for preserving and reusing the learned schema despite undesirable changes to the vector field that arise when the network transitions to learning a new problem; the accumulated changes across problems account for the learning-to-learn dynamics.
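The abstract summarizes the modeling approach: a recurrent network is trained on a sequence of arbitrary stimulus-response mapping problems, and learning speeds up across problems. The sketch below illustrates that setup under stated assumptions only; it is not the authors' implementation. The architecture (MappingRNN), task sizes (N_STIM, N_RESP), optimizer, and learning criterion (CRITERION) are illustrative choices.

```python
import torch
import torch.nn as nn

N_STIM, N_RESP, HIDDEN = 4, 4, 64      # illustrative problem and network sizes
N_PROBLEMS, MAX_EPOCHS = 20, 2000
CRITERION = 1.0                        # fraction correct required to call a problem "learned"

class MappingRNN(nn.Module):
    """Vanilla RNN with a linear readout; an illustrative stand-in for the model class."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(N_STIM, HIDDEN, batch_first=True)
        self.readout = nn.Linear(HIDDEN, N_RESP)

    def forward(self, x):
        h, _ = self.rnn(x)               # h: (batch, time, hidden)
        return self.readout(h[:, -1])    # decode the response from the final state

def make_problem(gen):
    # A new "problem" is a random one-to-one stimulus-to-response mapping.
    return torch.randperm(N_RESP, generator=gen)

def make_batch(t_steps=10):
    # Each stimulus is a one-hot input held constant over the trial.
    x = torch.zeros(N_STIM, t_steps, N_STIM)
    for i in range(N_STIM):
        x[i, :, i] = 1.0
    return x

torch.manual_seed(0)
gen = torch.Generator().manual_seed(1)
net = MappingRNN()                       # weights persist across problems
loss_fn = nn.CrossEntropyLoss()
x = make_batch()

epochs_to_learn = []
for p in range(N_PROBLEMS):
    y = make_problem(gen)                # target responses for stimuli 0..N_STIM-1
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for epoch in range(1, MAX_EPOCHS + 1):
        opt.zero_grad()
        logits = net(x)
        loss = loss_fn(logits, y)
        loss.backward()
        opt.step()
        acc = (logits.argmax(dim=1) == y).float().mean().item()
        if acc >= CRITERION:
            break
    epochs_to_learn.append(epoch)
    print(f"problem {p + 1:2d}: learned in {epoch:4d} epochs")

# A progressive decrease in epochs_to_learn across problems is the
# learning-to-learn signature described in the abstract.
```

Plotting epochs_to_learn against the problem index and fitting an exponential decay would recover the kind of accelerated-learning time course the abstract reports; probing schema reuse would additionally require comparing low-dimensional subspaces of population activity across problems, which this sketch does not attempt.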

Original language: English (US)
Pages (from-to): 879-890
Number of pages: 12
Journal: Nature Neuroscience
Volume: 26
Issue number: 5
DOIs
State: Published - May 2023

ASJC Scopus subject areas

  • General Neuroscience
