Stimulus-driven and spontaneous dynamics in excitatory-inhibitory recurrent neural networks for sequence representation

Alfred Rajakumar, John Rinzel, Zhe S. Chen

Research output: Contribution to journal › Article › peer-review

Abstract

Recurrent neural networks (RNNs) have been widely used to model sequential neural dynamics ("neural sequences") of cortical circuits in cognitive and motor tasks. Efforts to incorporate biological constraints and Dale's principle will help elucidate the neural representations and mechanisms of underlying circuits. We trained an excitatory-inhibitory RNN to learn neural sequences in a supervised manner and studied the representations and dynamic attractors of the trained network. The trained RNN robustly triggered the sequence in response to various input signals and interpolated a time-warped input for sequence representation. Interestingly, a learned sequence can repeat periodically when the RNN evolves beyond the duration of a single sequence. The eigenspectrum of the learned recurrent connectivity matrix, with growing or damping modes, together with the RNN's nonlinearity, was adequate to generate a limit cycle attractor. We further examined the stability of dynamic attractors while training the RNN to learn two sequences. Together, our results provide a general framework for understanding neural sequence representation in the excitatory-inhibitory RNN.
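The core architectural constraint described above, an RNN obeying Dale's principle, can be illustrated with a minimal sketch. This is not the authors' code: the network size, the 80/20 excitatory/inhibitory split, and the tanh rate dynamics are illustrative assumptions; the sign constraint is enforced by parameterizing the recurrent matrix as |W| multiplied by a fixed diagonal sign matrix, one common approach.

```python
import numpy as np

# Minimal sketch (assumed parameterization, not the paper's implementation)
# of a rate RNN whose recurrent weights obey Dale's principle: each unit is
# purely excitatory (all outgoing weights >= 0) or purely inhibitory
# (all outgoing weights <= 0).

rng = np.random.default_rng(0)
N, N_exc = 100, 80                      # total units, excitatory units (assumed 80/20 split)

# Sign matrix D: +1 for excitatory units, -1 for inhibitory units.
D = np.diag(np.concatenate([np.ones(N_exc), -np.ones(N - N_exc)]))

# Unconstrained parameters; taking |W_raw| @ D fixes the sign of each
# column j, i.e. the outgoing weights of unit j.
W_raw = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
W = np.abs(W_raw) @ D

def step(r, u, W_in, dt=0.01, tau=0.1):
    """One Euler step of tau * dr/dt = -r + tanh(W r + W_in u)."""
    return r + (dt / tau) * (-r + np.tanh(W @ r + W_in @ u))

# Drive the network with a brief input pulse, then let it run freely
# (stimulus-driven followed by spontaneous dynamics).
W_in = rng.normal(0.0, 1.0, size=(N, 1))
r = np.zeros(N)
for t in range(50):
    u = np.array([1.0]) if t < 10 else np.array([0.0])
    r = step(r, u, W_in)
```

In training, gradients flow through `W_raw` while `np.abs(...) @ D` keeps every unit's outgoing sign fixed, so Dale's principle holds at every optimization step rather than being imposed post hoc.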

Original language: English (US)
Pages (from-to): 2603-2645
Number of pages: 43
Journal: Neural Computation
Volume: 33
Issue number: 10
DOIs
State: Published - Sep 16 2021

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
