TY - GEN
T1 - Impression learning
T2 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
AU - Bredenberg, Colin
AU - Lyo, Benjamin S.H.
AU - Simoncelli, Eero P.
AU - Savin, Cristina
N1 - Funding Information:
We thank Camille Rullán Buxó, Caroline Haimerl, Owen Marschall, Pedro Herrero-Vidal, Siavash Golkar, David Lipshutz, Yanis Bahroun, Tiberiu Tesileanu, Eilif Muller, Blake Richards, Guillaume Lajoie, Maximilian Puelma Touzel, and Alexandre Payeur for helpful discussions and feedback on earlier versions of this manuscript. We gratefully acknowledge the Howard Hughes Medical Institute and the Simons Foundation for their support of this work. CS is supported by National Institute of Mental Health Award 1R01MH125571-01, by the National Science Foundation under NSF Award No. 1922658, and by a Google faculty award.
Publisher Copyright:
© 2021 Neural information processing systems foundation. All rights reserved.
PY - 2021
Y1 - 2021
N2 - Understanding how the brain constructs statistical models of the sensory world remains a longstanding challenge for computational neuroscience. Here, we derive an unsupervised local synaptic plasticity rule that trains neural circuits to infer latent structure from sensory stimuli via a novel loss function for approximate online Bayesian inference. The learning algorithm is driven by a local error signal computed between two factors that jointly contribute to neural activity: stimulus drive and internal predictions - the network's 'impression' of the stimulus. Physiologically, we associate these two components with the basal and apical dendrites of pyramidal neurons, respectively. We show that learning can be implemented online, is capable of capturing temporal dependencies in continuous input streams, and generalizes to hierarchical architectures. Furthermore, we demonstrate both analytically and empirically that the algorithm is more data-efficient than a three-factor plasticity alternative, enabling it to learn statistics of high-dimensional, naturalistic inputs. Overall, the model provides a bridge from mechanistic accounts of synaptic plasticity to algorithmic descriptions of unsupervised probabilistic learning and inference.
UR - http://www.scopus.com/inward/record.url?scp=85131822141&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85131822141&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85131822141
T3 - Advances in Neural Information Processing Systems
SP - 11717
EP - 11729
BT - Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021
A2 - Ranzato, Marc'Aurelio
A2 - Beygelzimer, Alina
A2 - Dauphin, Yann
A2 - Liang, Percy S.
A2 - Wortman Vaughan, Jenn
PB - Neural information processing systems foundation
Y2 - 6 December 2021 through 14 December 2021
ER -