Global convergence of neuron birth-death dynamics

Grant M. Rotskoff, Samy Jelassi, Joan Bruna, Eric Vanden-Eijnden

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Neural networks with a large number of units admit a mean-field description, which has recently served as a theoretical explanation for the favorable training properties of "overparameterized" models. In this regime, gradient descent obeys a deterministic partial differential equation (PDE) that converges to a globally optimal solution for networks with a single hidden layer under appropriate assumptions. In this work, we propose a non-local mass transport dynamics that leads to a modified PDE with the same minimizer. We implement this non-local dynamics as a stochastic neuronal birth-death process and prove that it accelerates the rate of convergence in the mean-field limit. We subsequently realize this PDE with two classes of numerical schemes that converge to the mean-field equation, each of which can easily be implemented for neural networks with finite numbers of units. We illustrate our algorithms with two models to provide intuition for the mechanism through which convergence is accelerated.
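Since this page carries only the abstract, the following is a toy sketch of the general idea: alongside gradient descent on a one-hidden-layer network in mean-field scaling, periodically kill low-fitness units and replace them with perturbed copies of high-fitness ones. The teacher/student setup, the fitness proxy (minus each unit's contribution to the loss gradient), and all hyperparameters here are illustrative assumptions, not the paper's exact birth-death rates.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def predict(a, c, X):
    # Mean-field scaling: the network output averages over units.
    return relu(np.outer(X, a)) @ c / len(a)

# Hypothetical 1-D teacher/student regression task (not from the paper).
n = 50                         # student units
a = rng.normal(size=n)         # student input weights
c = rng.normal(size=n)         # student output weights
a_t = rng.normal(size=5)       # small fixed teacher network
c_t = rng.normal(size=5)
X = rng.normal(size=200)
y = predict(a_t, c_t, X)

lr, steps, bd_every, k = 10.0, 1000, 100, 5
losses = []
for t in range(steps):
    h = relu(np.outer(X, a))            # (200, n) unit activations
    r = h @ c / n - y                   # residuals
    losses.append(0.5 * np.mean(r**2))

    # Plain gradient descent on both layers.
    grad_c = (h.T @ r) / (len(X) * n)
    mask = np.outer(X, a) > 0           # ReLU derivative
    grad_a = ((mask * X[:, None]).T @ r) * c / (len(X) * n)
    a -= lr * grad_a
    c -= lr * grad_c

    # Birth-death step: an illustrative fitness proxy, minus each
    # unit's contribution to the loss gradient on the data.
    if (t + 1) % bd_every == 0:
        h = relu(np.outer(X, a))
        r = h @ c / n - y
        fitness = -(h.T @ r) * c / len(X)
        dead = np.argsort(fitness)[:k]    # lowest-fitness units die...
        born = np.argsort(fitness)[-k:]   # ...replaced by copies of the fittest
        a[dead] = a[born] + 0.01 * rng.normal(size=k)
        c[dead] = c[born] + 0.01 * rng.normal(size=k)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Note the deliberate design choice that the population size stays fixed: every death is paired with a birth, mirroring how the paper's dynamics transports mass rather than adding or removing it.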

Original language: English (US)
Title of host publication: 36th International Conference on Machine Learning, ICML 2019
Publisher: International Machine Learning Society (IMLS)
Number of pages: 10
ISBN (Electronic): 9781510886988
State: Published - Jan 1 2019
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: Jun 9 2019 - Jun 15 2019

Publication series

Name: 36th International Conference on Machine Learning, ICML 2019


Conference: 36th International Conference on Machine Learning, ICML 2019
Country/Territory: United States
City: Long Beach

ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Human-Computer Interaction

