TY - JOUR
T1 - High-dimensional limit theorems for SGD: Effective dynamics and critical scaling
AU - Ben Arous, Gérard
AU - Gheissari, Reza
AU - Jagannath, Aukosh
N1 - Publisher Copyright:
© 2023 The Authors. Communications on Pure and Applied Mathematics published by Courant Institute of Mathematics and Wiley Periodicals LLC.
PY - 2024/3
Y1 - 2024/3
AB - We study the scaling limits of stochastic gradient descent (SGD) with constant step-size in the high-dimensional regime. We prove limit theorems for the trajectories of summary statistics (i.e., finite-dimensional functions) of SGD as the dimension goes to infinity. Our approach allows one to choose the summary statistics that are tracked, the initialization, and the step-size. It yields both ballistic (ODE) and diffusive (SDE) limits, with the limit depending dramatically on these choices. We show a critical scaling regime for the step-size, below which the effective ballistic dynamics matches gradient flow for the population loss, but at which a new correction term appears and changes the phase diagram. Around the fixed points of this effective dynamics, the corresponding diffusive limits can be quite complex and even degenerate. We demonstrate our approach on popular examples, including estimation for spiked matrix and tensor models and classification via two-layer networks for binary and XOR-type Gaussian mixture models. These examples exhibit surprising phenomena, including multimodal timescales to convergence as well as convergence to suboptimal solutions with probability bounded away from zero from random (e.g., Gaussian) initializations. At the same time, we demonstrate the benefit of overparametrization by showing that the latter probability goes to zero as the second-layer width grows.
UR - http://www.scopus.com/inward/record.url?scp=85173530701&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85173530701&partnerID=8YFLogxK
U2 - 10.1002/cpa.22169
DO - 10.1002/cpa.22169
M3 - Article
AN - SCOPUS:85173530701
SN - 0010-3640
VL - 77
SP - 2030
EP - 2080
JO - Communications on Pure and Applied Mathematics
JF - Communications on Pure and Applied Mathematics
IS - 3
ER -