Rayleigh-Gauss-Newton optimization with enhanced sampling for variational Monte Carlo

Robert J. Webber, Michael Lindsey

Research output: Contribution to journal › Article › peer-review


Variational Monte Carlo (VMC) is an approach for computing ground-state wave functions that has recently become more powerful due to the introduction of neural network-based wave-function parametrizations. However, efficiently training neural wave functions to converge to an energy minimum remains a difficult problem. In this work, we analyze optimization and sampling methods used in VMC and introduce alterations to improve their performance. First, based on theoretical convergence analysis in a noiseless setting, we motivate a new optimizer that we call the Rayleigh-Gauss-Newton (RGN) method, which can improve upon gradient descent and natural gradient descent to achieve superlinear convergence at no more than twice the computational cost. Second, to realize this favorable comparison in the presence of stochastic noise, we analyze the effect of sampling error on VMC parameter updates and experimentally demonstrate that it can be reduced by the parallel tempering method. In particular, we demonstrate that RGN can be made robust to energy spikes that occur when the sampler moves between metastable regions of configuration space. Finally, putting theory into practice, we apply our enhanced optimization and sampling methods to the transverse-field Ising and XXZ models on large lattices, yielding ground-state energy estimates with remarkably high accuracy after just 200 parameter updates.

Original language: English (US)
Article number: 033099
Journal: Physical Review Research
Issue number: 3
State: Published - Jul 2022

ASJC Scopus subject areas

  • Physics and Astronomy (all)


