Subspace Inference for Bayesian Deep Learning

Pavel Izmailov, Wesley J. Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson

Research output: Contribution to journal › Conference article › peer-review

Abstract

Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well-calibrated uncertainty. However, scaling Bayesian inference techniques to deep neural networks is challenging due to the high dimensionality of the parameter space. In this paper, we construct low-dimensional subspaces of parameter space, such as the first principal components of the stochastic gradient descent (SGD) trajectory, which contain diverse sets of high-performing models. In these subspaces, we are able to apply elliptical slice sampling and variational inference, which struggle in the full parameter space. We show that Bayesian model averaging over the induced posterior in these subspaces produces accurate predictions and well-calibrated predictive uncertainty for both regression and image classification.
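The subspace construction described in the abstract can be sketched as follows: collect SGD iterates, center them around their mean, and take the top principal components of the trajectory as an affine subspace of the full parameter space. This is a minimal NumPy illustration, not the authors' implementation; the names (`trajectory`, `rank`, `to_full_params`) and the random-walk stand-in for a real SGD run are assumptions for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are flattened weight vectors saved along an SGD run
# (T saved iterates, D parameters). A real use would checkpoint the
# network's weights; here a random walk stands in for the trajectory.
T, D, rank = 50, 1000, 5
trajectory = rng.normal(size=(T, D)).cumsum(axis=0)

w_mean = trajectory.mean(axis=0)      # subspace shift (mean of the iterates)
deviations = trajectory - w_mean      # centered trajectory

# Top principal directions of the trajectory via SVD; rows of Vt are
# orthonormal, so the columns of P form an orthonormal basis.
_, _, Vt = np.linalg.svd(deviations, full_matrices=False)
P = Vt[:rank].T                       # (D, rank) basis of the subspace

def to_full_params(z):
    """Map a low-dimensional point z to a full weight vector."""
    return w_mean + P @ z

# Any posterior sampler (e.g. elliptical slice sampling) now only has
# to explore `rank` dimensions instead of all D.
z = rng.normal(size=rank)
w = to_full_params(z)
```

Because the subspace is only `rank`-dimensional, samplers that are impractical in the full parameter space become feasible: each proposal is a small vector `z`, mapped back to full weights only when the network must be evaluated.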

Original language: English (US)
Pages (from-to): 1169-1179
Number of pages: 11
Journal: Proceedings of Machine Learning Research
Volume: 115
State: Published - 2019
Event: 35th Uncertainty in Artificial Intelligence Conference, UAI 2019 - Tel Aviv, Israel
Duration: Jul 22 2019 to Jul 25 2019

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
