TY - CONF
T1 - Variational sequential Monte Carlo
AU - Naesseth, Christian A.
AU - Linderman, Scott W.
AU - Ranganath, Rajesh
AU - Blei, David M.
N1 - Funding Information:
Christian A. Naesseth is supported by CADICS, a Linnaeus Center, funded by the Swedish Research Council (VR). Scott W. Linderman is supported by the Simons Foundation SCGB-418011. This work is supported by ONR N00014-11-1-0651, DARPA PPAML FA8750-14-2-0009, the Alfred P. Sloan Foundation, and the John Simon Guggenheim Foundation.
Publisher Copyright:
Copyright 2018 by the author(s).
PY - 2018
Y1 - 2018
AB - Many recent advances in large scale probabilistic inference rely on variational methods. The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior. In this paper we present a new approximating family of distributions, the variational sequential Monte Carlo (VSMC) family, and show how to optimize it in variational inference. VSMC melds variational inference (VI) and sequential Monte Carlo (SMC), providing practitioners with flexible, accurate, and powerful Bayesian inference. The VSMC family is a variational family that can approximate the posterior arbitrarily well, while still allowing for efficient optimization of its parameters. We demonstrate its utility on state space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.
UR - http://www.scopus.com/inward/record.url?scp=85057232074&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85057232074&partnerID=8YFLogxK
M3 - Paper
AN - SCOPUS:85057232074
SP - 968
EP - 977
T2 - 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018
Y2 - 9 April 2018 through 11 April 2018
ER -