Abstract
Gaussian state space models have been used for decades as generative models of sequential data. They admit an intuitive probabilistic interpretation, have a simple functional form, and enjoy widespread adoption. We introduce a unified algorithm to efficiently learn a broad class of linear and non-linear state space models, including variants where the emission and transition distributions are modeled by deep neural networks. Our learning algorithm simultaneously learns a compiled inference network and the generative model, leveraging a structured variational approximation parameterized by recurrent neural networks to mimic the posterior distribution. We apply the learning algorithm to both synthetic and real-world datasets, demonstrating its scalability and versatility. We find that using the structured approximation to the posterior results in models with significantly higher held-out likelihood.
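The abstract describes jointly learning a generative state space model and a compiled inference network by maximizing a variational lower bound (ELBO). The following is a minimal PyTorch sketch of that idea, assuming Gaussian transitions and emissions parameterized by small neural networks and a GRU-based structured posterior q(z_t | z_{t-1}, x); the module names, network sizes, and the forward-RNN conditioning are illustrative assumptions, not the authors' implementation (the paper considers several posterior factorizations).

```python
import torch
import torch.nn as nn
from torch.distributions import Normal, kl_divergence

class DeepStateSpaceModel(nn.Module):
    """Illustrative sketch: nonlinear Gaussian state space model with an
    RNN-parameterized structured variational posterior (not the authors' code)."""

    def __init__(self, x_dim, z_dim, h_dim):
        super().__init__()
        # Generative model: transition p(z_t | z_{t-1}) and emission p(x_t | z_t),
        # each a small network outputting a Gaussian mean and log-scale.
        self.trans = nn.Sequential(nn.Linear(z_dim, h_dim), nn.Tanh(),
                                   nn.Linear(h_dim, 2 * z_dim))
        self.emit = nn.Sequential(nn.Linear(z_dim, h_dim), nn.Tanh(),
                                  nn.Linear(h_dim, 2 * x_dim))
        # Inference network: an RNN over the observations whose hidden states,
        # together with z_{t-1}, parameterize the structured approximation
        # q(z_t | z_{t-1}, x). A forward GRU is one possible conditioning choice.
        self.rnn = nn.GRU(x_dim, h_dim, batch_first=True)
        self.post = nn.Linear(h_dim + z_dim, 2 * z_dim)
        self.z_dim = z_dim

    @staticmethod
    def _gaussian(params):
        mu, log_sig = params.chunk(2, dim=-1)
        return Normal(mu, log_sig.exp())

    def elbo(self, x):
        """Evidence lower bound for a batch of sequences x: (B, T, x_dim)."""
        B, T, _ = x.shape
        h, _ = self.rnn(x)                        # per-step summaries of x
        z_prev = x.new_zeros(B, self.z_dim)
        bound = x.new_zeros(B)
        for t in range(T):
            prior = self._gaussian(self.trans(z_prev))
            q = self._gaussian(self.post(torch.cat([h[:, t], z_prev], dim=-1)))
            z = q.rsample()                       # reparameterized sample
            px = self._gaussian(self.emit(z))
            # Reconstruction term minus KL between posterior and transition prior.
            bound = bound + px.log_prob(x[:, t]).sum(-1) \
                          - kl_divergence(q, prior).sum(-1)
            z_prev = z
        return bound.mean()
```

A training loop would ascend `model.elbo(batch)` with any stochastic gradient optimizer, updating the generative model and inference network simultaneously; the abstract's finding is that this structured posterior yields significantly higher held-out likelihood than simpler approximations.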
| Original language | English (US) |
| --- | --- |
| Pages | 2101-2109 |
| Number of pages | 9 |
| State | Published - 2017 |
| Event | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 - San Francisco, United States (Feb 4 2017 → Feb 10 2017) |
Other
| Other | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 |
| --- | --- |
| Country/Territory | United States |
| City | San Francisco |
| Period | 2/4/17 → 2/10/17 |
ASJC Scopus subject areas
- Artificial Intelligence