Abstract
We describe deep exponential families (DEFs), a class of latent variable models inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables and are easily generalized to many settings through exponential families. We perform inference using recent "black box" variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show that going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets and give better predictive performance than state-of-the-art models.
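As a rough illustration of the layered structure the abstract describes, the following is a generic DEF generative sketch in our own notation (the particular exponential families, link functions, and priors vary by model, so treat this as an assumption rather than a quotation from the paper): for each data point $n$, the top layer of latent variables is drawn from an exponential family with a fixed natural parameter, each lower layer's natural parameter is a link function $g_\ell$ of an inner product between the layer above and shared weights, and the observations are drawn conditional on the bottom layer.

$$
\begin{aligned}
z_{n,L,k} &\sim \text{ExpFam}_L(\eta), \\
z_{n,\ell,k} &\sim \text{ExpFam}_\ell\!\big(g_\ell(w_{\ell,k}^{\top} z_{n,\ell+1})\big), \qquad \ell = L-1, \dots, 1, \\
x_{n,i} &\sim p\big(x_{n,i} \mid w_{0,i}^{\top} z_{n,1}\big).
\end{aligned}
$$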
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 762-771 |
| Number of pages | 10 |
| Journal | Journal of Machine Learning Research |
| Volume | 38 |
| State | Published - 2015 |
| Event | 18th International Conference on Artificial Intelligence and Statistics, AISTATS 2015, San Diego, United States (May 9 2015 – May 12 2015) |
ASJC Scopus subject areas
- Software
- Artificial Intelligence
- Control and Systems Engineering
- Statistics and Probability