Sketching for latent dirichlet-categorical models

Joseph Tassarotti, Jean Baptiste Tristan, Michael Wick

Research output: Contribution to conference › Paper › peer-review


Recent work has explored transforming data sets into smaller, approximate summaries in order to scale Bayesian inference. We examine a related problem in which the parameters of a Bayesian model are very large and expensive to store in memory, and propose more compact representations of parameter values that can be used during inference. We focus on a class of graphical models that we refer to as latent Dirichlet-Categorical models, and show how a combination of two sketching algorithms, the count-min sketch and approximate counters, provides an efficient representation for them. We show that this sketch combination, which, despite having been used before in NLP applications, has not been previously analyzed, enjoys desirable properties. We prove that for this class of models, when the sketches are used during Markov chain Monte Carlo inference, the equilibrium of the sketched chain converges to that of the exact chain as the sketch parameters are tuned to reduce the error rate.
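The combination described in the abstract can be illustrated in a few lines: a count-min sketch whose cells, instead of exact integers, hold Morris-style approximate counters. The sketch below is our own minimal illustration under stated assumptions, not the paper's implementation; the class name, hash choice, and parameters are hypothetical. Each cell stores a small exponent c, an increment bumps c with probability 2^-c, the Morris estimate for a cell is 2^c - 1, and a query takes the minimum estimate across the d hash rows, as in an ordinary count-min sketch.

```python
import hashlib
import random


class SketchedCounter:
    """Count-min sketch whose cells are Morris approximate counters.

    Illustrative only: cells store exponents (a few bits each), so the
    memory cost per cell is logarithmic in the count being tracked.
    """

    def __init__(self, width=1024, depth=4, seed=0):
        self.width = width          # cells per hash row
        self.depth = depth          # number of independent hash rows
        self.cells = [[0] * width for _ in range(depth)]
        self.rng = random.Random(seed)

    def _index(self, row, key):
        # Derive a per-row hash by salting the key with the row number.
        h = hashlib.blake2b(f"{row}:{key}".encode(), digest_size=8)
        return int.from_bytes(h.digest(), "big") % self.width

    def increment(self, key):
        for row in range(self.depth):
            j = self._index(row, key)
            c = self.cells[row][j]
            # Morris counter update: bump the exponent with probability 2^-c.
            if self.rng.random() < 2.0 ** -c:
                self.cells[row][j] = c + 1

    def estimate(self, key):
        # Count-min query: minimum over rows of the Morris estimate 2^c - 1.
        return min(2 ** self.cells[row][self._index(row, key)] - 1
                   for row in range(self.depth))
```

In a collapsed Gibbs sampler for a Dirichlet-Categorical model, a structure like this would stand in for the large count matrices (e.g. topic-word counts), trading exactness for sublinear memory per counter; the paper's convergence result concerns how the error introduced by this substitution vanishes as width, depth, and counter resolution grow.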

Original language: English (US)
State: Published - 2020
Event: 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 - Naha, Japan
Duration: Apr 16 2019 - Apr 18 2019


Conference: 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019

ASJC Scopus subject areas

  • Artificial Intelligence
  • Statistics and Probability


