Unsupervised learning of noisy-or Bayesian networks

Yoni Halpern, David Sontag

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper considers the problem of learning the parameters in Bayesian networks of discrete variables with known structure and hidden variables. Previous approaches in these settings typically use expectation maximization; when the network has high treewidth, the required expectations might be approximated using Monte Carlo or variational methods. We show how to avoid inference altogether during learning by giving a polynomial-time algorithm based on the method-of-moments, building upon recent work on learning discrete-valued mixture models. In particular, we show how to learn the parameters for a family of bipartite noisy-or Bayesian networks. In our experimental results, we demonstrate an application of our algorithm to learning QMR-DT, a large Bayesian network used for medical diagnosis. We show that it is possible to fully learn the parameters of QMR-DT even when only the findings are observed in the training data (ground truth diseases unknown).
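To make the model class concrete, the following is a minimal sketch (not the authors' implementation) of the bipartite noisy-or generative process described above: binary hidden diseases are parents of binary observed findings, each active disease independently fails to trigger a finding with some probability, and a leak term lets findings occur with no active parent. The parameter names (prior, leak, fail) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative bipartite noisy-or network: n_d hidden diseases, n_f observed findings.
n_d, n_f = 5, 20
prior = rng.uniform(0.01, 0.10, size=n_d)        # P(disease_j = 1)
leak = rng.uniform(0.001, 0.01, size=n_f)        # leak probability for each finding
fail = rng.uniform(0.3, 0.9, size=(n_f, n_d))    # P(edge j -> i fails to activate finding i)

def sample(num_samples):
    """Draw (diseases, findings) pairs from the noisy-or generative model."""
    d = rng.random((num_samples, n_d)) < prior                   # latent diseases
    # Noisy-or CPD: P(finding_i = 0 | d) = (1 - leak_i) * prod_j fail_{ij}^{d_j}
    p_off = (1.0 - leak) * np.exp(d.astype(float) @ np.log(fail).T)
    f = rng.random((num_samples, n_f)) < (1.0 - p_off)           # observed findings
    return d, f

diseases, findings = sample(10000)
```

In the setting studied in the paper, only samples like `findings` are available at training time; the task is to recover the prior, leak, and failure parameters without observing `diseases`.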

Original language: English (US)
Title of host publication: Uncertainty in Artificial Intelligence - Proceedings of the 29th Conference, UAI 2013
Pages: 272-281
Number of pages: 10
State: Published - 2013
Event: 29th Conference on Uncertainty in Artificial Intelligence, UAI 2013 - Bellevue, WA, United States
Duration: Jul 11 2013 - Jul 15 2013

ASJC Scopus subject areas

  • Artificial Intelligence
