Sampling methods for unsupervised learning

Rob Fergus, Andrew Zisserman, Pietro Perona

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

Abstract

We present an algorithm to overcome the local maxima problem in estimating the parameters of mixture models. It combines existing approaches from EM and from the robust fitting algorithm RANSAC to give a data-driven stochastic learning scheme. Minimal subsets of data points, sufficient to constrain the parameters of the model, are drawn from proposal densities to discover new regions of high likelihood. The proposal densities are learnt using EM and bias the sampling toward promising solutions. The algorithm is computationally efficient, as well as effective at escaping from local maxima. We compare it with alternative methods, including EM and RANSAC, on both challenging synthetic data and the computer vision problem of alpha-matting.
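The abstract only sketches the idea at a high level. The following Python sketch is an illustration of the general strategy it describes, not the paper's algorithm: RANSAC-style minimal subsets seed mixture components, EM refines each proposal, and the best-scoring fit re-weights where later subsets are drawn from. The function name sample_then_em, the parameters K and n_restarts, the re-weighting heuristic, and the use of scikit-learn's GaussianMixture are all assumptions made for this sketch.

```python
# Illustrative sketch only (not the paper's exact method): combine
# RANSAC-style minimal-subset proposals with EM restarts for a Gaussian mixture.
import numpy as np
from sklearn.mixture import GaussianMixture


def sample_then_em(X, K=3, n_restarts=20, rng=None):
    """Seed mixture means from minimal subsets, refine with EM,
    and keep the solution with the highest total log-likelihood."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    best_model, best_ll = None, -np.inf
    # Proposal weights over data points: start uniform, later biased toward
    # poorly explained points (a simple stand-in for the learned proposal
    # densities mentioned in the abstract).
    weights = np.full(n, 1.0 / n)
    for _ in range(n_restarts):
        # Minimal subset: one point per component is enough to seed the means.
        idx = rng.choice(n, size=K, replace=False, p=weights)
        gmm = GaussianMixture(n_components=K, means_init=X[idx], n_init=1)
        gmm.fit(X)
        ll = gmm.score(X) * n  # total log-likelihood of the data
        if ll > best_ll:
            best_model, best_ll = gmm, ll
            # Re-weight proposals toward points the current best model
            # explains poorly (low per-point log-likelihood).
            per_point_ll = best_model.score_samples(X)
            neg = -per_point_ll
            weights = np.exp(neg - neg.max())
            weights /= weights.sum()
    return best_model, best_ll


if __name__ == "__main__":
    # Synthetic example: three Gaussian clusters in 2-D.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.3, size=(100, 2)) for m in (-2, 0, 2)])
    model, ll = sample_then_em(X, K=3, rng=0)
    print("best total log-likelihood:", round(ll, 2))
```

In this sketch the restart with the highest likelihood wins, so a single bad EM basin does not trap the fit; the re-weighting step is one simple way to bias later subsets toward regions the current solution handles badly.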

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 17 - Proceedings of the 2004 Conference, NIPS 2004
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262195348, 9780262195348
State: Published - 2005
Event: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004 - Vancouver, BC, Canada
Duration: Dec 13, 2004 - Dec 16, 2004

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004
Country/Territory: Canada
City: Vancouver, BC
Period: 12/13/04 - 12/16/04

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
