Gaussian margin machines

Koby Crammer, Mehryar Mohri, Fernando Pereira

Research output: Contribution to journal › Conference article › peer-review

Abstract

We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
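
To make the abstract's "least informative distribution" concrete, the following is a minimal sketch of the kind of chance-constrained program the abstract describes; it is an illustration under assumed notation (prior variance $a$, confidence level $\eta$, standard normal CDF $\Phi$), not the paper's verbatim formulation:

\[
\min_{\mu,\,\Sigma \succ 0} \;\; \mathrm{KL}\big(\mathcal{N}(\mu,\Sigma)\,\big\|\,\mathcal{N}(0,\,aI)\big)
\qquad \text{s.t.} \qquad
\Pr_{w \sim \mathcal{N}(\mu,\Sigma)}\big[\,y_i\,(w \cdot x_i) \ge 0\,\big] \ge \eta,
\quad i = 1,\dots,n.
\]

Since $y_i\,(w \cdot x_i)$ is Gaussian with mean $y_i\,(\mu \cdot x_i)$ and variance $x_i^{\top}\Sigma\,x_i$, each probabilistic constraint is equivalent to the deterministic constraint

\[
y_i\,(\mu \cdot x_i) \;\ge\; \Phi^{-1}(\eta)\,\sqrt{x_i^{\top}\Sigma\,x_i},
\]

which is a second-order cone constraint in $(\mu, \Sigma^{1/2})$ when $\eta \ge 1/2$. The data enter only through inner products, consistent with the kernelizable representation mentioned in the abstract.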

Original language: English (US)
Pages (from-to): 105-112
Number of pages: 8
Journal: Journal of Machine Learning Research (Workshop and Conference Proceedings)
Volume: 5
State: Published - 2009
Event: 12th International Conference on Artificial Intelligence and Statistics, AISTATS 2009 - Clearwater, FL, United States
Duration: Apr 16, 2009 to Apr 18, 2009

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
