Abstract
We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
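The abstract's probabilistic margin condition can be made concrete. Below is a minimal numerical sketch, assuming the standard Gaussian form of such a constraint: for weights w drawn from N(mu, Sigma), the signed margin y·(w·x) is itself Gaussian, so requiring correct classification with probability at least eta is equivalent to y·(mu·x) ≥ Phi⁻¹(eta)·sqrt(xᵀ Sigma x). The symbols mu, Sigma, eta and the Monte Carlo check are illustrative assumptions, not details taken from this record or the paper.

```python
import numpy as np
from scipy.stats import norm

# Illustrative sketch (not the paper's exact formulation): for w ~ N(mu, Sigma),
# the signed margin y * (w . x) is Gaussian with mean y * (mu . x) and variance
# x^T Sigma x, so
#   Pr[y * (w . x) >= 0] >= eta   <=>   y * (mu . x) >= Phi^{-1}(eta) * sqrt(x^T Sigma x).
rng = np.random.default_rng(0)
d, eta = 5, 0.9
mu = rng.normal(size=d)            # posterior mean (illustrative values)
A = rng.normal(size=(d, d))
Sigma = A @ A.T                    # a valid (positive semidefinite) covariance
x, y = rng.normal(size=d), 1.0     # one training example

mean = y * (mu @ x)                # mean of the signed margin
std = np.sqrt(x @ Sigma @ x)       # its standard deviation

# Closed-form probability of correct classification under the Gaussian posterior.
p_closed = norm.cdf(mean / std)

# Monte Carlo estimate of the same probability, for comparison.
w = rng.multivariate_normal(mu, Sigma, size=200_000)
p_mc = np.mean(y * (w @ x) >= 0)

print(f"closed form: {p_closed:.4f}   monte carlo: {p_mc:.4f}")
print("high-probability constraint satisfied:", mean >= norm.ppf(eta) * std)
```

Under this reading, the "least informative distribution" objective would keep N(mu, Sigma) close to a broad prior while each per-example constraint of this form forces the mean margin to dominate the predictive standard deviation; the paper itself should be consulted for the exact convex formulation and its kernelized solution.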
Original language | English (US) |
---|---|
Pages (from-to) | 105-112 |
Number of pages | 8 |
Journal | Journal of Machine Learning Research |
Volume | 5 |
State | Published - 2009 |
Event | 12th International Conference on Artificial Intelligence and Statistics, AISTATS 2009, Clearwater, FL, United States (Apr 16 – Apr 18, 2009) |
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence