Boosting and Other Machine Learning Algorithms

Harris Drucker, Corinna Cortes, L. D. Jackel, Yann LeCun, Vladimir Vapnik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In an optical character recognition problem, we compare (as a function of training set size) the performance of three neural-network-based ensemble methods (two versions of boosting and a committee of neural networks trained independently) to that of a single network. In boosting, the number of patterns actually used for training is a subset of all potential training patterns. Under either a fixed computational cost or a fixed training set size criterion, some version of boosting is best. We also compare (for a fixed training set size) boosting to the following algorithms: optimal margin classifiers, tangent distance, local learning, k-nearest neighbor, and a large weight-sharing network, with the boosting algorithm showing the best performance.
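One of the ensemble methods compared above is a committee of independently trained networks combined by voting. As a rough illustration only (this is not the paper's code, and the toy classifiers below are hypothetical stand-ins for trained networks), a plurality-vote committee can be sketched as:

```python
# Illustrative sketch: plurality voting over an ensemble of
# independently trained classifiers (a "committee"). The member
# classifiers here are hypothetical toy functions, not the
# neural networks used in the paper.
from collections import Counter

def committee_predict(classifiers, x):
    """Return the label chosen by a plurality vote of the ensemble."""
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Three toy "classifiers" labeling an integer as 'even'/'odd';
# one is deliberately broken so the vote can correct it.
clf_a = lambda x: 'even' if x % 2 == 0 else 'odd'
clf_b = lambda x: 'even' if x % 2 == 0 else 'odd'
clf_noisy = lambda x: 'odd'  # always predicts 'odd'

ensemble = [clf_a, clf_b, clf_noisy]
print(committee_predict(ensemble, 4))  # the two correct members outvote the noisy one
```

Boosting differs from this committee scheme in that later members are trained on a filtered subset of patterns (those the earlier members handle poorly), rather than each member being trained independently on the full set.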

Original language: English (US)
Title of host publication: Proceedings of the 11th International Conference on Machine Learning, ICML 1994
Editors: William W. Cohen, Haym Hirsh
Publisher: Morgan Kaufmann Publishers, Inc.
Pages: 53-61
Number of pages: 9
ISBN (Electronic): 1558603352, 9781558603356
State: Published - 1994
Event: 11th International Conference on Machine Learning, ICML 1994 - New Brunswick, United States
Duration: Jul 10, 1994 - Jul 13, 1994

Publication series

Name: Proceedings of the 11th International Conference on Machine Learning, ICML 1994

Conference

Conference: 11th International Conference on Machine Learning, ICML 1994
Country/Territory: United States
City: New Brunswick
Period: 7/10/94 - 7/13/94

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Software
  • Theoretical Computer Science
  • Artificial Intelligence
