Sparse grid classifiers as base learners for AdaBoost

Alexander Heinecke, Benjamin Peherstorfer, Dirk Pflüger, Zhongwen Song

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We consider a classification method based on sparse grids which scales only linearly in the number of data points and is thus well-suited for huge amounts of data. In order to obtain competitive results, such sparse grid classifiers are usually enhanced by locally refining the underlying regular sparse grid. However, in order to parallelize the corresponding adaptive algorithms, a thorough knowledge of the hardware is necessary. Instead of improving the performance by refining the sparse grid, we construct a team of classifiers relying just on regular sparse grids and employ them as base learners within AdaBoost. Our examples with synthetic and real-world datasets show that we can achieve results similar to or better than those of locally refined sparse grids or libSVM, with respect to both runtime and accuracy.
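The abstract's core idea — combining weak base learners into a weighted team via AdaBoost — can be illustrated with a minimal sketch of the classic (discrete) AdaBoost loop. The sparse grid base learner from the paper is not reproduced here; as a stand-in, a simple decision stump plays the role of the base learner, which is an assumption for illustration only.

```python
import numpy as np

def train_stump(X, y, w):
    """Weak learner: pick the axis-aligned threshold stump minimizing
    the *weighted* 0/1 error under the current AdaBoost weights w."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        # Include a threshold below the data range so constant
        # predictors are available as degenerate stumps.
        thresholds = np.concatenate(([X[:, j].min() - 1.0], np.unique(X[:, j])))
        for t in thresholds:
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, t, pol)
    return best, best_err

def stump_predict(stump, X):
    j, t, pol = stump
    return np.where(pol * (X[:, j] - t) >= 0, 1, -1)

def adaboost(X, y, rounds=10):
    """Discrete AdaBoost: reweight the data each round so the next
    base learner focuses on previously misclassified points."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial weights
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-12)        # guard against division by zero
        alpha = 0.5 * np.log((1.0 - err) / err)   # learner's vote weight
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)            # up-weight mistakes
        w /= w.sum()                              # renormalize
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    # Final classifier: sign of the alpha-weighted vote of the team.
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)
```

In the paper's setting, the stump would be replaced by a regular sparse grid classifier trained on the weighted data; since each base learner uses only a regular (non-adaptive) grid, the expensive, hardware-sensitive refinement machinery is avoided while the ensemble recovers the lost flexibility.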

Original language: English (US)
Title of host publication: Proceedings of the 2012 International Conference on High Performance Computing and Simulation, HPCS 2012
Pages: 161-166
Number of pages: 6
DOIs
State: Published - 2012
Event: 2012 10th Annual International Conference on High Performance Computing and Simulation, HPCS 2012 - Madrid, Spain
Duration: Jul 2 2012 - Jul 6 2012

Publication series

Name: Proceedings of the 2012 International Conference on High Performance Computing and Simulation, HPCS 2012

Other

Other: 2012 10th Annual International Conference on High Performance Computing and Simulation, HPCS 2012
Country: Spain
City: Madrid
Period: 7/2/12 - 7/6/12

Keywords

  • AdaBoost
  • classification
  • data mining
  • parallelization
  • shared memory platforms

ASJC Scopus subject areas

  • Modeling and Simulation

