Ensemble Nyström method

Sanjiv Kumar, Mehryar Mohri, Ameet Talwalkar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Low-rank approximation of kernel matrices is a crucial technique for scaling kernel methods to very large data sets reaching or exceeding millions of instances. We introduce a new family of algorithms, ensemble Nyström algorithms, based on mixtures of Nyström approximations, which yield more accurate low-rank approximations than the standard Nyström method. We give a detailed study of variants of these algorithms based on simple averaging, an exponential weight method, or regression-based methods. We also present a theoretical analysis of these algorithms, including novel error bounds guaranteeing a better convergence rate than the standard Nyström method. Finally, we report results of extensive experiments with several data sets containing up to 1M points, demonstrating a significant improvement over the standard Nyström approximation.
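
As a rough illustration of the simple-averaging variant sketched in the abstract, the snippet below combines several standard Nyström approximations with uniform mixture weights. This is a minimal NumPy sketch under assumed conventions (uniform column sampling, hypothetical helper names), not the authors' implementation; the exponential-weight and regression-based variants would replace the uniform weights with learned ones.

```python
import numpy as np

def nystrom_approx(K, m, rng):
    """One standard Nystrom approximation of an n x n PSD kernel matrix K,
    built from m uniformly sampled columns (illustrative sketch)."""
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = K[:, idx]                 # n x m block of sampled columns
    W = C[idx, :]                 # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T   # rank-m approximation C W^+ C^T

def ensemble_nystrom(K, m, p, weights=None, seed=0):
    """Mixture of p Nystrom approximations. By default uses simple averaging
    (mu_r = 1/p); learned weights could be passed in instead."""
    rng = np.random.default_rng(seed)
    if weights is None:
        weights = np.full(p, 1.0 / p)
    approx = np.zeros_like(K, dtype=float)
    for mu in weights:
        approx += mu * nystrom_approx(K, m, rng)
    return approx
```

Each base learner uses an independent column sample, so the ensemble trades a modest increase in computation for a more accurate low-rank approximation than a single Nyström run with the same m.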

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference
Publisher: Neural Information Processing Systems
Pages: 1060-1068
Number of pages: 9
ISBN (Print): 9781615679119
State: Published - 2009
Event: 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009 - Vancouver, BC, Canada
Duration: Dec 7, 2009 - Dec 10, 2009

Publication series

Name: Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference

Other

Other: 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009
Country/Territory: Canada
City: Vancouver, BC
Period: 12/7/09 - 12/10/09

ASJC Scopus subject areas

  • Information Systems
