A la carte – learning fast kernels

Zichao Yang, Alexander J. Smola, Le Song, Andrew Gordon Wilson

Research output: Contribution to journal › Conference article › peer-review

Abstract

Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, lightly parametrized and general purpose kernel learning methods, derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.
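As context for the abstract, the sketch below illustrates the general idea of a basis function (random Fourier feature) kernel expansion in which groups of spectral frequencies share a learnable scale. It is only a minimal illustration using a dense Gaussian frequency matrix, which costs O(md) per feature map; the paper's Fastfood construction replaces this dense matrix with structured (diagonal and Hadamard) factors to reach O(m log d) time and O(m) memory. The names `fourier_features`, `scales`, and `n_groups` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fourier_features(X, W, b):
    """Feature map phi(x) = sqrt(2/m) * cos(W x + b); inner products of
    these features approximate a shift-invariant kernel."""
    m = W.shape[0]
    return np.sqrt(2.0 / m) * np.cos(X @ W.T + b)

rng = np.random.default_rng(0)
d, m, n_groups = 4, 64, 4              # input dims, basis functions, frequency groups
X = rng.normal(size=(100, d))

# Base spectral frequencies: Gaussian draws correspond to an RBF-like kernel.
W0 = rng.normal(size=(m, d))
b = rng.uniform(0.0, 2 * np.pi, size=m)

# Hypothetical per-group scales: each group of m / n_groups frequencies shares
# one scale parameter, giving a lightly parametrized family of kernels.
# In practice these scales would be learned, e.g. by marginal likelihood.
scales = np.repeat(np.array([0.5, 1.0, 2.0, 4.0]), m // n_groups)
W = W0 * scales[:, None]

Phi = fourier_features(X, W, b)
K_approx = Phi @ Phi.T                 # approximate kernel matrix
```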

Original language: English (US)
Pages (from-to): 1098-1106
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 38
State: Published - 2015
Event: 18th International Conference on Artificial Intelligence and Statistics, AISTATS 2015 - San Diego, United States
Duration: May 9, 2015 - May 12, 2015

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Statistics and Probability
  • Artificial Intelligence
