Abstract
Kernel methods have great promise for learning rich statistical representations of large modern datasets. However, compared to neural networks, kernel methods have been perceived as lacking in scalability and flexibility. We introduce a family of fast, flexible, lightly parametrized and general purpose kernel learning methods, derived from Fastfood basis function expansions. We provide mechanisms to learn the properties of groups of spectral frequencies in these expansions, which require only O(m log d) time and O(m) memory, for m basis functions and d input dimensions. We show that the proposed methods can learn a wide class of kernels, outperforming the alternatives in accuracy, speed, and memory consumption.
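The abstract refers to basis function expansions built from sampled spectral frequencies. As background, the sketch below shows the standard random Fourier feature construction (Rahimi & Recht style) that such expansions generalize: the plain version costs O(md) per input, which Fastfood reduces to O(m log d). This is an illustrative sketch, not the authors' implementation; all function names here are our own.

```python
import numpy as np

def random_fourier_features(X, m=4096, gamma=0.5, seed=0):
    """Map X of shape (n, d) to m random Fourier features whose inner
    products approximate the RBF kernel exp(-gamma * ||x - y||^2).

    Plain O(m*d) construction; Fastfood replaces the dense Gaussian
    matrix W with structured (Hadamard-based) factors to get O(m log d).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Spectral frequencies sampled from the Gaussian kernel's spectral
    # density, which is N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, m))
    b = rng.uniform(0.0, 2 * np.pi, size=m)
    return np.sqrt(2.0 / m) * np.cos(X @ W + b)

# Feature-map inner products approximate the exact RBF Gram matrix.
X = np.random.default_rng(1).normal(size=(5, 3))
Z = random_fourier_features(X, m=4096, gamma=0.5, seed=2)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
```

Learning the kernel, as in the paper, then amounts to adapting the distribution of the frequencies W (e.g., per-group scalings) rather than fixing it in advance.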
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1098-1106 |
| Number of pages | 9 |
| Journal | Journal of Machine Learning Research |
| Volume | 38 |
| State | Published - 2015 |
| Event | 18th International Conference on Artificial Intelligence and Statistics, AISTATS 2015 - San Diego, United States |
| Duration | May 9 2015 → May 12 2015 |
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Statistics and Probability
- Artificial Intelligence