Random Fourier features for kernel ridge regression: Approximation bounds and statistical guarantees

Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Random Fourier features is one of the most popular techniques for scaling up kernel methods, such as kernel ridge regression. However, despite impressive empirical results, the statistical properties of random Fourier features are still not well understood. In this paper we take steps toward filling this gap. Specifically, we approach random Fourier features from a spectral matrix approximation point of view, give tight bounds on the number of Fourier features required to achieve a spectral approximation, and show how spectral matrix approximation bounds imply statistical guarantees for kernel ridge regression.
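
    The abstract summarizes the approach at a high level. The following is a minimal sketch, not taken from the paper, of the standard random Fourier features construction for approximating kernel ridge regression with a Gaussian kernel. The function names (sample_rff, rff_map, krr_rff_fit_predict) and the parameter values (m, sigma, lam) are illustrative assumptions, not quantities specified by the authors.

    ```python
    import numpy as np

    def sample_rff(d, m, sigma, rng):
        """Sample frequencies and phases for an m-dimensional random Fourier
        feature map approximating the Gaussian kernel
        k(x, y) = exp(-||x - y||^2 / (2 * sigma**2))."""
        W = rng.normal(scale=1.0 / sigma, size=(d, m))  # frequencies drawn from the kernel's spectral density
        b = rng.uniform(0.0, 2.0 * np.pi, size=m)       # random phases
        return W, b

    def rff_map(X, W, b):
        """Map X (n x d) to Z (n x m) so that Z @ Z.T approximates the kernel matrix K."""
        m = W.shape[1]
        return np.sqrt(2.0 / m) * np.cos(X @ W + b)

    def krr_rff_fit_predict(X_train, y_train, X_test, m=500, sigma=1.0, lam=1e-2, seed=0):
        """Kernel ridge regression approximated by ridge regression in RFF space."""
        rng = np.random.default_rng(seed)
        W, b = sample_rff(X_train.shape[1], m, sigma, rng)
        Z = rff_map(X_train, W, b)
        # Ridge solution in feature space: (Z^T Z + lam * I) w = Z^T y.
        w = np.linalg.solve(Z.T @ Z + lam * np.eye(m), Z.T @ y_train)
        return rff_map(X_test, W, b) @ w
    ```

    The paper's analysis concerns how large m must be for Z @ Z.T to spectrally approximate the exact kernel matrix, and what that implies for the statistical performance of this kind of approximate kernel ridge regression.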

    Original language: English (US)
    Title of host publication: 34th International Conference on Machine Learning, ICML 2017
    Publisher: International Machine Learning Society (IMLS)
    Pages: 370-404
    Number of pages: 35
    ISBN (Electronic): 9781510855144
    State: Published - 2017
    Event: 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia
    Duration: Aug 6, 2017 - Aug 11, 2017

    Publication series

    Name: 34th International Conference on Machine Learning, ICML 2017
    Volume: 1

    Other

    Other: 34th International Conference on Machine Learning, ICML 2017
    Country/Territory: Australia
    City: Sydney
    Period: 8/6/17 - 8/11/17

    ASJC Scopus subject areas

    • Computational Theory and Mathematics
    • Human-Computer Interaction
    • Software
