Implicit regularization of random feature models

Arthur Jacot, Berfin Şimşek, Francesco Spadaro, Clément Hongler, Franck Gabriel

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Random Feature (RF) models are used as efficient parametric approximations of kernel methods. We investigate, by means of random matrix theory, the connection between Gaussian RF models and Kernel Ridge Regression (KRR). For a Gaussian RF model with P features, N data points, and a ridge λ, we show that the average (i.e. expected) RF predictor is close to a KRR predictor with an effective ridge λ̃. We show that λ̃ > λ and that λ̃ decreases monotonically to λ as P grows, thus revealing the implicit regularization effect of finite RF sampling. We then compare the risk (i.e. test error) of the λ̃-KRR predictor with the average risk of the λ-RF predictor and obtain a precise and explicit bound on their difference. Finally, we empirically find an extremely good agreement between the test errors of the average λ-RF predictor and of the λ̃-KRR predictor.
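The following is a minimal numerical sketch of the phenomenon the abstract describes, not the paper's construction: it uses random Fourier features (Rahimi-Recht) as a stand-in for the Gaussian features analyzed in the paper, and it recovers the effective ridge by a simple grid search rather than by the paper's random-matrix-theory characterization. All parameter values and helper names are illustrative.

```python
# Sketch: average the lambda-ridge RF predictor over many feature samplings
# and check that it is best matched by KRR with a LARGER effective ridge.
import numpy as np

rng = np.random.default_rng(0)
N, P, lam, gamma = 40, 60, 1e-3, 1.0   # data points, features, ridge, RBF width

# Toy 1-D regression data.
x = np.sort(rng.uniform(-3, 3, size=(N, 1)), axis=0)
y = np.sin(2 * x[:, 0]) + 0.1 * rng.standard_normal(N)
x_test = np.linspace(-3, 3, 200)[:, None]

def rbf_kernel(a, b):
    d2 = (a[:, None, 0] - b[None, :, 0]) ** 2
    return np.exp(-gamma * d2)

def krr_predict(ridge):
    # Kernel ridge regression predictor on the test grid.
    K = rbf_kernel(x, x)
    alpha = np.linalg.solve(K + ridge * np.eye(N), y)
    return rbf_kernel(x_test, x) @ alpha

def rf_predict(seed):
    # Ridge regression on P random Fourier features, whose inner product
    # approximates the RBF kernel in expectation: E[phi(a).phi(b)] ~ k(a, b).
    r = np.random.default_rng(seed)
    w = r.normal(scale=np.sqrt(2 * gamma), size=(1, P))
    b = r.uniform(0, 2 * np.pi, size=P)
    phi = lambda z: np.sqrt(2.0 / P) * np.cos(z @ w + b)
    F = phi(x)
    theta = np.linalg.solve(F.T @ F + lam * np.eye(P), F.T @ y)
    return phi(x_test) @ theta

# Average RF predictor over many independent feature samplings.
avg_rf = np.mean([rf_predict(s) for s in range(200)], axis=0)

# Grid-search the ridge whose KRR predictor is closest to the average
# RF predictor; per the paper's result it should come out above lam.
grid = np.geomspace(lam, 1.0, 200)
errs = [np.linalg.norm(avg_rf - krr_predict(r)) for r in grid]
lam_eff = grid[int(np.argmin(errs))]
print(f"bare ridge {lam:.1e} -> fitted effective ridge {lam_eff:.1e}")
```

If the sketch behaves as the theorem predicts, the fitted ridge exceeds λ at small P and shrinks toward λ as P grows, matching the monotone convergence λ̃ → λ stated in the abstract.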

Original language: English (US)
Title of host publication: 37th International Conference on Machine Learning, ICML 2020
Editors: Hal Daumé III, Aarti Singh
Publisher: International Machine Learning Society (IMLS)
Pages: 4581-4590
Number of pages: 10
ISBN (Electronic): 9781713821120
State: Published - 2020
Event: 37th International Conference on Machine Learning, ICML 2020 - Virtual, Online
Duration: Jul 13, 2020 - Jul 18, 2020

Publication series

Name: 37th International Conference on Machine Learning, ICML 2020
Volume: PartF168147-6

Conference

Conference: 37th International Conference on Machine Learning, ICML 2020
City: Virtual, Online
Period: 7/13/20 - 7/18/20

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
