Kernel Interpolation for Scalable Online Gaussian Processes

Samuel Stanton, Wesley J. Maddox, Ian Delbridge, Andrew Gordon Wilson

Research output: Contribution to journal › Conference article › peer-review

Abstract

Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black box optimization, where we need to update a posterior distribution as we acquire data in a sequential fashion. However, updating a GP posterior to accommodate even a single new observation after having observed n points incurs at least O(n) computations in the exact setting. We show how to use structured kernel interpolation to efficiently reuse computations for constant-time O(1) online updates with respect to the number of points n, while retaining exact inference. We demonstrate the promise of our approach in a range of online regression and classification settings, Bayesian optimization, and active sampling to reduce error in malaria incidence forecasting. Code is available at https://github.com/wjmaddox/online_gp.
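The constant-time update described in the abstract can be illustrated with a minimal, hypothetical sketch. Under structured kernel interpolation (SKI), the kernel matrix is approximated as K ≈ W K_ZZ Wᵀ over a fixed grid of m inducing points Z, so the posterior mean depends on the data only through the fixed-size caches A = WᵀW and b = Wᵀy, each of which absorbs a new observation at cost independent of n. Everything below (the 1-D grid, linear interpolation weights, function names) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

# Illustrative sketch of the SKI caching idea (not the authors' code).
# K_XX is approximated as W K_ZZ W^T, with Z a fixed grid of m points
# and W a sparse matrix of interpolation weights. Online updates touch
# only the m x m cache A = W^T W and the m-vector b = W^T y.

rng = np.random.default_rng(0)
m = 20                                  # grid size (fixed up front)
Z = np.linspace(0.0, 1.0, m)            # 1-D inducing grid (assumption)
noise = 0.1                             # observation noise variance

def rbf(a, b, ls=0.2):
    """Squared-exponential kernel between 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

K_ZZ = rbf(Z, Z) + 1e-6 * np.eye(m)     # grid kernel, computed once

def interp_weights(x):
    """Linear interpolation weights of x onto Z (two nonzeros)."""
    w = np.zeros(m)
    j = min(np.searchsorted(Z, x), m - 1)
    i = max(j - 1, 0)
    if i == j:
        w[j] = 1.0
    else:
        t = (x - Z[i]) / (Z[j] - Z[i])
        w[i], w[j] = 1.0 - t, t
    return w

# Caches of constant size, regardless of how many points are observed.
A = np.zeros((m, m))                    # W^T W
b = np.zeros(m)                         # W^T y

def observe(x, y):
    """Absorb one observation: a rank-1 cache refresh, O(1) in n."""
    global A, b
    w = interp_weights(x)
    A += np.outer(w, w)
    b += y * w

def posterior_mean(x_star):
    """Approximate GP posterior mean via Woodbury on the SKI kernel:
    w*^T K_ZZ (b - A (noise * K_ZZ^{-1} + A)^{-1} b) / noise."""
    w_star = interp_weights(x_star)
    inner = noise * np.linalg.inv(K_ZZ) + A
    v = (b - A @ np.linalg.solve(inner, b)) / noise
    return w_star @ (K_ZZ @ v)

# Stream 50 observations of f(x) = sin(2 pi x), one at a time.
for x in rng.uniform(0, 1, 50):
    observe(x, np.sin(2 * np.pi * x))

print(posterior_mean(0.25))             # should be close to sin(pi/2) = 1
```

Each call to `observe` costs O(m²) at most (O(1) with sparse weights), independent of how many points have been seen, which is the essence of the constant-time online update; the paper's method additionally exploits grid structure in K_ZZ and retains exact inference under the SKI kernel.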

Original language: English (US)
Pages (from-to): 3133-3141
Number of pages: 9
Journal: Proceedings of Machine Learning Research
Volume: 130
State: Published - 2021
Event: 24th International Conference on Artificial Intelligence and Statistics, AISTATS 2021 - Virtual, Online, United States
Duration: Apr 13, 2021 - Apr 15, 2021

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

