Efficient reduced-rank methods for Gaussian processes with eigenfunction expansions

Philip Greengard, Michael O’Neil

Research output: Contribution to journal › Article › peer-review


In this work, we introduce a reduced-rank algorithm for Gaussian process regression. Our numerical scheme converts a Gaussian process on a user-specified interval to its Karhunen–Loève expansion, the L2-optimal reduced-rank representation. Numerical evaluation of the Karhunen–Loève expansion is performed once during precomputation and involves computing a numerical eigendecomposition of an integral operator whose kernel is the covariance function of the Gaussian process. The Karhunen–Loève expansion is independent of observed data and depends only on the covariance kernel and the size of the interval on which the Gaussian process is defined. The scheme of this paper does not require translation invariance of the covariance kernel. We also introduce a class of fast algorithms for Bayesian fitting of hyperparameters and demonstrate the performance of our algorithms with numerical experiments in one and two dimensions. Extensions to higher dimensions are mathematically straightforward but suffer from the standard curses of high dimensions.
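The abstract's precomputation step, computing a numerical eigendecomposition of the integral operator whose kernel is the covariance function, can be illustrated with a standard Nyström-type discretization. The sketch below is not the authors' code: it assumes a Gauss–Legendre quadrature on the interval, a squared-exponential kernel as a stand-in covariance (the scheme itself does not require translation invariance), and the hypothetical helper names `kl_expansion` and `squared_exponential`.

```python
import numpy as np

def squared_exponential(x, y, ell=0.5):
    # Example covariance kernel (an assumption for illustration only).
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * ell ** 2))

def kl_expansion(kernel, a, b, n=200, rank=20):
    # Discretize the integral operator on [a, b] with Gauss-Legendre quadrature.
    t, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * t + 0.5 * (a + b)   # nodes mapped to [a, b]
    w = 0.5 * (b - a) * w                    # weights rescaled accordingly
    K = kernel(x, x)
    sw = np.sqrt(w)
    # Symmetrized eigenproblem for W^(1/2) K W^(1/2), so eigh applies.
    A = sw[:, None] * K * sw[None, :]
    lam, V = np.linalg.eigh(A)
    idx = np.argsort(lam)[::-1][:rank]       # keep the top-rank eigenpairs
    lam = lam[idx]
    # Eigenfunction values at the nodes, orthonormal w.r.t. the weights w.
    phi = V[:, idx] / sw[:, None]
    return x, w, lam, phi

x, w, lam, phi = kl_expansion(squared_exponential, 0.0, 1.0)
print(lam[:3])  # leading KL eigenvalues, in decreasing order
```

As the abstract notes, this step depends only on the covariance kernel and the interval, not on observed data, so the truncated basis `phi` can be reused across regressions once computed.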

Original language: English (US)
Article number: 94
Journal: Statistics and Computing
Issue number: 5
State: Published - Oct 2022


Keywords

  • Eigenfunction expansions
  • Gaussian processes
  • Karhunen–Loève expansions
  • Reduced-rank regression

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Computational Theory and Mathematics


