Abstract
In this work, we introduce a reduced-rank algorithm for Gaussian process regression. Our numerical scheme converts a Gaussian process on a user-specified interval to its Karhunen–Loève expansion, the L2-optimal reduced-rank representation. Numerical evaluation of the Karhunen–Loève expansion is performed once during precomputation and involves computing a numerical eigendecomposition of an integral operator whose kernel is the covariance function of the Gaussian process. The Karhunen–Loève expansion is independent of observed data and depends only on the covariance kernel and the size of the interval on which the Gaussian process is defined. The scheme of this paper does not require translation invariance of the covariance kernel. We also introduce a class of fast algorithms for Bayesian fitting of hyperparameters and demonstrate the performance of our algorithms with numerical experiments in one and two dimensions. Extensions to higher dimensions are mathematically straightforward but suffer from the standard curse of dimensionality.
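The precomputation described above — a numerical eigendecomposition of the integral operator whose kernel is the covariance function — can be sketched with a standard Nyström-type discretization. The sketch below is an illustration under assumed details (Gauss–Legendre quadrature, a squared-exponential kernel, and the helper name `kl_expansion` are choices made here, not taken from the paper): the kernel matrix is symmetrized by the square roots of the quadrature weights so that a symmetric eigensolver recovers approximate Karhunen–Loève eigenvalues and eigenfunction values at the nodes.

```python
import numpy as np

def kl_expansion(kernel, a, b, n):
    """Approximate the leading Karhunen-Loeve eigenpairs of a covariance
    kernel on [a, b] via Gauss-Legendre quadrature (Nystrom method).

    Returns quadrature nodes x, weights w, eigenvalues lam (descending),
    and Phi with Phi[i, j] = phi_j(x_i), the j-th eigenfunction at node i.
    """
    # Gauss-Legendre nodes/weights mapped from [-1, 1] to [a, b]
    x, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * x + 0.5 * (b + a)
    w = 0.5 * (b - a) * w
    # Symmetrized discretization: A = W^{1/2} K W^{1/2}
    K = kernel(x[:, None], x[None, :])
    sw = np.sqrt(w)
    A = sw[:, None] * K * sw[None, :]
    lam, V = np.linalg.eigh(A)
    order = np.argsort(lam)[::-1]          # sort eigenvalues descending
    lam, V = lam[order], V[:, order]
    # Undo the weight scaling: eigenfunctions orthonormal in L2([a, b])
    Phi = V / sw[:, None]
    return x, w, lam, Phi

# Example: squared-exponential covariance kernel on [0, 1]
se_kernel = lambda s, t: np.exp(-0.5 * (s - t) ** 2 / 0.1 ** 2)
x, w, lam, Phi = kl_expansion(se_kernel, 0.0, 1.0, 60)
```

By construction the discrete eigenvectors satisfy `Phi.T @ diag(w) @ Phi = I`, the quadrature analogue of L2-orthonormality, and the eigenvalues sum (approximately) to the integral of the kernel's diagonal over the interval. Truncating to the eigenpairs with non-negligible `lam` yields the reduced-rank representation used for regression.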
| Original language | English (US) |
| --- | --- |
| Article number | 94 |
| Journal | Statistics and Computing |
| Volume | 32 |
| Issue number | 5 |
| DOIs | |
| State | Published - Oct 2022 |
Keywords
- Eigenfunction expansions
- Gaussian processes
- Karhunen–Loève expansions
- Reduced-rank regression
ASJC Scopus subject areas
- Theoretical Computer Science
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Computational Theory and Mathematics