Product kernel interpolation for scalable Gaussian processes

Jacob R. Gardner, Geoff Pleiss, Ruihan Wu, Kilian Q. Weinberger, Andrew Gordon Wilson

Research output: Contribution to conference › Paper

Abstract

Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM-based learning that exploits product kernel structure. We demonstrate that this technique is broadly applicable, resulting in linear rather than exponential runtime with dimension for SKI, as well as state-of-the-art asymptotic complexity for multi-task GPs.
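To make the core idea concrete: a product kernel evaluated on a Cartesian grid factorizes into a Kronecker product of small per-dimension kernel matrices, and the Kronecker identity (K1 ⊗ K2) vec(V) = vec(K2 V K1ᵀ) turns an O(n²) MVM into a much cheaper one. The sketch below is an illustration of that identity only (it is not the authors' code; the kernel, lengthscale, and grid sizes are arbitrary choices for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 30, 40  # grid sizes per dimension; n = n1 * n2 total points
x1 = rng.uniform(size=n1)
x2 = rng.uniform(size=n2)

def rbf(a, b, lengthscale=0.5):
    """Squared-exponential (RBF) kernel matrix between 1-D inputs a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Per-dimension kernel matrices; the full product kernel on the grid
# is K = K1 kron K2, which we never need to form explicitly.
K1, K2 = rbf(x1, x1), rbf(x2, x2)
v = rng.standard_normal(n1 * n2)

# Fast MVM via (K1 kron K2) vec(V) = vec(K2 @ V @ K1.T),
# where vec(.) stacks columns, i.e. Fortran ('F') order in NumPy.
V = v.reshape((n2, n1), order="F")
fast = (K2 @ V @ K1.T).reshape(-1, order="F")

# Dense reference MVM, O(n^2) memory and time -- only for checking.
dense = np.kron(K1, K2) @ v
assert np.allclose(fast, dense)
```

The fast path costs O(n1·n2·(n1 + n2)) flops and never materializes the n × n kernel matrix, which is the kind of structure iterative MVM-based GP inference exploits.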

Original language: English (US)
Pages: 1407-1416
Number of pages: 10
State: Published - 2018
Event: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018 - Playa Blanca, Lanzarote, Canary Islands, Spain
Duration: Apr 9 2018 - Apr 11 2018

Conference

Conference: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018
Country: Spain
City: Playa Blanca, Lanzarote, Canary Islands
Period: 4/9/18 - 4/11/18

ASJC Scopus subject areas

  • Statistics and Probability
  • Artificial Intelligence

Cite this
Gardner, J. R., Pleiss, G., Wu, R., Weinberger, K. Q., & Wilson, A. G. (2018). Product kernel interpolation for scalable Gaussian processes. 1407-1416. Paper presented at 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018, Playa Blanca, Lanzarote, Canary Islands, Spain.