Bayesian nonparametric kernel-learning

Junier B. Oliva, Avinava Dubey, Andrew G. Wilson, Barnabás Póczos, Jeff Schneider, Eric P. Xing

Research output: Contribution to conference › Paper › peer-review

Abstract

Kernel methods are ubiquitous tools in machine learning. However, there is often little reason for the common practice of selecting a kernel a priori. Even if a universal approximating kernel is selected, the quality of the finite sample estimator may be greatly affected by the choice of kernel. Furthermore, when directly applying kernel methods, one typically needs to compute an N × N Gram matrix of pairwise kernel evaluations to work with a dataset of N instances. The computation of this Gram matrix precludes the direct application of kernel methods on large datasets, and makes kernel learning especially difficult. In this paper we introduce Bayesian nonparametric kernel-learning (BaNK), a generic, data-driven framework for scalable learning of kernels. BaNK places a nonparametric prior on the spectral distribution of random frequencies, allowing it to both learn kernels and scale to large datasets. We show that this framework can be used for large-scale regression and classification tasks. Furthermore, we show that BaNK outperforms several other scalable approaches for kernel learning on a variety of real-world datasets.
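
To make the mechanism in the abstract concrete, below is a minimal sketch, not the authors' implementation: random Fourier features whose frequencies are drawn from a fixed two-component Gaussian mixture, standing in for the nonparametric spectral prior that BaNK would infer from data, are used for ridge regression. The component means and scales, the toy data, and all other settings are illustrative assumptions.

    # Minimal sketch (assumptions, not the paper's method): approximate a
    # shift-invariant kernel with random Fourier features whose frequencies
    # come from a mixture spectral distribution, so the kernel is determined
    # by that mixture rather than fixed a priori.
    import numpy as np

    rng = np.random.default_rng(0)

    def random_features(X, W, b):
        """Map X (n, d) to cosine features using frequencies W (D, d) and phases b (D,)."""
        D = W.shape[0]
        return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

    # Toy 1-D regression data (illustrative)
    n, d, D = 200, 1, 256
    X = rng.uniform(-3, 3, size=(n, d))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)

    # Frequencies drawn from a fixed 2-component Gaussian mixture over the
    # spectral domain (a crude surrogate for inferring the spectral density).
    comp = rng.integers(0, 2, size=D)
    means = np.array([[0.0], [3.0]])      # assumed component means
    scales = np.array([1.0, 0.5])         # assumed component std devs
    W = means[comp] + scales[comp, None] * rng.standard_normal((D, d))
    b = rng.uniform(0, 2 * np.pi, size=D)

    # Ridge regression in the random-feature space approximates kernel ridge
    # regression without ever forming the N x N Gram matrix.
    Phi = random_features(X, W, b)
    lam = 1e-3
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ y)

    X_test = np.linspace(-3, 3, 100)[:, None]
    y_pred = random_features(X_test, W, b) @ w
    print("train MSE:", np.mean((Phi @ w - y) ** 2))
    print("test MSE :", np.mean((y_pred - np.sin(3 * X_test[:, 0])) ** 2))

In BaNK itself the mixture over frequencies is not fixed but inferred jointly with the model, which is what lets the kernel adapt to the data while still avoiding the N × N Gram matrix.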

Original language: English (US)
Pages: 1078-1086
Number of pages: 9
State: Published - 2016
Event: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 - Cadiz, Spain
Duration: May 9 2016 - May 11 2016

Conference

Conference: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016
Country/Territory: Spain
City: Cadiz
Period: 5/9/16 - 5/11/16

ASJC Scopus subject areas

  • Artificial Intelligence
  • Statistics and Probability
