Learning Integral Representations of Gaussian Processes

21 Feb 2018 · Zilong Tan, Sayan Mukherjee

We propose a representation of Gaussian processes (GPs) based on powers of the integral operator defined by a kernel function; we call these stochastic processes integral Gaussian processes (IGPs). Sample paths from an IGP are functions contained within the reproducing kernel Hilbert space (RKHS) defined by the kernel function, in contrast to sample paths from a standard GP, which are not functions within the RKHS. We develop computationally efficient non-parametric regression models based on IGPs. The main innovation in our regression algorithm is the construction of a low-dimensional subspace that captures the information most relevant to explaining variation in the response; we compute this subspace using ideas from supervised dimension reduction. The proposed construction significantly reduces both the computational complexity of estimating kernel hyper-parameters and the prediction variance.
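The abstract does not spell out the algorithm, so the following is only a minimal illustrative sketch of the general flavor it describes: GP-style kernel regression whose dual weights are constrained to a low-dimensional subspace chosen in a supervised, sliced-inverse-regression-like way. The kernel choice, the function names (`rbf_kernel`, `supervised_basis`, `subspace_gp_fit`, `subspace_gp_predict`), and all numerical details are assumptions for illustration, not the paper's IGP construction.

```python
# Illustrative sketch only -- NOT the paper's IGP algorithm.
# Shows kernel regression restricted to a low-dimensional subspace of the
# kernel's column space, with the subspace chosen using the response.
import numpy as np


def rbf_kernel(X, Z, lengthscale=1.0):
    """Squared-exponential kernel matrix between rows of X and rows of Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)


def supervised_basis(K, y, r):
    """SIR-style supervised basis: slice y into r bins sorted by value and
    average the corresponding kernel columns within each slice (an assumed
    stand-in for the paper's supervised dimension reduction step)."""
    order = np.argsort(y)
    slices = np.array_split(order, r)
    B = np.stack([K[:, idx].mean(axis=1) for idx in slices], axis=1)  # n x r
    Q, _ = np.linalg.qr(B)  # orthonormalize for numerical stability
    return Q


def subspace_gp_fit(X, y, r=5, lengthscale=1.0, noise=1e-2):
    """Galerkin-style projection of the GP system (K + noise*I) alpha = y
    onto the r-dimensional subspace spanned by the supervised basis."""
    K = rbf_kernel(X, X, lengthscale)            # n x n Gram matrix
    V = supervised_basis(K, y, r)                # n x r orthonormal basis
    A = V.T @ K @ V + noise * np.eye(r)          # reduced r x r system
    alpha = V @ np.linalg.solve(A, V.T @ y)      # dual weights in the subspace
    return alpha


def subspace_gp_predict(alpha, X_train, X_test, lengthscale=1.0):
    """Predictive mean at the test points given the fitted dual weights."""
    return rbf_kernel(X_test, X_train, lengthscale) @ alpha


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    alpha = subspace_gp_fit(X, y, r=5)
    X_test = np.linspace(-3, 3, 5)[:, None]
    print(subspace_gp_predict(alpha, X, X_test))
```

The point of the sketch is the cost structure: after projection, the linear solve is r x r rather than n x n, which is the kind of computational saving in hyper-parameter estimation and prediction that the abstract claims for the proposed construction.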
