1 code implementation • NeurIPS 2019 • Aaron Klein, Zhenwen Dai, Frank Hutter, Neil Lawrence, Javier Gonzalez
Despite the recent progress in hyperparameter optimization (HPO), available benchmarks that resemble real-world scenarios consist of only a few, very large problem instances that are expensive to solve.
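A tabular benchmark side-steps that cost by pre-evaluating every configuration once, so a benchmark query becomes a table lookup instead of a training run. The following is a hypothetical toy sketch of the idea (the grid, the objective values, and the `query` helper are invented for illustration, not the paper's benchmark):

```python
import itertools

# Hypothetical search space; in a real tabular benchmark every
# configuration in the grid has been trained and evaluated up front.
grid = {"lr": [1e-3, 1e-2, 1e-1], "width": [16, 64, 256]}

# Synthetic "validation error" per configuration (stand-in for stored results).
table = {cfg: hash(cfg) % 100 / 100 for cfg in itertools.product(*grid.values())}

def query(cfg):
    # O(1) lookup replaces an expensive model training run.
    return table[cfg]

best = min(table, key=table.get)
```

An HPO method can then be benchmarked cheaply by letting it call `query` instead of training models.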
no code implementations • 24 Apr 2019 • Yu Chen, Tom Diethe, Neil Lawrence
Conventional models tend to forget the knowledge of previous tasks while learning a new task, a phenomenon known as catastrophic forgetting.
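The effect is easy to reproduce. Below is a minimal illustrative sketch (not the paper's method): a linear model trained with SGD on task A, then on a conflicting task B with no replay or regularization, loses its performance on task A:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two regression "tasks" with deliberately conflicting weight vectors.
X_a = rng.normal(size=(100, 5))
X_b = rng.normal(size=(100, 5))
w_a = rng.normal(size=5)
w_b = -w_a                      # task B directly contradicts task A
y_a = X_a @ w_a
y_b = X_b @ w_b

def sgd(w, X, y, steps=200, lr=0.05):
    for _ in range(steps):
        i = rng.integers(len(X))
        w = w - lr * (X[i] @ w - y[i]) * X[i]
    return w

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

w = sgd(np.zeros(5), X_a, y_a)          # learn task A
loss_a_before = mse(w, X_a, y_a)
w = sgd(w, X_b, y_b)                    # then learn task B, no replay
loss_a_after = mse(w, X_a, y_a)         # performance on A degrades
```

Continual-learning methods such as the one in this paper aim to keep `loss_a_after` close to `loss_a_before`.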
1 code implementation • 18 Mar 2019 • Kurt Cutajar, Mark Pullin, Andreas Damianou, Neil Lawrence, Javier González
Multi-fidelity methods are prominently used when cheaply-obtained, but possibly biased and noisy, observations must be effectively combined with limited or expensive true data in order to construct reliable models.
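As an illustrative sketch (a simple linear-correction surrogate, not the paper's multi-fidelity GP): fit the cheap, biased source on plentiful data, then learn a small correction from a handful of expensive, accurate observations:

```python
import numpy as np

f_hi = lambda x: np.sin(8 * x) * x                    # expensive "truth"
f_lo = lambda x: np.sin(8 * x) * x + 0.3 * x - 0.1    # cheap but biased

x_lo = np.linspace(0, 1, 50)                 # plentiful low-fidelity data
x_hi = np.array([0.1, 0.4, 0.6, 0.9])        # scarce high-fidelity data

def lo_interp(x):
    # Cheap surrogate of the low-fidelity response.
    return np.interp(x, x_lo, f_lo(x_lo))

# Learn a linear correction: y_hi ~ rho * f_lo(x) + b0 + b1 * x
A = np.column_stack([lo_interp(x_hi), np.ones_like(x_hi), x_hi])
coef, *_ = np.linalg.lstsq(A, f_hi(x_hi), rcond=None)

def surrogate(x):
    return coef[0] * lo_interp(x) + coef[1] + coef[2] * x

x_test = np.linspace(0, 1, 200)
err_lo = np.max(np.abs(lo_interp(x_test) - f_hi(x_test)))   # raw cheap model
err_mf = np.max(np.abs(surrogate(x_test) - f_hi(x_test)))   # corrected model
```

Four expensive evaluations suffice here because the bias is simple; multi-fidelity GPs generalize this by learning the cross-fidelity relationship with calibrated uncertainty.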
no code implementations • 12 Mar 2019 • Tom Diethe, Tom Borchert, Eno Thereska, Borja Balle, Neil Lawrence
This paper describes a reference architecture for self-maintaining systems that can learn continually, as data arrives.
no code implementations • ICML 2018 • Xiaoyu Lu, Javier Gonzalez, Zhenwen Dai, Neil Lawrence
We tackle the problem of optimizing a black-box objective function defined over a highly-structured input space.
no code implementations • 3 Jan 2018 • Mu Niu, Pokman Cheung, Lizhen Lin, Zhenwen Dai, Neil Lawrence, David Dunson
in-GPs respect the potentially complex boundary or interior conditions as well as the intrinsic geometry of the spaces.
no code implementations • 25 Oct 2016 • Alexander Grigorievskiy, Neil Lawrence, Simo Särkkä
We propose a parallelizable sparse inverse formulation Gaussian process (SpInGP) for temporal models.
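The structural fact such sparse inverse formulations exploit is that a Markovian temporal GP has a banded precision (inverse covariance) matrix even when the covariance itself is dense. A small numpy check, using a Brownian-motion covariance as a stand-in:

```python
import numpy as np

n = 6
t = np.arange(1, n + 1)
# Brownian-motion covariance at integer times: K[i, j] = min(t_i, t_j).
# Every entry is nonzero, so K is dense.
K = np.minimum.outer(t, t).astype(float)

P = np.linalg.inv(K)  # precision matrix

# Entries more than one step off the diagonal vanish (up to round-off):
# the Markov property makes the precision tridiagonal.
mask = np.abs(np.subtract.outer(t, t)) > 1
off_band = np.max(np.abs(P[mask]))
```

Banded precisions admit O(n) solves and log-determinants, which is what makes formulations like SpInGP scalable and parallelizable.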
no code implementations • 17 Oct 2016 • Mu Niu, Zhenwen Dai, Neil Lawrence, Kolja Becker
The spatio-temporal fields of protein concentration and mRNA expression are reconstructed without explicitly solving the partial differential equation.
no code implementations • 30 Jun 2016 • Fariba Yousefi, Zhenwen Dai, Carl Henrik Ek, Neil Lawrence
Unsupervised learning on imbalanced data is challenging because the model is often dominated by the majority category and ignores categories with few examples.
no code implementations • 19 Nov 2015 • Zhenwen Dai, Andreas Damianou, Javier González, Neil Lawrence
We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model.
no code implementations • 10 May 2015 • Zhenwen Dai, James Hensman, Neil Lawrence
The Gaussian process latent variable model (GP-LVM) is a popular approach to non-linear probabilistic dimensionality reduction.
no code implementations • 18 Oct 2014 • Zhenwen Dai, Andreas Damianou, James Hensman, Neil Lawrence
In this work, we present an extension of Gaussian process (GP) models with sophisticated parallelization and GPU acceleration.
no code implementations • 16 Aug 2011 • Miguel Lázaro-Gredilla, Steven Van Vaerenbergh, Neil Lawrence
In this work we introduce a mixture of GPs to address the data association problem, i.e., to label a group of observations according to the sources that generated them.
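A hedged sketch of the data-association idea, substituting a two-component mixture of linear regressions fitted by EM for the paper's mixture of GPs: responsibilities softly assign each observation to the source that best explains it, and the hard labels fall out at convergence:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two interleaved sources generate the observations.
x = rng.uniform(-1, 1, size=200)
z = rng.integers(0, 2, size=200)              # true (hidden) source labels
y = np.array([2.0, -2.0])[z] * x + 0.1 * rng.normal(size=200)

w = np.array([1.0, -1.0])   # initial per-source slopes
sigma2 = 1.0
for _ in range(30):
    # E-step: responsibility of each source for each point
    # (shared noise variance, so the normalizer cancels).
    resid = y[:, None] - x[:, None] * w[None, :]
    logp = -0.5 * resid ** 2 / sigma2
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # M-step: weighted least squares per source, then noise update.
    w = (r * x[:, None] * y[:, None]).sum(axis=0) / (r * x[:, None] ** 2).sum(axis=0)
    sigma2 = (r * resid ** 2).sum() / len(x)

labels = r.argmax(axis=1)
acc = max(np.mean(labels == z), np.mean(labels != z))  # accuracy up to label swap
```

Replacing each linear component with a GP gives the flexible, non-parametric version of the same association mechanism that the paper develops.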