1 code implementation • 25 Jul 2019 • Karol Kurach, Anton Raichuk, Piotr Stańczyk, Michał Zając, Olivier Bachem, Lasse Espeholt, Carlos Riquelme, Damien Vincent, Marcin Michalski, Olivier Bousquet, Sylvain Gelly
Recent progress in the field of reinforcement learning has been accelerated by virtual learning environments such as video games, where novel algorithms and ideas can be quickly tested in a safe and reproducible manner.
no code implementations • ICLR 2019 • Sjoerd van Steenkiste, Karol Kurach, Sylvain Gelly
In this work we propose to structure the generator of a GAN to consider objects and their relations explicitly, and generate images by means of composition.
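The composition step can be pictured as alpha-compositing K generated object layers into one image. The RGBA layer format and the `compose_layers` helper below are illustrative assumptions, a minimal sketch rather than the paper's exact architecture:

```python
import numpy as np

def compose_layers(layers):
    """Alpha-composite K object layers (back to front) into one image.

    layers: array of shape (K, H, W, 4) with RGB values in [0, 1] and an
    alpha channel. In a compositional GAN these layers would come from K
    object generators; here they are simply given.
    """
    canvas = np.zeros(layers.shape[1:3] + (3,))
    for layer in layers:  # back-to-front "over" compositing
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        canvas = alpha * rgb + (1.0 - alpha) * canvas
    return canvas
```

An opaque foreground layer fully replaces whatever was composited beneath it, while partially transparent layers blend with the background.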
no code implementations • ICLR Workshop DeepGenStruct 2019 • Thomas Unterthiner, Sjoerd van Steenkiste, Karol Kurach, Raphaël Marinier, Marcin Michalski, Sylvain Gelly
While recent generative models of video have had some success, current progress is hampered by the lack of qualitative metrics that consider visual quality, temporal coherence, and diversity of samples.
3 code implementations • 3 Dec 2018 • Thomas Unterthiner, Sjoerd van Steenkiste, Karol Kurach, Raphaël Marinier, Marcin Michalski, Sylvain Gelly
To this end we propose Fréchet Video Distance (FVD), a new metric for generative models of video, and StarCraft 2 Videos (SCV), a benchmark of gameplay from custom StarCraft 2 scenarios that challenge the current capabilities of generative models of video.
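Like FID for images, a Fréchet distance between feature distributions can be computed by fitting a Gaussian to features of real and generated samples. The sketch below assumes generic feature arrays from some pretrained network; the published FVD metric specifies its own video feature extractor:

```python
import numpy as np
from scipy import linalg

def frechet_distance(feats_real, feats_gen):
    """Fréchet distance between Gaussians fit to two feature sets.

    feats_*: arrays of shape (n_samples, feat_dim), e.g. activations of a
    pretrained network (an assumption here; FVD as published prescribes a
    specific video feature extractor).
    """
    mu1, mu2 = feats_real.mean(axis=0), feats_gen.mean(axis=0)
    sigma1 = np.cov(feats_real, rowvar=False)
    sigma2 = np.cov(feats_gen, rowvar=False)
    diff = mu1 - mu2
    # Principal square root of the product of the two covariances.
    covmean = linalg.sqrtm(sigma1 @ sigma2).real
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)
```

Shifting a sample by a constant changes only the mean term: the covariances cancel and the distance reduces to the squared mean difference.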
4 code implementations • 19 Nov 2018 • Maciej Zamorski, Maciej Zięba, Piotr Klukowski, Rafał Nowak, Karol Kurach, Wojciech Stokowiec, Tomasz Trzciński
Deep generative architectures provide a way to model not only images but also complex, 3-dimensional objects, such as point clouds.
no code implementations • ICLR 2019 • Sjoerd van Steenkiste, Karol Kurach, Jürgen Schmidhuber, Sylvain Gelly
We present a minimal modification of a standard generator to incorporate an object-centric inductive bias and find that it reliably learns to generate images as compositions of objects.
5 code implementations • ICLR 2019 • Karol Kurach, Mario Lucic, Xiaohua Zhai, Marcin Michalski, Sylvain Gelly
Generative adversarial networks (GANs) are a class of deep generative models which aim to learn a target distribution in an unsupervised fashion.
no code implementations • 29 Mar 2018 • Sylvain Gelly, Karol Kurach, Marcin Michalski, Xiaohua Zhai
We propose a new learning paradigm called Deep Memory.
9 code implementations • NeurIPS 2018 • Mario Lucic, Karol Kurach, Marcin Michalski, Sylvain Gelly, Olivier Bousquet
Generative adversarial networks (GAN) are a powerful subclass of generative models.
no code implementations • 10 Jun 2017 • Olivier Bousquet, Sylvain Gelly, Karol Kurach, Marc Schoenauer, Michele Sebag, Olivier Teytaud, Damien Vincent
This paper addresses one-shot learning of deep neural networks in a highly parallel setting, targeting the algorithm calibration problem: selecting the best neural architecture and hyper-parameter values for the dataset at hand.
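A minimal baseline for calibration in a highly parallel setting is random search with all trials evaluated concurrently. The search space and the `evaluate` stand-in below are hypothetical illustrations, not the paper's method:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def sample_config(rng):
    """Draw one hyper-parameter configuration (an illustrative space)."""
    return {
        "lr": 10 ** rng.uniform(-5, -1),
        "layers": rng.randint(1, 8),
        "units": rng.choice([64, 128, 256, 512]),
    }

def evaluate(config):
    """Stand-in for training a network and returning validation loss."""
    return (config["lr"] - 1e-3) ** 2 + 0.01 * config["layers"]

def parallel_calibration(n_trials=64, workers=8, seed=0):
    """Evaluate n_trials random configurations in parallel, keep the best."""
    rng = random.Random(seed)
    configs = [sample_config(rng) for _ in range(n_trials)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        losses = list(pool.map(evaluate, configs))
    best = min(range(n_trials), key=losses.__getitem__)
    return configs[best], losses[best]
```

Because trials are independent, wall-clock time shrinks roughly with the number of workers, which is the appeal of the parallel setting.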
no code implementations • 10 Jun 2017 • Olivier Bousquet, Sylvain Gelly, Karol Kurach, Olivier Teytaud, Damien Vincent
The selection of hyper-parameters is critical in Deep Learning.
no code implementations • 23 May 2017 • Karol Kurach, Sylvain Gelly, Michal Jastrzebski, Philip Haeusser, Olivier Teytaud, Damien Vincent, Olivier Bousquet
Generic text embeddings are successfully used in a variety of tasks.
no code implementations • 15 Jun 2016 • Anjuli Kannan, Karol Kurach, Sujith Ravi, Tobias Kaufmann, Andrew Tomkins, Balint Miklos, Greg Corrado, Laszlo Lukacs, Marina Ganea, Peter Young, Vivek Ramavajjala
In this paper we propose and investigate a novel end-to-end method for automatically generating short email responses, called Smart Reply.
no code implementations • 9 Feb 2016 • Marcin Andrychowicz, Karol Kurach
In this paper, we propose and investigate a novel memory architecture for neural networks called Hierarchical Attentive Memory (HAM).
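The key idea in HAM is that memory cells sit at the leaves of a binary tree, so a hard (sampled) access touches only O(log n) nodes. The sketch below shows a soft top-down read given per-node branching probabilities; in HAM those probabilities come from a learned attention network, so this is a simplification of the published mechanism:

```python
import numpy as np

def soft_tree_read(leaves, right_probs):
    """Soft top-down read over a complete binary tree of memory cells.

    leaves: (n, dim) memory cells, n a power of two.
    right_probs: (n - 1,) probability of branching right at each internal
    node, in heap order (root at index 0, children of i at 2i+1 and 2i+2).
    Returns the branching-probability-weighted mixture of leaf cells; a
    hard traversal that samples a branch at every node would visit only
    O(log n) nodes per access.
    """
    n = leaves.shape[0]
    probs = np.zeros(2 * n - 1)
    probs[0] = 1.0
    for node in range(n - 1):  # push mass from each internal node down
        p_right = right_probs[node]
        probs[2 * node + 1] += probs[node] * (1.0 - p_right)
        probs[2 * node + 2] += probs[node] * p_right
    leaf_probs = probs[n - 1:]  # heap indices n-1 .. 2n-2 are the leaves
    return leaf_probs @ leaves
```

With deterministic branching probabilities (all 0 or 1 along the taken path) the soft read collapses to selecting a single leaf, matching the hard O(log n) traversal.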
4 code implementations • 21 Nov 2015 • Arvind Neelakantan, Luke Vilnis, Quoc V. Le, Ilya Sutskever, Lukasz Kaiser, Karol Kurach, James Martens
The success of deep neural networks is partially attributed to architectural innovations such as convolutional and long short-term memory networks.
no code implementations • 19 Nov 2015 • Karol Kurach, Marcin Andrychowicz, Ilya Sutskever
In this paper, we propose and investigate a new neural network architecture called Neural Random Access Machine.
1 code implementation • NeurIPS 2014 • Wojciech Zaremba, Karol Kurach, Rob Fergus
In this paper we explore how machine learning techniques can be applied to the discovery of efficient mathematical identities.