no code implementations • ICCV 2023 • Vadim Sushko, Ruyu Wang, Juergen Gall
The task of few-shot GAN adaptation aims to adapt a pre-trained GAN model to a new dataset containing only a few training images.
no code implementations • 2 Dec 2022 • Edgar Schönfeld, Julio Borges, Vadim Sushko, Bernt Schiele, Anna Khoreva
Prior work has extensively studied the latent space structure of GANs for unconditional image synthesis, enabling global editing of generated images by the unsupervised discovery of interpretable latent directions.
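As a hedged illustration of such unsupervised discovery (a generic GANSpace-style sketch, not the method of the paper above): one can sample many latent codes, run PCA on the resulting early-layer activations, and pull the dominant component back into latent space to obtain an edit direction. All names and the toy "generator" below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generator" first layer: a linear map whose output variance is
# concentrated along a few latent coordinates (columns scaled 5, 3, 0.1, ...).
# This stands in for a real GAN's early feature layer (illustrative only).
W = rng.normal(size=(64, 16)) * np.array([5.0, 3.0] + [0.1] * 14)

z = rng.normal(size=(1000, 16))     # sampled latent codes
acts = z @ W.T                      # corresponding activations

# PCA on the activations: the top principal component captures the
# dominant mode of variation in the generator's features.
acts_centered = acts - acts.mean(axis=0)
_, _, Vt = np.linalg.svd(acts_centered, full_matrices=False)
v = Vt[0]                           # dominant activation-space direction

# Pull the direction back into latent space and normalize it.
d = np.linalg.lstsq(W, v, rcond=None)[0]
d /= np.linalg.norm(d)

# Global edit: shift any latent code along the discovered direction.
z_edit = z[0] + 3.0 * d
```

In this toy setup the recovered direction aligns with the first latent coordinate, the one the "generator" amplifies most, which is the behavior such discovery methods rely on.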
1 code implementation • 15 Sep 2022 • Vadim Sushko, Dan Zhang, Juergen Gall, Anna Khoreva
To this end, inspired by recent architectural developments in single-image GANs, we introduce the OSMIS model, which enables the synthesis of segmentation masks precisely aligned with the generated images in the one-shot regime.
no code implementations • 14 Jun 2021 • Ekaterina Borodich, Aleksandr Beznosikov, Abdurakhmon Sadiev, Vadim Sushko, Nikolay Savelyev, Martin Takáč, Alexander Gasnikov
Personalized Federated Learning (PFL) has witnessed remarkable advancements, enabling the development of innovative machine learning applications that preserve the privacy of training data.
1 code implementation • 12 May 2021 • Vadim Sushko, Juergen Gall, Anna Khoreva
Training GANs in low-data regimes remains a challenge, as overfitting often leads to memorization or training divergence.
1 code implementation • 24 Mar 2021 • Vadim Sushko, Dan Zhang, Juergen Gall, Anna Khoreva
In this work, we introduce SIV-GAN, an unconditional generative model that can generate new scene compositions from a single training image or a single video clip.
1 code implementation • ICLR 2021 • Vadim Sushko, Edgar Schönfeld, Dan Zhang, Juergen Gall, Bernt Schiele, Anna Khoreva
By providing stronger supervision to both the discriminator and the generator through spatially and semantically aware discriminator feedback, we synthesize images of higher fidelity with better alignment to their input label maps, making the perceptual loss superfluous.
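Such spatially and semantically aware feedback can be pictured as a segmentation-style discriminator that classifies every pixel into one of the N semantic classes or an extra "fake" class. The numpy sketch below is a hedged illustration of that idea, not the authors' implementation; all sizes and names are assumptions.

```python
import numpy as np

def per_pixel_ce(logits, labels):
    """Mean cross-entropy over all pixels.
    logits: (H, W, C) raw class scores; labels: (H, W) integer class ids."""
    z = logits - logits.max(axis=-1, keepdims=True)          # stable softmax
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    h, w = labels.shape
    rows = np.arange(h)[:, None]
    cols = np.arange(w)[None, :]
    return -log_probs[rows, cols, labels].mean()

N = 3                       # number of semantic classes (illustrative)
FAKE = N                    # extra class index reserved for generated pixels
H, W = 4, 4
rng = np.random.default_rng(0)

label_map = rng.integers(0, N, size=(H, W))                  # input semantics
d_real = rng.normal(size=(H, W, N + 1))                      # D's output on a real image
d_fake = rng.normal(size=(H, W, N + 1))                      # D's output on a generated image

# Discriminator objective: label real pixels with their semantic class
# and generated pixels with the extra "fake" class.
loss_d = per_pixel_ce(d_real, label_map) + per_pixel_ce(d_fake, np.full((H, W), FAKE))

# Generator objective: make D assign generated pixels their semantic class,
# so the feedback is localized both spatially and per class.
loss_g = per_pixel_ce(d_fake, label_map)
```

Because the loss is computed per pixel against the label map, the generator receives a separate, class-specific signal at every location, which is the sense in which the feedback is spatially and semantically aware.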