1 code implementation • 11 Mar 2024 • Jae-Jun Lee, Sung Whan Yoon
Meta-learning, which seeks an effective model initialization, has emerged as a promising approach to handling unseen tasks.
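The core idea of initialization-based meta-learning can be sketched as follows. This is a minimal MAML-style inner loop for illustration only, not the paper's actual algorithm; the function name and interface are hypothetical.

```python
import numpy as np

def maml_inner_adapt(theta, task_grad, lr=0.1, steps=10):
    """Hypothetical sketch: start from a shared (meta-learned)
    initialization theta and take a few task-specific gradient
    steps to adapt it to an unseen task."""
    adapted = theta.copy()
    for _ in range(steps):
        # one gradient step on the new task's loss
        adapted -= lr * task_grad(adapted)
    return adapted
```

A good initialization is one from which this short adaptation already yields low loss on the new task.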
1 code implementation • 22 May 2023 • Sang-Yeong Jo, Sung Whan Yoon
Handling out-of-distribution samples is a long-standing challenge for deep visual models.
Ranked #18 on Domain Generalization on TerraIncognita
no code implementations • 14 Feb 2022 • Jun Seo, Young-Hyun Park, Sung Whan Yoon, Jaekyun Moon
The task-conditioned feature transformation allows effective utilization of semantic information in novel classes to generate tight segmentation masks.
1 code implementation • ICML 2020 • Sung Whan Yoon, Do-Yeon Kim, Jun Seo, Jaekyun Moon
The base and novel classifiers quickly adapt to a given task by utilizing the task-adaptive representation (TAR).
no code implementations • 18 Mar 2020 • Jun Seo, Sung Whan Yoon, Jaekyun Moon
Our method employs explicit task-conditioning in which unlabeled sample clustering for the current task takes place in a new projection space different from the embedding feature space.
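The clustering step described above can be sketched as a soft k-means in a task-conditioned projection space. This is an illustrative reconstruction from the abstract sentence alone; the function name and the soft-assignment details are assumptions, not the paper's code.

```python
import numpy as np

def cluster_in_projection_space(embeddings, prototypes, proj, n_iters=5):
    """Hypothetical sketch: map unlabeled embeddings and class
    prototypes through a task-conditioned linear projection, then
    soft-assign samples to prototypes and refine the prototypes
    (soft k-means style) in that projection space."""
    z = embeddings @ proj            # (n_unlabeled, d_proj)
    c = prototypes @ proj            # (n_classes, d_proj)
    for _ in range(n_iters):
        # soft assignment by negative squared Euclidean distance
        d = ((z[:, None, :] - c[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d)
        w /= w.sum(axis=1, keepdims=True)
        # prototype refinement using the soft assignments
        c = (w.T @ z) / w.sum(axis=0)[:, None]
    return w, c
```

The key point is that clustering happens in the new projection space, not in the raw embedding feature space.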
1 code implementation • 16 May 2019 • Sung Whan Yoon, Jun Seo, Jaekyun Moon
The training loss is obtained based on a distance metric between the query and the reference vectors in the projection space.
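A distance-based training loss of the kind described above can be sketched as follows: project the query and the per-class reference vectors, then apply softmax cross-entropy over negative squared distances. The function name and the choice of squared Euclidean distance are assumptions for illustration.

```python
import numpy as np

def distance_loss(query, references, label, proj):
    """Hypothetical sketch: compute a cross-entropy loss over
    negative squared Euclidean distances between the projected
    query and projected per-class reference vectors."""
    q = query @ proj                 # (d_proj,)
    refs = references @ proj         # (n_classes, d_proj)
    neg_sq_dist = -np.sum((refs - q) ** 2, axis=1)
    # numerically stable softmax over classes
    logits = neg_sq_dist - neg_sq_dist.max()
    probs = np.exp(logits) / np.exp(logits).sum()
    # cross-entropy against the query's true class
    return -np.log(probs[label])
```

Minimizing this loss pulls the query toward its class reference and pushes it away from the others, all measured in the projection space.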
no code implementations • 4 Jun 2018 • Sung Whan Yoon, Jun Seo, Jaekyun Moon
We propose a meta-learning algorithm utilizing a linear transformer that carries out null-space projection of neural network outputs.
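Null-space projection itself can be sketched concretely: given a matrix whose rows span the directions to be removed, build the projector onto its null space via SVD. This is a generic linear-algebra illustration of the operation named above, not the paper's learned transformer; the function name is hypothetical.

```python
import numpy as np

def null_space_projection(M):
    """Hypothetical sketch: build the projection matrix onto the
    null space of M's rows, so M @ P maps any input to ~0, via SVD."""
    _, s, vt = np.linalg.svd(M)
    rank = np.sum(s > 1e-10)
    null_basis = vt[rank:]            # orthonormal basis of the null space
    return null_basis.T @ null_basis  # projector onto the null space
```

Applying such a projector to network outputs zeroes out their components along the rows of `M` while leaving the orthogonal components intact.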