no code implementations • 14 Feb 2022 • Jun Seo, Young-Hyun Park, Sung Whan Yoon, Jaekyun Moon
The task-conditioned feature transformation allows an effective utilization of the semantic information in novel classes to generate tight segmentation masks.
no code implementations • NeurIPS 2021 • YoungHyun Park, Dong-Jun Han, Do-Yeon Kim, Jun Seo, Jaekyun Moon
A central issue that may limit widespread adoption of FL is the significant communication resources required to exchange updated model parameters between the server and individual clients over many communication rounds.
no code implementations • 22 Oct 2020 • Jun Seo, Young-Hyun Park, Sung-Whan Yoon, Jaekyun Moon
Few-shot learning allows machines to classify novel classes using only a few labeled samples.
1 code implementation • ICML 2020 • Sung Whan Yoon, Do-Yeon Kim, Jun Seo, Jaekyun Moon
The base and novel classifiers quickly adapt to a given task by utilizing the task-adaptive representation (TAR).
no code implementations • 18 Mar 2020 • Jun Seo, Sung Whan Yoon, Jaekyun Moon
Our method employs explicit task-conditioning in which unlabeled sample clustering for the current task takes place in a new projection space different from the embedding feature space.
no code implementations • 18 Mar 2020 • Young-Hyun Park, Jun Seo, Jaekyun Moon
Since there is no existing dataset for few-shot semantic edge detection, we construct two new datasets, FSE-1000 and SBD-$5^i$, and evaluate the performance of the proposed CAFENet on them.
no code implementations • 25 Sep 2019 • Sung Whan Yoon, Jun Seo, Jaekyun Moon
Our method employs explicit task-conditioning in which unlabeled sample clustering for the current task takes place in a new projection space different from the embedding feature space.
1 code implementation • 16 May 2019 • Sung Whan Yoon, Jun Seo, Jaekyun Moon
The training loss is obtained based on a distance metric between the query and the reference vectors in the projection space.
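As a rough illustration of this idea, the snippet below sketches a classification loss computed from squared Euclidean distances between a projected query embedding and projected per-class reference vectors. The function name, shapes, and the projection matrix `M` are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def projection_loss(query, references, M, label):
    """Cross-entropy over a softmax of negative squared Euclidean
    distances, with query and references mapped into a projection
    space by M (a hypothetical sketch of metric-based episode loss).

    query: (dim,) embedding of one query sample
    references: (num_classes, dim) per-class reference vectors
    M: (proj_dim, dim) projection matrix
    label: int, ground-truth class of the query
    """
    q = M @ query                    # project the query embedding
    refs = references @ M.T          # project each class reference
    logits = -np.sum((refs - q) ** 2, axis=1)   # nearer class => larger logit
    logits = logits - logits.max()              # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[label]
```

With `M` set to the identity this reduces to a plain nearest-reference softmax; the projection matters when `M` is chosen per task, as in the paper.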
no code implementations • 4 Jun 2018 • Sung Whan Yoon, Jun Seo, Jaekyun Moon
We propose a meta-learning algorithm utilizing a linear transformer that carries out null-space projection of neural network outputs.
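To make the null-space idea concrete, the sketch below computes, via SVD, a basis for the null space of a matrix of per-class error vectors; projecting onto those directions zeroes out the errors. This is a minimal linear-algebra illustration under assumed shapes, not the algorithm's actual construction.

```python
import numpy as np

def nullspace_projection(errors, tol=1e-10):
    """Return rows spanning the null space of `errors`, i.e. directions
    d with errors @ d ~= 0 (computed from the SVD). Projecting network
    outputs onto these directions suppresses the error components.

    errors: (num_classes, dim) matrix of per-class error vectors
    returns: (dim - rank, dim) orthonormal null-space basis
    """
    _, s, vt = np.linalg.svd(errors)
    rank = int(np.sum(s > tol))
    return vt[rank:]

# Illustrative usage: two error vectors in 3-D leave a 1-D null space.
errors = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
M = nullspace_projection(errors)   # spans the direction orthogonal to both
```

Applying `M` to any vector then yields coordinates in which both error vectors vanish, which is the sense in which the projection "nulls" them.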