1 code implementation • 27 Jul 2022 • Donglin Xie, Ruonan Yu, Gongfan Fang, Jie Song, Zunlei Feng, Xinchao Wang, Li Sun, Mingli Song
The goal of FedSA is to train a student model for a new task with the help of several decentralized teachers, whose pre-training tasks and data differ from one another and are agnostic (unknown) to the student.
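The paper's selective-aggregation scheme is not reproduced here, but the basic setup it builds on, distilling a student from several frozen, heterogeneous teachers, can be sketched in PyTorch. In the sketch below, `T` (temperature) and the simple averaging of teacher predictions are illustrative assumptions, not the paper's method:

```python
import torch
import torch.nn.functional as F

def multi_teacher_distill_loss(student_logits, teacher_logits_list, T=4.0):
    """Soft-label loss against the average of several frozen teachers.

    A generic multi-teacher distillation sketch, NOT FedSA's
    selective-aggregation scheme; `T` is a hypothetical temperature.
    """
    with torch.no_grad():
        # Average the teachers' softened class distributions.
        avg_teacher = torch.stack(
            [F.softmax(t / T, dim=-1) for t in teacher_logits_list]
        ).mean(dim=0)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence between student and averaged teacher distributions,
    # scaled by T^2 as is conventional in distillation.
    return F.kl_div(log_student, avg_teacher, reduction="batchmean") * T * T
```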
no code implementations • 31 Mar 2022 • Cheng Dai, Yingqiao Lin, Fan Li, Xiyao Li, Donglin Xie
In Domain Generalization (DG) tasks, models are trained using only data from the source domains and must generalize to an unseen target domain, so they suffer from the distribution-shift problem.
Ranked #9 on Domain Generalization on VLCS
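For context, the baseline that DG methods improve upon is plain empirical risk minimization pooled over the source domains, never touching the target. A minimal PyTorch sketch of that baseline (an illustrative setup, not this paper's method) follows:

```python
import torch

def train_dg_erm(model, source_loaders, optimizer, loss_fn, device="cpu"):
    """Baseline DG training: empirical risk minimization over the
    source domains only; the unseen target domain is never used.

    Illustrative baseline, not the paper's specific DG method.
    """
    model.train()
    for batches in zip(*source_loaders):  # one batch per source domain
        optimizer.zero_grad()
        loss = torch.tensor(0.0, device=device)
        for x, y in batches:
            x, y = x.to(device), y.to(device)
            loss = loss + loss_fn(model(x), y)  # sum losses across domains
        loss.backward()
        optimizer.step()
```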
2 code implementations • NeurIPS 2021 • Gongfan Fang, Yifan Bao, Jie Song, Xinchao Wang, Donglin Xie, Chengchao Shen, Mingli Song
Knowledge distillation (KD) aims to craft a compact student model that imitates the behavior of a pre-trained teacher in a target domain.
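The standard KD objective this definition refers to combines a hard-label cross-entropy with a softened teacher/student KL term. A minimal sketch is below; the temperature `T` and mixing weight `alpha` are hypothetical hyperparameters, and this is the classic objective rather than this paper's specific data-free approach:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic distillation objective: weighted sum of hard-label
    cross-entropy and softened teacher/student KL divergence.

    `T` and `alpha` are illustrative hyperparameters, not values
    from the paper.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * T * T  # T^2 scaling keeps gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```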