no code implementations • 24 Mar 2024 • Libo Huang, Zhulin An, Yan Zeng, Chuanguang Yang, Xinqiang Yu, Yongjun Xu
Exemplar-Free Class Incremental Learning (efCIL) aims to continuously incorporate knowledge from new classes while retaining previously learned information, without storing any old-class exemplars (i.e., samples).
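Since no old-class samples may be stored, a common exemplar-free strategy is to regularize the updated network against a frozen copy of its previous self (LwF-style logit distillation). The sketch below illustrates that general setting only, not this paper's specific method; the model, loader, and hyperparameter names are illustrative assumptions.

```python
import copy
import torch
import torch.nn.functional as F

def train_incremental_step(model, new_loader, optimizer, temperature=2.0, alpha=0.5):
    """One exemplar-free incremental step (illustrative sketch): learn new
    classes from new-task data only, while distilling old-class logits from
    a frozen snapshot of the previous model."""
    old_model = copy.deepcopy(model).eval()        # snapshot before the new task
    for param in old_model.parameters():
        param.requires_grad_(False)

    for images, labels in new_loader:              # new-class data only
        logits = model(images)
        ce = F.cross_entropy(logits, labels)       # learn the new classes

        with torch.no_grad():
            old_logits = old_model(images)
        n_old = old_logits.size(1)
        # Match the old model's soft predictions on its own classes.
        kd = F.kl_div(
            F.log_softmax(logits[:, :n_old] / temperature, dim=1),
            F.softmax(old_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2

        loss = (1 - alpha) * ce + alpha * kd
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```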
1 code implementation • 24 Jul 2023 • Chuanguang Yang, Zhulin An, Libo Huang, Junyu Bi, Xinqiang Yu, Han Yang, Boyu Diao, Yongjun Xu
The unified method is applied to distill several student models trained on CC3M+12M.
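For distilling image-text models of this kind, one common ingredient is matching the teacher's and student's image-text similarity distributions with a KL term. The snippet below is a minimal sketch of that idea under assumed inputs (precomputed, batch-aligned image and text embeddings); it is not the paper's unified objective.

```python
import torch
import torch.nn.functional as F

def contrastive_kd_loss(s_img, s_txt, t_img, t_txt, temperature=0.07):
    """Distill a CLIP-style student by matching the teacher's image-text
    similarity distributions (one illustrative variant, not the paper's)."""
    # L2-normalize embeddings so dot products are cosine similarities.
    s_img, s_txt = F.normalize(s_img, dim=-1), F.normalize(s_txt, dim=-1)
    t_img, t_txt = F.normalize(t_img, dim=-1), F.normalize(t_txt, dim=-1)

    s_logits = s_img @ s_txt.t() / temperature       # student similarity matrix
    with torch.no_grad():
        t_logits = t_img @ t_txt.t() / temperature   # teacher similarity matrix

    # KL between teacher and student distributions, in both retrieval
    # directions (image-to-text and text-to-image).
    kd_i2t = F.kl_div(F.log_softmax(s_logits, dim=1),
                      F.softmax(t_logits, dim=1), reduction="batchmean")
    kd_t2i = F.kl_div(F.log_softmax(s_logits.t(), dim=1),
                      F.softmax(t_logits.t(), dim=1), reduction="batchmean")
    return (kd_i2t + kd_t2i) / 2
```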
no code implementations • 19 Jun 2023 • Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
Knowledge Distillation (KD) aims to optimize a lightweight network from the perspective of over-parameterized training.
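As background for the KD setting described here, the standard soft-target distillation loss combines cross-entropy on ground-truth labels with a temperature-scaled KL term toward the teacher's softened predictions. The sketch below shows that classic formulation, not this paper's specific contribution; the temperature and mixing weight are illustrative defaults.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic soft-target KD loss: hard-label cross-entropy plus
    temperature-scaled KL to the teacher's softened output."""
    ce = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * T * T    # T^2 keeps gradient magnitudes comparable across temperatures
    return (1 - alpha) * ce + alpha * soft
```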