1 code implementation • 2 Feb 2024 • Michael Livanos, Ian Davidson, Stephen Wong
Knowledge distillation is a simple but powerful way to transfer knowledge from a teacher model to a student model.
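A minimal sketch of the standard distillation objective (not this paper's specific method): the student is trained on a weighted sum of a soft-target term (KL divergence between temperature-softened teacher and student distributions) and the usual hard-label cross-entropy. The function names, temperature `T`, and weight `alpha` here are illustrative defaults, not values from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style distillation loss (sketch).

    alpha weights the soft-target term against the hard-label term;
    the T*T factor keeps gradient magnitudes comparable across temperatures.
    """
    # Soft targets: KL(teacher || student) on temperature-softened distributions.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean() * T * T
    # Hard targets: cross-entropy with the ground-truth labels at T=1.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft + (1 - alpha) * hard
```

When student and teacher logits agree, the soft term vanishes and the loss reduces to the scaled hard-label cross-entropy.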
no code implementations • 10 Dec 2021 • Jiahao Huang, Weiping Ding, Jun Lv, Jingwen Yang, Hao Dong, Javier Del Ser, Jun Xia, Tiaojuan Ren, Stephen Wong, Guang Yang
The dual-discriminator design aims to improve edge information in MRI reconstruction.
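One common way to realize such a design (a sketch under assumptions, not the paper's implementation) is to run a second discriminator on edge maps of the reconstruction, so the generator is penalized both for unrealistic images and for unrealistic edges. The Sobel edge extractor, the `lam` weight, and the non-saturating loss form below are all illustrative choices.

```python
import numpy as np

def sobel_edges(img):
    """Edge-magnitude map of a 2-D image via Sobel filters (borders left at zero)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)

def generator_loss(image_disc_score, edge_disc_score, lam=0.1):
    """Non-saturating generator loss combining image and edge discriminators (sketch).

    image_disc_score / edge_disc_score are the discriminators' probabilities
    that the reconstruction (and its edge map) are real; lam trades off the
    edge term against the image term.
    """
    return -np.log(image_disc_score + 1e-12) - lam * np.log(edge_disc_score + 1e-12)
```

A sharp intensity step produces a strong response in the edge map, which is what gives the edge discriminator a signal that a plain image discriminator can dilute.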