16 Sep 2021 • Minghao Gao, Hailun Zhang, Yige Yan
Knowledge distillation methods have proved promising for improving the performance of neural networks, and they require no additional computational expense at inference time.
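To illustrate why distillation adds no inference-time cost, here is a minimal sketch of the classic soft-target distillation loss (in the style of Hinton et al.): the teacher is used only during training to produce softened targets, while the deployed student is an ordinary network. The function names, temperature, and blending weight below are illustrative assumptions, not the specific method of this paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_label,
                      temperature=4.0, alpha=0.5):
    """Blend KL(teacher || student) on temperature-softened outputs
    with cross-entropy on the ground-truth label (illustrative sketch)."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL divergence between softened distributions, scaled by T^2 so its
    # gradient magnitude stays comparable to the hard-label term
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))
    soft_loss = (temperature ** 2) * kl
    hard_loss = -np.log(softmax(student_logits)[hard_label])
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

At test time only the student's forward pass runs, so the teacher and this training loss contribute nothing to inference cost.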