Search Results for author: Jinhyuk Park

Found 1 paper, 1 paper with code

Prune Your Model Before Distill It

1 code implementation30 Sep 2021 Jinhyuk Park, Albert No

Recent results suggest that a student-friendly teacher is more appropriate for distillation, since it provides more transferable knowledge.

Knowledge Distillation · Neural Network Compression
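
As the title suggests, the paper's idea is to prune the teacher before distilling into the student. Below is a minimal, hedged sketch of that workflow in PyTorch; it is not the authors' code. The pruning scheme (L1 unstructured via `torch.nn.utils.prune`), the `amount`, temperature `T`, and weighting `alpha` are illustrative assumptions, and the standard soft-target KD loss stands in for whatever objective the paper actually uses.

```python
# Sketch only: prune the teacher first, then distill into the student.
# Pruning method and hyperparameters are assumptions, not the paper's setup.
import torch
import torch.nn.functional as F
from torch.nn.utils import prune


def prune_teacher(teacher: torch.nn.Module, amount: float = 0.3) -> torch.nn.Module:
    """Apply L1 unstructured pruning to every Linear/Conv2d layer of the teacher."""
    for module in teacher.modules():
        if isinstance(module, (torch.nn.Linear, torch.nn.Conv2d)):
            prune.l1_unstructured(module, name="weight", amount=amount)
    return teacher


def distillation_loss(student_logits, teacher_logits, labels, T: float = 4.0, alpha: float = 0.5):
    """Standard KD objective: temperature-softened KL term plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In use, one would call `prune_teacher(teacher)` once before the distillation loop, then compute `distillation_loss` on each batch using the pruned teacher's logits (with the teacher in eval mode and gradients disabled).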
