Search Results for author: Lars N. Andersen

Found 1 paper, 1 paper with code

Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation

1 code implementation · NeurIPS 2021 · Kenneth Borup, Lars N. Andersen

Knowledge distillation is classically a procedure in which a neural network is trained on the outputs of another network, alongside the original targets, in order to transfer knowledge between architectures.
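The objective described above is commonly implemented as a weighted sum of a soft teacher-matching term and a hard ground-truth term. Below is a minimal PyTorch sketch of such a combined loss; it illustrates the standard Hinton-style formulation, not the paper's exact method, and the function name `distillation_loss` and the hyperparameters `alpha` and `temperature` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      alpha=0.5, temperature=2.0):
    """Weighted sum of a teacher-matching term and a ground-truth term.

    `alpha` and `temperature` are illustrative hyperparameters, not
    values taken from the paper.
    """
    # Soft term: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradients keep a comparable magnitude.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard term: standard cross-entropy against the original labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```

Setting `alpha=1.0` recovers pure (self-)distillation with no ground-truth signal, which is the regime whose regularization effect the paper argues is dampened once the original targets are mixed back in.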

Self-Knowledge Distillation
