no code implementations • 30 Nov 2020 • Jeong-Hoe Ku, Jihun Oh, YoungYoon Lee, Gaurav Pooniwala, SangJeong Lee
This paper aims to provide a selective survey of the knowledge distillation (KD) framework, so that researchers and practitioners can take advantage of it when developing new optimized models in the deep neural network field.
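For context, the sketch below illustrates the standard teacher–student KD objective that such surveys typically cover: a temperature-softened KL term between teacher and student logits combined with the usual cross-entropy loss. The temperature `T` and weight `alpha` are illustrative hyperparameters only, not values taken from this paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: student matches the teacher's temperature-scaled distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients to be comparable across temperatures
    # Hard-target term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```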
no code implementations • 13 Aug 2020 • Jihun Oh, SangJeong Lee, Meejeong Park, Pooni Walagaurav, Kiseok Kwon
As a result, our proposed method achieved a top-1 accuracy of 69.78%–70.96% on MobileNets and showed robust performance across varying network models and tasks, which is competitive with channel-wise quantization results.
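As a rough sketch of the comparison being drawn, the snippet below contrasts layer-wise (one scale per tensor) and channel-wise (one scale per output channel) symmetric weight quantization. The function name, bit-width, and defaults are illustrative assumptions, not the paper's actual method.

```python
import torch

def fake_quantize_symmetric(w: torch.Tensor, num_bits: int = 8, per_channel: bool = True):
    # Symmetric uniform quantization of a conv/linear weight tensor.
    qmax = 2 ** (num_bits - 1) - 1
    if per_channel:
        # One scale per output channel (dim 0), i.e. channel-wise quantization.
        max_abs = w.abs().amax(dim=tuple(range(1, w.dim())), keepdim=True)
    else:
        # A single scale for the whole tensor, i.e. layer-wise quantization.
        max_abs = w.abs().max()
    scale = max_abs.clamp(min=1e-8) / qmax
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax)
    return q * scale  # de-quantized ("fake-quantized") weights
```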
no code implementations • 6 Aug 2020 • Abhinav Mehrotra, Łukasz Dudziak, Jinsu Yeo, Young-Yoon Lee, Ravichander Vipperla, Mohamed S. Abdelfattah, Sourav Bhattacharya, Samin Ishtiaq, Alberto Gil C. P. Ramos, SangJeong Lee, Daehyun Kim, Nicholas D. Lane
Increasing demand for on-device Automatic Speech Recognition (ASR) systems has resulted in renewed interest in developing automatic model compression techniques.
Automatic Speech Recognition (ASR)