
Adaptive Precision Training: Quantify Back Propagation in Neural Networks with Fixed-point Numbers

Recently emerged quantization techniques have been applied to the inference of deep neural networks for fast and efficient execution. However, directly applying quantization during training can cause significant accuracy loss, which remains an open challenge.
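As background for the kind of quantization the abstract refers to, the sketch below shows plain fixed-point quantization of a tensor: values are scaled by a power of two, rounded, clamped to a signed integer range, and rescaled. This is a generic illustration under assumed parameters (`frac_bits`, `total_bits`), not the paper's adaptive-precision method.

```python
import numpy as np

def quantize_fixed_point(x, frac_bits=8, total_bits=16):
    # Generic fixed-point quantization sketch (hypothetical helper,
    # not the paper's algorithm): scale by 2^frac_bits, round to the
    # nearest integer, clamp to the signed representable range, and
    # rescale back to real values.
    scale = 2 ** frac_bits
    qmin = -(2 ** (total_bits - 1))
    qmax = 2 ** (total_bits - 1) - 1
    q = np.clip(np.round(x * scale), qmin, qmax)
    return q / scale

x = np.array([0.1234567, -1.5, 3.0])
xq = quantize_fixed_point(x)  # each value snapped to a multiple of 1/256
```

The rounding error introduced by such a scheme is what makes naive quantized training fragile: gradients are often too small to be represented at a fixed precision, which motivates adapting the precision during training.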
