no code implementations • 27 Nov 2018 • Natan Liss, Chaim Baskin, Avi Mendelson, Alex M. Bronstein, Raja Giryes
While most works use uniform quantizers for both parameters and activations, uniform quantization is not always optimal, and non-uniform quantizers need to be considered.
1 code implementation • ICLR 2019 • Chaim Baskin, Natan Liss, Yoav Chai, Evgenii Zheltonozhskii, Eli Schwartz, Raja Giryes, Avi Mendelson, Alexander M. Bronstein
Convolutional Neural Networks (CNNs) are very popular in many fields, including computer vision, speech recognition, and natural language processing.
no code implementations • 29 Apr 2018 • Chaim Baskin, Eli Schwartz, Evgenii Zheltonozhskii, Natan Liss, Raja Giryes, Alex M. Bronstein, Avi Mendelson
We present a novel method for neural network quantization that emulates a non-uniform $k$-quantile quantizer, which adapts to the distribution of the quantized parameters.
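The idea of a $k$-quantile quantizer — placing quantization levels according to the empirical distribution so that each bin holds roughly the same number of samples — can be sketched as follows. This is a minimal illustration of the general technique, not the paper's actual implementation; the function name and level-selection rule (odd quantiles as bin centers) are assumptions for the example.

```python
import statistics

def kquantile_quantize(values, k):
    """Quantize to k levels placed at empirical quantiles, so roughly
    equal numbers of samples fall into each quantization bin."""
    # Cut the distribution into 2k slices; the odd (1/2k, 3/2k, ...)
    # quantiles act as the k bin "centers" of a non-uniform codebook.
    levels = statistics.quantiles(values, n=2 * k, method="inclusive")[::2]
    # Map each value to the nearest level.
    return [min(levels, key=lambda c: abs(c - v)) for v in values]

# Heavy-tailed weights get more levels near zero, where the mass is.
weights = [-2.0, -0.5, -0.1, 0.0, 0.05, 0.1, 0.5, 2.5]
quantized = kquantile_quantize(weights, k=4)
assert len(set(quantized)) <= 4
```

Unlike a uniform quantizer, which spaces levels evenly over the value range, the levels here follow the data: dense where parameters cluster, sparse in the tails.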
no code implementations • 31 Jul 2017 • Chaim Baskin, Natan Liss, Evgenii Zheltonozhskii, Alex M. Bronshtein, Avi Mendelson
Using quantized values enables running NNs on FPGAs, since FPGAs are well suited to these primitives; e.g., they provide efficient support for bitwise operations and can work with arbitrary-precision representations of numbers.
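A concrete instance of the bitwise primitives mentioned above is the XNOR-popcount trick used in binarized networks: with weights and activations constrained to {-1, +1} and packed as bits (1 for +1, 0 for -1), a dot product becomes a single XNOR followed by a bit count — exactly the kind of operation FPGAs excel at. This is a hedged sketch of the general technique; the function and packing convention are assumptions for illustration.

```python
def binary_dot(a_bits: int, b_bits: int, n: int) -> int:
    """Dot product of two n-element {-1,+1} vectors packed into ints
    (bit i holds element i; 1 encodes +1, 0 encodes -1)."""
    # XNOR yields a 1 wherever the two elements have the same sign.
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)
    matches = bin(xnor).count("1")  # popcount: number of agreements
    # Each agreement contributes +1, each disagreement -1.
    return 2 * matches - n

# a = [+1, +1, -1, +1] (bits 0..3) -> 0b1011
# b = [+1, -1, +1, +1] (bits 0..3) -> 0b1101
assert binary_dot(0b1011, 0b1101, 4) == 0  # 2 agreements, 2 disagreements
```

On hardware, the XNOR and popcount map directly to LUTs, which is why binarized and low-bit networks are a natural fit for FPGA deployment.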