no code implementations • 27 Jan 2023 • David van Son, Floran de Putter, Sebastian Vogel, Henk Corporaal
Bayesian Optimization Mixed-Precision Neural Architecture Search (BOMP-NAS) is an approach to quantization-aware neural architecture search (QA-NAS) that leverages both Bayesian optimization (BO) and mixed-precision quantization (MP) to efficiently search for compact, high-performance deep neural networks.
no code implementations • 3 Dec 2022 • Mayank Senapati, Manil Dev Gomony, Sherif Eissa, Charlotte Frenkel, Henk Corporaal
Neuromorphic computing using biologically inspired Spiking Neural Networks (SNNs) is a promising solution for meeting the Energy-Throughput (ET) efficiency demands of edge computing devices.
no code implementations • 22 Aug 2022 • Gagandeep Singh, Dionysios Diamantopoulos, Juan Gómez-Luna, Sander Stuijk, Henk Corporaal, Onur Mutlu
The key idea of LEAPER is to transfer an ML-based performance and resource usage model trained for a low-end edge environment to a new, high-end cloud environment to provide fast and accurate predictions for accelerator implementation.
no code implementations • 24 Jun 2022 • Floran de Putter, Henk Corporaal
To reduce the accuracy gap between binary and full-precision networks, many repair methods have been proposed in recent years; in this chapter we classify them and bring them together in a single overview.
1 code implementation • 15 May 2022 • Gagandeep Singh, Rakesh Nadig, Jisung Park, Rahul Bera, Nastaran Hajinazar, David Novo, Juan Gómez-Luna, Sander Stuijk, Henk Corporaal, Onur Mutlu
We introduce Sibyl, the first technique that uses reinforcement learning for data placement in hybrid storage systems.
1 code implementation • NeurIPS 2021 • Wei Sun, Aojun Zhou, Sander Stuijk, Rob Wijnhoven, Andrew Oakleigh Nelson, Hongsheng Li, Henk Corporaal
However, the existing N:M algorithms only address the challenge of how to train N:M sparse neural networks in a uniform fashion (i.e., every layer has the same N:M sparsity) and suffer a significant accuracy drop at high sparsity (i.e., sparsity > 80%).
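To make the uniform N:M scheme concrete: in 2:4 sparsity, every contiguous group of four weights keeps only its two largest-magnitude entries. The sketch below (a minimal NumPy illustration, not the paper's training algorithm, which learns the masks during training rather than pruning post hoc) shows how such a mask is applied:

```python
import numpy as np

def prune_n_m(weights, n=2, m=4):
    """Zero all but the n largest-magnitude weights in each
    contiguous group of m (uniform N:M sparsity; 2:4 = 50% sparse)."""
    w = weights.reshape(-1, m)
    # indices of the (m - n) smallest-magnitude entries per group
    drop = np.argsort(np.abs(w), axis=1)[:, : m - n]
    mask = np.ones_like(w, dtype=bool)
    np.put_along_axis(mask, drop, False, axis=1)
    return (w * mask).reshape(weights.shape)

w = np.array([0.9, -0.1, 0.4, 0.05, -0.7, 0.2, 0.01, 0.6])
print(prune_n_m(w))  # exactly two non-zeros survive in each group of four
```

The uniform case applies the same (n, m) to every layer; a non-uniform scheme would instead choose (n, m) per layer, which is what makes high overall sparsity harder to reach without accuracy loss.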
no code implementations • 24 Apr 2020 • Barry de Bruin, Zoran Zivkovic, Henk Corporaal
We demonstrate that 16-bit accumulators achieve a classification accuracy within 1% of the floating-point baselines on the CIFAR-10 and ILSVRC2012 image classification benchmarks.
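The risk with narrow accumulators is that a long dot product of int8 operands can exceed the int16 range. A minimal sketch of one common mitigation, saturating accumulation, is shown below (this is an illustrative assumption, not necessarily the accumulation strategy used in the paper):

```python
import numpy as np

def mac_int16(x, w):
    """Accumulate int8 * int8 products in a saturating 16-bit register.
    Each partial sum is clipped to the int16 range, mimicking hardware
    saturation; real designs also scale weights so clipping stays rare."""
    acc = 0
    for xi, wi in zip(x.astype(np.int32), w.astype(np.int32)):
        acc = int(np.clip(acc + xi * wi, -32768, 32767))  # saturate to int16
    return np.int16(acc)

x = np.array([120, -45, 7, 88], dtype=np.int8)
w = np.array([64, 3, -128, 55], dtype=np.int8)
print(mac_int16(x, w))  # -> 11489, no saturation triggered here
```

Saturation trades a bounded, graceful error for the catastrophic sign flips that silent wrap-around would cause, which is one reason reduced-precision accumulation can stay within 1% of the floating-point baseline.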
no code implementations • 18 Feb 2019 • Bojian Yin, Siebren Schaafsma, Henk Corporaal, H. Steven Scholte, Sander M. Bohte
While modern convolutional neural networks achieve outstanding accuracy on many image classification tasks, they are much more sensitive to image degradation than humans.