no code implementations • 29 Apr 2022 • Thorir Mar Ingolfsson, Mark Vero, Xiaying Wang, Lorenzo Lamberti, Luca Benini, Matteo Spallanzani
The computational demands of neural architecture search (NAS) algorithms are usually directly proportional to the size of their target search spaces.
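The proportionality claimed above follows from the combinatorics of layer-wise search spaces: the number of candidate architectures is the product of the per-layer choices, so cost grows multiplicatively with depth. A minimal illustrative sketch (the function name and numbers are hypothetical, not taken from the paper):

```python
from math import prod

def search_space_size(choices_per_layer):
    """Number of distinct architectures in a layer-wise NAS search space:
    the product of the number of candidate operations at each layer."""
    return prod(choices_per_layer)

# e.g. a 10-layer network with 4 candidate operations per layer
# already spans 4**10 = 1,048,576 architectures.
size = search_space_size([4] * 10)
```

Even a naive evaluation of one architecture per second would take days on this toy space, which is why shrinking or restructuring the search space directly reduces NAS cost.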
no code implementations • CVPR 2022 • Matteo Spallanzani, Gian Paolo Leonardi, Luca Benini
When testing ANA on the CIFAR-10 image classification benchmark, we find that the major impact on task accuracy comes not from the qualitative shape of the regularisations but from the proper synchronisation of the different STE variants used in a network, in accordance with the theoretical results.
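For context, a straight-through estimator (STE) trains through a non-differentiable quantizer by using the quantized values in the forward pass while substituting a surrogate gradient in the backward pass. A minimal NumPy sketch of one common ("clipped identity") variant — illustrative only, not the specific ANA formulation:

```python
import numpy as np

def quantize_forward(x):
    # Forward pass: binarise activations with the sign function,
    # which has zero gradient almost everywhere.
    return np.sign(x)

def ste_backward(x, grad_out):
    # Clipped-identity STE: pretend the quantizer is the identity,
    # but zero the gradient where the input left the range [-1, 1].
    return grad_out * (np.abs(x) <= 1.0)

x = np.array([-2.0, -0.5, 0.3, 1.5])
y = quantize_forward(x)               # quantized forward values
g = ste_backward(x, np.ones_like(x))  # surrogate gradient w.r.t. x
```

Different STE variants correspond to different surrogate gradients (identity, clipped, smoothed), which is what makes their consistent use across a network a meaningful design choice.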
1 code implementation • 7 Mar 2022 • Menelaos Kanakis, Simon Maurer, Matteo Spallanzani, Ajad Chhatkuli, Luc van Gool
Efficient detection and description of geometric regions in images is a prerequisite in visual systems for localization and mapping.
no code implementations • 27 Jan 2021 • Frank Hannig, Paolo Meloni, Matteo Spallanzani, Matthias Ziegler
This volume contains the papers accepted at the first DATE Friday Workshop on System-level Design Methods for Deep Learning on Heterogeneous Architectures (SLOHA 2021), held virtually on February 5, 2021.
no code implementations • 3 Nov 2020 • Gian Paolo Leonardi, Matteo Spallanzani
Research in computational deep learning has directed considerable effort towards hardware-oriented optimisations for deep neural networks, via the simplification of activation functions or the quantisation of both activations and weights.
1 code implementation • 24 May 2019 • Matteo Spallanzani, Lukas Cavigelli, Gian Paolo Leonardi, Marko Bertogna, Luca Benini
We present a theoretical and experimental investigation of the quantization problem for artificial neural networks.
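The quantization problem referred to here can be made concrete with a standard symmetric uniform quantizer, which maps real-valued weights onto a small integer grid plus a scale. This is a generic textbook sketch under assumed parameters, not the scheme analysed in the paper:

```python
import numpy as np

def uniform_quantize(w, num_bits=8):
    """Symmetric per-tensor uniform quantization: map weights to
    num_bits-wide integers plus a single dequantization scale."""
    qmax = 2 ** (num_bits - 1) - 1       # e.g. 127 for 8 bits
    scale = np.max(np.abs(w)) / qmax     # one scale for the whole tensor
    q = np.clip(np.round(w / scale), -qmax, qmax)
    return q.astype(np.int8), scale

w = np.array([0.5, -1.27, 0.01])
q, scale = uniform_quantize(w)
w_hat = q * scale  # dequantized approximation of w
```

The gap between `w` and `w_hat` is the quantization error whose effect on network accuracy such investigations seek to characterise.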