no code implementations • 20 Apr 2023 • Tao Sun, Bojian Yin, Sander Bohte
Spiking neural networks (SNNs) have gained attention as models of sparse and event-driven communication of biological neurons, and as such have shown increasing promise for energy-efficient applications in neuromorphic hardware.
no code implementations • 20 Dec 2021 • Bojian Yin, Federico Corradi, Sander M. Bohte
When combined with a novel dynamic spiking neuron model, the Liquid-Time-Constant neuron, we show that SNNs trained with FPTT outperform online BPTT approximations, and approach or exceed offline BPTT accuracy on temporal classification tasks.
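The entry above names a Liquid-Time-Constant spiking neuron, i.e. a spiking unit whose membrane time constant is itself input-dependent. A minimal illustrative sketch follows; the weight shapes, the sigmoid-gated tau range, and the reset-to-zero rule are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LTCSpikingNeuron:
    """Sketch of a spiking neuron layer with a dynamic (liquid)
    time constant computed from the current input. Illustrative only."""

    def __init__(self, n_in, n_hidden, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.normal(0.0, 0.5, (n_hidden, n_in))   # input weights
        self.w_tau = rng.normal(0.0, 0.5, (n_hidden, n_in))  # weights producing tau
        self.threshold = threshold
        self.v = np.zeros(n_hidden)  # membrane potentials

    def step(self, x, dt=1.0):
        # Input-dependent time constant in (1, 11); range is an assumption
        tau = 1.0 + 10.0 * sigmoid(self.w_tau @ x)
        current = self.w_in @ x
        # Leaky integration toward the input current with dynamic tau
        self.v += (dt / tau) * (current - self.v)
        # Emit binary spikes where the threshold is crossed, then reset
        spikes = (self.v >= self.threshold).astype(float)
        self.v = np.where(spikes > 0, 0.0, self.v)
        return spikes
```

Running `step` repeatedly on an input sequence yields a binary spike train per hidden unit; the learnable `w_tau` is what would be trained (e.g. with FPTT) so that each unit adapts its integration speed to the input.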
no code implementations • 12 Mar 2021 • Bojian Yin, Federico Corradi, Sander M. Bohte
Inspired by more detailed modeling of biological neurons, spiking neural networks (SNNs) have been investigated both as more biologically plausible and potentially more powerful models of neural computation, and with the aim of capturing biological neurons' energy efficiency; the performance of such networks has, however, remained lacking compared to classical artificial neural networks (ANNs).
Ranked #5 on Audio Classification on SSC
1 code implementation • 24 May 2020 • Bojian Yin, Federico Corradi, Sander M. Bohté
The emergence of brain-inspired neuromorphic computing as a paradigm for edge AI is motivating the search for high-performance and efficient spiking neural networks to run on this hardware.
no code implementations • 18 Feb 2019 • Bojian Yin, Siebren Schaafsma, Henk Corporaal, H. Steven Scholte, Sander M. Bohte
While modern convolutional neural networks achieve outstanding accuracy on many image classification tasks, they are much more sensitive to image degradation than humans.
1 code implementation • ICLR 2018 • Bojian Yin, Marleen Balvert, Davide Zambrano, Alexander Schönhuth, Sander Bohte
The folding structure of the DNA molecule, together with its helper molecules, collectively referred to as chromatin, is highly relevant to the functional properties of DNA.