1 code implementation • 27 May 2023 • Yuhang Li, Abhishek Moitra, Tamar Geller, Priyadarshini Panda
Although the efficiency of Spiking Neural Networks (SNNs) can be realized on In-Memory Computing (IMC) architectures, we show that their energy cost and latency scale linearly with the number of timesteps when deployed on IMC hardware.
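As a rough illustration of this scaling: an IMC accelerator re-executes the same crossbar operations once per timestep, so total cost grows as O(T). The snippet below is a minimal first-order cost model, not the paper's actual hardware analysis; the per-timestep constants `E_STEP_NJ` and `LAT_STEP_US` and the function `imc_snn_cost` are hypothetical placeholders.

```python
# First-order cost model: an SNN on IMC hardware repeats the same
# crossbar operations every timestep, so total cost is linear in T.
E_STEP_NJ = 4.2    # hypothetical energy per timestep (nJ)
LAT_STEP_US = 1.5  # hypothetical latency per timestep (us)

def imc_snn_cost(timesteps: int) -> tuple[float, float]:
    """Return (energy in nJ, latency in us) for a T-timestep inference."""
    return timesteps * E_STEP_NJ, timesteps * LAT_STEP_US

for t in (1, 2, 4, 8):
    e, lat = imc_snn_cost(t)
    print(f"T={t}: energy={e:.1f} nJ, latency={lat:.1f} us")
```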
1 code implementation • 2 Apr 2023 • Yuhang Li, Tamar Geller, Youngeun Kim, Priyadarshini Panda
However, we observe that the information capacity of SNNs is affected by the number of timesteps, leading to an accuracy-efficiency tradeoff.
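The intuition behind this tradeoff can be seen in rate coding: with T timesteps, an analog activation is approximated by the mean of T binary spikes, so the estimate gets more precise as T grows while compute cost grows with it. The sketch below illustrates this effect with Poisson-style Bernoulli rate coding; it is an illustrative toy, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(size=10_000)  # analog activations in [0, 1]

# Rate coding: each timestep emits a Bernoulli spike with probability x;
# the decoded value is the firing rate (mean spike count over T steps).
for T in (1, 2, 4, 8, 16, 32):
    spikes = rng.random((T, x.size)) < x   # (T, N) binary spike trains
    decoded = spikes.mean(axis=0)          # firing-rate estimate of x
    rmse = np.sqrt(np.mean((decoded - x) ** 2))
    print(f"T={T:3d}: RMSE={rmse:.4f}")    # error shrinks, cost grows with T
```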
1 code implementation • 11 Mar 2022 • Yuhang Li, Youngeun Kim, Hyoungseob Park, Tamar Geller, Priyadarshini Panda
To minimize this generalization gap, we propose Neuromorphic Data Augmentation (NDA), a family of geometric augmentations designed specifically for event-based datasets, aimed at stabilizing SNN training and reducing the gap between training and test performance (a sketch of such augmentations follows this entry).
Ranked #1 on Event data classification on CIFAR10-DVS (using extra training data)
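The sketch below shows the flavor of geometric augmentation NDA applies, assuming events are binned into frames of shape (T, C, H, W) and the same transform is applied at every timestep. The specific operation set, magnitudes, and the helper name `nda_like_augment` are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def nda_like_augment(frames: np.ndarray) -> np.ndarray:
    """Apply one random geometric augmentation to an event-frame tensor.

    `frames` has shape (T, C, H, W); the same transform is applied to
    every timestep so the temporal structure of the events is preserved.
    Augmentation choices and magnitudes here are illustrative only.
    """
    T, C, H, W = frames.shape
    out = frames.copy()
    op = rng.choice(["flip", "roll", "cutout"])
    if op == "flip":                      # horizontal flip
        out = out[..., ::-1]
    elif op == "roll":                    # random translation via rolling
        dy, dx = rng.integers(-H // 8, H // 8 + 1, size=2)
        out = np.roll(out, shift=(dy, dx), axis=(2, 3))
    else:                                 # cutout: zero a random square
        s = H // 4
        y, x = rng.integers(0, H - s), rng.integers(0, W - s)
        out[..., y:y + s, x:x + s] = 0
    return out

# Example: augment a random 10-timestep, 2-polarity event clip.
clip = rng.random((10, 2, 48, 48)).astype(np.float32)
aug = nda_like_augment(clip)
print(clip.shape, aug.shape)
```

Applying one transform uniformly across the time axis is the key design point: per-timestep transforms would break the spatial consistency of an event stream.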