no code implementations • 7 May 2024 • Mohit Upadhyay, Rohan Juneja, Weng-Fai Wong, Li-Shiuan Peh
Attention mechanisms are becoming increasingly popular, appearing in neural network models across domains such as natural language processing (NLP) and vision, especially at the edge.
no code implementations • 20 Feb 2024 • Zhanglu Yan, Weiran Chu, Yuhua Sheng, Kaiwen Tang, Shida Wang, Yanfeng Liu, Weng-Fai Wong
The NCS optimization problem is to find an NCS that maximizes gene expression.
no code implementations • 16 Aug 2023 • Zhanglu Yan, Shida Wang, Kaiwen Tang, Weng-Fai Wong
In light of the increasing adoption of edge computing in areas such as intelligent furniture, robotics, and smart homes, this paper introduces HyperSNN, an innovative method for control tasks that uses spiking neural networks (SNNs) in combination with hyperdimensional computing.
no code implementations • 9 May 2023 • Myat Thu Linn Aung, Daniel Gerlinghoff, Chuping Qu, Liwei Yang, Tian Huang, Rick Siow Mong Goh, Tao Luo, Weng-Fai Wong
Brain-inspired spiking neural networks (SNNs) replace the multiply-accumulate operations of traditional neural networks with integrate-and-fire neurons, with the goal of achieving greater energy efficiency.
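The replacement of multiply-accumulate with integrate-and-fire can be illustrated with a minimal sketch. This is a generic toy neuron model for intuition, not the architecture proposed in the paper; the soft-reset behavior and threshold value are assumptions.

```python
import numpy as np

def integrate_and_fire(inputs, weights, threshold=1.0):
    """Toy integrate-and-fire neuron over a window of timesteps.

    inputs: binary spike array of shape (steps, n_inputs)
    weights: weight vector of shape (n_inputs,)
    Returns the output spike train (one bit per timestep).
    """
    potential = 0.0
    out_spikes = []
    for spikes_t in inputs:
        # Additions only: incoming spikes gate which weights are summed,
        # so no multiply-accumulate is needed for binary activations.
        potential += weights[spikes_t == 1].sum()
        if potential >= threshold:
            out_spikes.append(1)
            potential -= threshold  # soft reset (an assumption of this sketch)
        else:
            out_spikes.append(0)
    return out_spikes
```

Because activations are binary events, each timestep costs only conditional additions, which is the source of the energy advantage the abstract refers to.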
1 code implementation • 26 Jan 2023 • Zhanglu Yan, Shida Wang, Kaiwen Tang, Weng-Fai Wong
Hyperdimensional computing (HDC) is a classification method that uses high-dimensional binary vectors and the majority rule.
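The core HDC operations the abstract mentions can be sketched in a few lines. This uses the equivalent bipolar (+1/−1) representation of binary hypervectors; the dimensionality and similarity measure are illustrative choices, not the paper's specific design.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (typical HDC scale)

def random_hv():
    # Random bipolar hypervector; equivalent to a {0,1} vector under XOR algebra.
    return rng.choice([-1, 1], size=D)

def bundle(hvs):
    # Majority rule: element-wise sign of the sum of the hypervectors.
    return np.sign(np.sum(hvs, axis=0))

def similarity(a, b):
    # Normalized dot product: ~1 for identical, ~0 for unrelated vectors.
    return a @ b / D

# A class prototype bundled from three "training" hypervectors
x1, x2, x3 = random_hv(), random_hv(), random_hv()
proto = bundle([x1, x2, x3])
```

A bundled prototype stays measurably similar to each of its constituents while remaining near-orthogonal to random vectors, which is what makes majority-rule classification work at high dimensions.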
1 code implementation • 10 Nov 2022 • Daniel Gerlinghoff, Tao Luo, Rick Siow Mong Goh, Weng-Fai Wong
Spiking neural networks (SNNs) are a viable alternative to conventional artificial neural networks when resource efficiency and computational complexity are of importance.
no code implementations • 27 Oct 2022 • Zhanglu Yan, Jun Zhou, Weng-Fai Wong
The maximum number of spikes in this time window is also the latency of the network for a single inference, and it determines the overall energy efficiency of the model.
no code implementations • 1 Dec 2021 • Zhehui Wang, Tao Luo, Rick Siow Mong Goh, Wei Zhang, Weng-Fai Wong
In-memory deep learning has already demonstrated orders-of-magnitude higher performance density and energy efficiency than conventional approaches.
no code implementations • 29 Sep 2021 • Tao Luo, Zhehui Wang, Daniel Gerlinghoff, Rick Siow Mong Goh, Weng-Fai Wong
In this paper, we propose BLUnet, a table lookup-based DNN model with bit-serialized input to overcome this challenge.
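The general idea of table-lookup inference over bit-serialized inputs can be sketched as follows. This is a generic illustration of the technique, not the actual BLUnet design; the LUT layout and unsigned fixed-point encoding are assumptions of this sketch.

```python
import numpy as np

def build_lut(weights):
    """Precompute a table mapping every binary input pattern to its
    dot product with the weight vector (2**n entries for n weights)."""
    n = len(weights)
    lut = np.zeros(2 ** n)
    for pattern in range(2 ** n):
        bits = [(pattern >> i) & 1 for i in range(n)]
        lut[pattern] = np.dot(bits, weights)
    return lut

def bit_serial_dot(x_bits, lut):
    """Consume an unsigned fixed-point input one bit-plane at a time.

    x_bits[b] holds the b-th bit (LSB first) of each input element.
    Each plane indexes the LUT, and planes are combined by shifting,
    so no general multiplications run at inference time.
    """
    total = 0.0
    for b, plane in enumerate(x_bits):
        idx = sum(bit << i for i, bit in enumerate(plane))
        total += lut[idx] * (1 << b)  # power-of-two scale, i.e. a shift
    return total
```

Trading precomputed table storage for arithmetic in this way is what makes the approach attractive for in-memory or resource-constrained hardware.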
no code implementations • 25 May 2021 • Tao Luo, Wai Teng Tang, Matthew Kay Fei Lee, Chuping Qu, Weng-Fai Wong, Rick Goh
DTNN achieved significant energy savings (19.4X and 64.9X improvement on ResNet-18 and VGG-11 with ImageNet, respectively) with negligible loss of accuracy.
1 code implementation • 25 Nov 2019 • Bo Wang, Jun Zhou, Weng-Fai Wong, Li-Shiuan Peh
We show that conventional artificial neural networks (ANNs) such as multilayer perceptrons, convolutional neural networks, and the latest residual neural networks can be mapped successfully onto Shenjing, realizing ANNs with an SNN's energy efficiency.