no code implementations • ECCV 2020 • Zilong Ji, Xiaolong Zou, Xiaohan Lin, Xiao Liu, Tiejun Huang, Si Wu
By iteratively learning with the two strategies, the attentive regions are gradually shifted from the background to the foreground and the features become more discriminative.
no code implementations • NeurIPS 2021 • Xingsi Dong, Tianhao Chu, Tiejun Huang, Zilong Ji, Si Wu
To elucidate the underlying mechanism clearly, we first study continuous attractor neural networks (CANNs), and find that noisy neural adaptation, exemplified by spike frequency adaptation (SFA) in this work, can generate Lévy flights representing transitions of the network state in the attractor space.
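For illustration, a minimal NumPy sketch of the kind of model this abstract describes: a 1D ring CANN with spike frequency adaptation and noise, whose bump position can be inspected for heavy-tailed (Levy-flight-like) jumps. All parameter values and variable names below are assumptions for the sketch, not taken from the paper.

import numpy as np

# Ring of N neurons on (-pi, pi] with Gaussian recurrent connectivity (assumed setup)
N = 200
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
a = 0.5                                    # connection width (assumed)
diff = np.abs(x[:, None] - x[None, :])
dist = np.minimum(diff, 2 * np.pi - diff)  # periodic distance on the ring
J = np.exp(-dist**2 / (2 * a**2)) / (np.sqrt(2 * np.pi) * a)

# Dynamics parameters (illustrative, not fitted to the paper)
tau, tau_v = 1.0, 50.0                     # time constants of input u and adaptation v
k, m, sigma = 0.05, 0.3, 0.1               # global inhibition, SFA strength, noise level
dt, steps = 0.1, 20000

u = np.exp(-x**2 / (4 * a**2))             # start with a bump centered at 0
v = np.zeros(N)                            # spike frequency adaptation variable
rng = np.random.default_rng(0)
center = np.empty(steps)

for t in range(steps):
    r = np.maximum(u, 0.0)**2
    r /= 1.0 + k * dx * r.sum()            # divisive global inhibition
    du = (-u + dx * (J @ r) - v) / tau
    dv = (-v + m * u) / tau_v
    u += dt * du + sigma * np.sqrt(dt) * rng.standard_normal(N)
    v += dt * dv
    center[t] = np.angle(np.sum(r * np.exp(1j * x)))   # population-vector bump position

step_sizes = np.abs(np.diff(np.unwrap(center)))
# With noisy SFA the step-size distribution is expected to be heavy-tailed
# (Levy-flight-like); with m = 0 the bump motion stays closer to Brownian.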
no code implementations • 23 Aug 2020 • Zilong Ji, Xiaolong Zou, Tiejun Huang, Si Wu
In this study, we build a computational model to elucidate the computational advantages associated with the interactions between two pathways.
no code implementations • 20 Dec 2019 • Zilong Ji, Xiaolong Zou, Tiejun Huang, Si Wu
The proposed model consists of two alternate processes, progressive clustering and episodic training.
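As a rough illustration of such an alternation, here is a hedged Python sketch: cluster the current embeddings to obtain pseudo-labels, then sample few-shot episodes from those pseudo-classes for training, and repeat. The function names (embed_fn, update_fn), clustering choice, and hyperparameters are assumptions for the sketch, not the paper's actual procedure.

import numpy as np
from sklearn.cluster import KMeans

def clustering_then_episodic_training(features, embed_fn, update_fn,
                                      n_clusters=64, n_rounds=5,
                                      n_way=5, n_shot=1, n_query=5,
                                      episodes_per_round=100, seed=0):
    """Alternate (1) clustering of current embeddings into pseudo-classes and
    (2) episodic training on N-way K-shot tasks sampled from those pseudo-classes."""
    rng = np.random.default_rng(seed)
    for _ in range(n_rounds):
        # Step 1: pseudo-label the data with the current embedding.
        z = embed_fn(features)
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(z)

        # Step 2: build episodes from pseudo-classes and take one update per episode.
        for _ in range(episodes_per_round):
            classes = rng.choice(n_clusters, size=n_way, replace=False)
            support, query = [], []
            for c in classes:
                idx = np.flatnonzero(labels == c)
                if len(idx) < n_shot + n_query:
                    continue
                picked = rng.choice(idx, size=n_shot + n_query, replace=False)
                support.append(picked[:n_shot])
                query.append(picked[n_shot:])
            update_fn(support, query)   # e.g. a prototypical-network-style gradient step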
1 code implementation • NeurIPS 2019 • Xiao Liu, Xiaolong Zou, Zilong Ji, Gengshuo Tian, Yuanyuan Mi, Tiejun Huang, K. Y. Michael Wong, Si Wu
Experimental data have revealed that, in addition to feedforward connections, abundant feedback connections exist in a neural pathway.
no code implementations • 25 Sep 2019 • Zilong Ji, Xiaolong Zou, Tiejun Huang, Si Wu
Using the benchmark dataset Omniglot, we show that our model outperforms other unsupervised few-shot learning methods by a large margin and approaches the performance of supervised methods.
no code implementations • 28 Jul 2019 • Yuanyuan Mi, Xiaohan Lin, Xiaolong Zou, Zilong Ji, Tiejun Huang, Si Wu
Spatiotemporal information processing is fundamental to brain functions.