no code implementations • 25 Jan 2024 • Chuankun Li, Shuai Li, Yanbo Gao, Ping Chen, Jian Li, Wanqing Li
To address this problem, the overfitting mechanism behind unsupervised learning for skeleton-based action recognition is first investigated.
no code implementations • 21 Sep 2023 • Yanbo Gao, Wenjia Huang, Shuai Li, Hui Yuan, Mao Ye, Siwei Ma
Similar to traditional video coding, learned video coding (LVC) inherits motion estimation/compensation, residual coding, and other modules, all of which are implemented with neural networks (NNs).
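As a toy illustration of this inherited module structure (not the paper's actual networks), the sketch below mimics one coding step with classical stand-ins: a brute-force translational motion search in place of the neural motion-estimation module, and uniform residual quantization in place of the neural residual coder. All function names and the global-translation motion model are assumptions for illustration only.

```python
import numpy as np

def estimate_motion(ref, cur, max_shift=2):
    """Brute-force global translational motion search (toy stand-in
    for a learned motion-estimation network)."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.abs(np.roll(ref, (dy, dx), axis=(0, 1)) - cur).sum()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def code_frame(ref, cur, step=4.0):
    """Motion compensation plus uniform residual quantization (toy
    stand-in for a learned residual-coding network)."""
    mv = estimate_motion(ref, cur)
    pred = np.roll(ref, mv, axis=(0, 1))       # motion-compensated prediction
    residual_q = np.round((cur - pred) / step)  # quantized residual to transmit
    recon = pred + residual_q * step            # decoder-side reconstruction
    return recon, mv
```

When the current frame really is a shifted copy of the reference, the motion search absorbs all the signal and the residual is zero, which is the intuition behind splitting prediction and residual coding.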
no code implementations • 24 Jun 2021 • Jian Yue, Yanbo Gao, Shuai Li, Hui Yuan, Frédéric Dufaux
To the best of our knowledge, we are the first to clearly characterize the video filtering process from the above global appearance and local coding distortion restoration aspects, with experimental verification, providing a clear pathway to developing filtering techniques.
no code implementations • 22 Jan 2021 • Chuankun Li, Shuai Li, Yanbo Gao, Xiang Zhang, Wanqing Li
The self-attention-based graph convolutional network uses a dynamic self-attention mechanism to adaptively exploit the relationships among all hand joints, in addition to the fixed topology and local feature extraction of the GCN.
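One minimal way to sketch this idea (an illustrative reconstruction, not the paper's architecture): compute a dynamic adjacency from the joint features via scaled dot-product attention and aggregate with it alongside the fixed skeleton adjacency. The layer shape and projection names below are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_gcn_layer(X, A_fixed, Wq, Wk, Wv):
    """One hypothetical layer combining fixed-topology graph convolution
    with a dynamic self-attention adjacency over all joints.
    X: (J, C) per-joint features; A_fixed: (J, J) normalized skeleton adjacency."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A_dyn = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (J, J) relations among all joints
    return A_fixed @ V + A_dyn @ V  # local (fixed graph) + global (attention) context
```

The fixed adjacency captures the physical hand skeleton; the attention term lets, e.g., a thumb tip attend to a non-adjacent fingertip when their features are related.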
no code implementations • 1 Nov 2020 • Beidi Zhao, Shuai Li, Yanbo Gao, Chuankun Li, Wanqing Li
Human activity recognition based on smartphone sensors is attracting increasing interest with the popularization of smartphones.
1 code implementation • 11 Oct 2019 • Shuai Li, Wanqing Li, Chris Cook, Yanbo Gao
Recurrent neural networks (RNNs) are known to be difficult to train due to the vanishing and exploding gradient problems, which makes it hard to learn long-term patterns and to construct deep networks.
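The gradient problem can be seen numerically: backpropagating through T steps multiplies by the recurrent Jacobian T times, so its magnitude decays or grows geometrically. A minimal sketch with a scalar, linear recurrence (a deliberate simplification for illustration):

```python
import numpy as np

def bptt_gradient_magnitude(w_rec, T):
    """Magnitude of d h_T / d h_0 for the linear scalar RNN
    h_t = w_rec * h_{t-1}: it is exactly |w_rec| ** T, so it
    vanishes for |w_rec| < 1 and explodes for |w_rec| > 1."""
    return abs(w_rec) ** T

vanish = bptt_gradient_magnitude(0.9, 100)   # vanishes for w_rec below 1
explode = bptt_gradient_magnitude(1.1, 100)  # explodes for w_rec above 1
```

This geometric behavior is why plain RNNs struggle with long-term dependencies and motivates architectures that constrain the recurrent weights.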
11 code implementations • CVPR 2018 • Shuai Li, Wanqing Li, Chris Cook, Ce Zhu, Yanbo Gao
Experimental results have shown that the proposed IndRNN is able to process very long sequences (over 5000 time steps), can be used to construct very deep networks (21 layers used in the experiment) and still be trained robustly.
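The IndRNN recurrence itself is simple: each neuron carries a single scalar recurrent weight, so neurons within a layer are independent of one another, h_t = relu(W x_t + u ⊙ h_{t-1} + b). The following is a minimal forward-pass sketch of that recurrence (not the paper's full training code; shapes and initialization are assumptions):

```python
import numpy as np

def indrnn_forward(X, W, u, b):
    """Run one IndRNN layer over a sequence.
    Each neuron n has its own scalar recurrent weight u[n], so neurons in a
    layer do not interact through the recurrence:
        h_t = relu(x_t @ W + u * h_{t-1} + b)
    X: (T, input_dim); returns all hidden states, shape (T, hidden_dim)."""
    T, hidden_dim = X.shape[0], W.shape[1]
    H = np.zeros((T, hidden_dim))
    h = np.zeros(hidden_dim)
    for t in range(T):
        h = np.maximum(0.0, X[t] @ W + u * h + b)  # elementwise recurrence + ReLU
        H[t] = h
    return H
```

Because the recurrence is elementwise, constraining each |u[n]| (to a value near 1 chosen from the target sequence length) directly bounds the per-neuron gradient over time, which is what allows stable training over thousands of steps.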
Ranked #10 on Language Modelling on Penn Treebank (Character Level)
no code implementations • 16 Jun 2017 • Shuai Li, Wanqing Li, Chris Cook, Ce Zhu, Yanbo Gao
Such a network with learnable pooling function is referred to as a fully trainable network (FTN).