19 Jul 2022 • Zhifeng Qiu, Wanxin Zeng, Dahua Liao, Ning Gui
Guided by the information integrated from the multiple self-supervised learning models, a batch-attention mechanism generates feature weights from batch-level feature-selection patterns, alleviating the impact of a handful of noisy samples.
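The abstract does not specify how the batch-attention weights are computed; the following is a minimal, hypothetical sketch of the general idea — scoring each feature from statistics aggregated over a batch, then normalizing the scores into attention weights that re-weight the features. The scoring function and temperature are illustrative assumptions, not the paper's method.

```python
import numpy as np

def batch_attention_weights(X, temperature=1.0):
    """Hypothetical batch-attention sketch (not the paper's exact method).

    X: array of shape (batch, n_features).
    Each feature is scored by its mean absolute activation across the
    batch, and a softmax turns the scores into weights summing to 1.
    """
    # Aggregate a per-feature score over the whole batch
    scores = np.abs(X).mean(axis=0)
    # Softmax over features (shifted by the max for numerical stability)
    z = (scores - scores.max()) / temperature
    w = np.exp(z)
    return w / w.sum()

# Toy batch: feature 2 is consistently strong, feature 1 near-silent
X = np.array([[1.0, 0.1, 3.0],
              [1.2, 0.0, 2.8]])
w = batch_attention_weights(X)
weighted_X = X * w  # re-weight features before downstream selection
```

Because the weights are computed from batch-level aggregates rather than individual samples, a few noisy rows have limited influence on the resulting feature weighting, which matches the robustness claim in the abstract.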