Flowformer: Linearizing Transformers with Conservation Flows
1 code implementation • 13 Feb 2022 • Haixu Wu, Jialong Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long
By conserving the incoming flow of sinks (which drives source competition) and the outgoing flow of sources (which drives sink allocation), Flow-Attention inherently generates informative attention weights without relying on specific inductive biases.
Ranked #4 on D4RL
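Viewed as a flow network, queries act as sinks receiving flow from key/value sources, and attention arises from conserving flow on both sides. Below is a minimal, non-causal, single-head sketch of that conservation scheme; the function `flow_attention` and its variable names are ours for illustration, not the authors' released code.

```python
import torch

def flow_attention(Q, K, V, eps=1e-6):
    """Linear-complexity attention with flow conservation (illustrative sketch).

    Q: (n, d) sinks, K: (m, d) sources, V: (m, d_v) source values.
    """
    Q, K = torch.sigmoid(Q), torch.sigmoid(K)      # non-negative flow capacities
    incoming = Q @ K.sum(dim=0) + eps              # (n,) flow into each sink
    outgoing = K @ Q.sum(dim=0) + eps              # (m,) flow out of each source
    # Re-measure each side's flow after normalizing the other side to unit flow.
    conserved_in = Q @ (K / outgoing[:, None]).sum(dim=0)    # (n,)
    conserved_out = K @ (Q / incoming[:, None]).sum(dim=0)   # (m,)
    competition = torch.softmax(conserved_out, dim=0) * K.shape[0]  # sources compete
    allocation = torch.sigmoid(conserved_in)                        # sinks allocate
    # Kernelized attention in O(n * d * d_v); the n x m map is never formed.
    out = (Q / incoming[:, None]) @ (K.T @ (V * competition[:, None]))
    return out * allocation[:, None]

# toy usage: self-attention over a length-128 sequence
x = torch.randn(128, 64)
y = flow_attention(x, x, x)    # (128, 64)
```

Because `K.T @ V` is contracted before touching the queries, the cost grows linearly with sequence length rather than quadratically.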
Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy
3 code implementations • ICLR 2022 • Jiehui Xu, Haixu Wu, Jianmin Wang, Mingsheng Long
Unsupervised detection of anomalous points in time series is a challenging problem: the model must derive a criterion that reliably distinguishes anomalies from normal points.
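The criterion this paper proposes is the association discrepancy: rare anomalies concentrate their attention on adjacent time points, so their learned series-association stays unusually close to a local prior. A hedged sketch of the resulting scoring rule follows, assuming the per-point discrepancy `assoc_disc` has already been produced by the model; the helper name is ours.

```python
import torch

def anomaly_score(x, x_hat, assoc_disc):
    """Score each time point of a window (illustrative sketch).

    x, x_hat: (T, d) input window and its reconstruction.
    assoc_disc: (T,) association discrepancy per point, assumed precomputed.
    """
    recon_err = ((x - x_hat) ** 2).sum(dim=-1)   # (T,) pointwise reconstruction error
    # Low discrepancy is characteristic of anomalies, so softmax(-disc)
    # up-weights the reconstruction error exactly at suspicious points.
    return torch.softmax(-assoc_disc, dim=0) * recon_err
```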
Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting
2 code implementations • NeurIPS 2021 • Haixu Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long
Going beyond Transformers, we design Autoformer as a novel decomposition architecture with an Auto-Correlation mechanism.
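The two ingredients are (i) series decomposition, which separates trend from seasonal variation with a moving average, and (ii) Auto-Correlation, which discovers dominant period delays via FFT (Wiener-Khinchin) and aggregates the series rolled by those delays. A single-series sketch under those assumptions; function names are ours, and the released code is batched and multi-headed.

```python
import torch

def series_decomp(x, kernel_size=25):
    # x: (L, d). A moving average extracts the trend; the residual is seasonal.
    pad_front = x[:1].repeat((kernel_size - 1) // 2, 1)
    pad_end = x[-1:].repeat(kernel_size // 2, 1)
    padded = torch.cat([pad_front, x, pad_end], dim=0)
    trend = padded.unfold(0, kernel_size, 1).mean(dim=-1)   # (L, d)
    return x - trend, trend                                 # seasonal, trend

def auto_correlation(q, k, v, top_k=3):
    # q, k, v: (L, d). Autocorrelation of q against k via FFT in O(L log L).
    L = q.shape[0]
    corr = torch.fft.irfft(
        torch.fft.rfft(q, dim=0) * torch.conj(torch.fft.rfft(k, dim=0)),
        n=L, dim=0)
    weights, delays = torch.topk(corr.mean(dim=-1), top_k)  # dominant lags
    weights = torch.softmax(weights, dim=0)
    # Time-delay aggregation: blend v rolled by each dominant period lag.
    out = torch.zeros_like(v)
    for w, d in zip(weights, delays):
        out = out + w * torch.roll(v, -int(d), dims=0)
    return out
```

Replacing point-wise attention with this lag-wise aggregation is what lets the model connect whole periods of the series rather than individual time steps.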