1 code implementation • 9 Apr 2024 • Xiaolong Tang, Meina Kan, Shiguang Shan, Zhilong Ji, Jinfeng Bai, Xilin Chen
The proposed Historical Prediction Attention, together with the Agent Attention and Mode Attention, is formulated as the Triple Factorized Attention module, which serves as the core design of HPNet. Experiments on the Argoverse and INTERACTION datasets show that HPNet achieves state-of-the-art performance and generates accurate, stable future trajectories.
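The abstract only names the three attention branches, so the following is a minimal, hypothetical sketch of what "factorizing" attention over three axes can look like: plain dot-product self-attention applied one axis at a time over a tensor shaped [agents][modes][steps][features]. The axis order, the lack of learned projections, and the tensor layout are all assumptions for illustration, not HPNet's actual implementation.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attend(seq):
    """Plain dot-product self-attention over a list of feature vectors.
    Queries, keys, and values are the inputs themselves (no learned
    projections, to keep the sketch minimal)."""
    d = len(seq[0])
    out = []
    for q in seq:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in seq]
        w = softmax(scores)
        out.append([sum(wj * v[i] for wj, v in zip(w, seq))
                    for i in range(d)])
    return out

def triple_factorized_attention(x):
    """Hypothetical factorized attention over x[agent][mode][step][feature]:
    attend across agents, then across modes, then across the
    historical-prediction (step) axis, one axis at a time."""
    A, M, T = len(x), len(x[0]), len(x[0][0])
    for m in range(M):            # agent axis
        for t in range(T):
            col = attend([x[a][m][t] for a in range(A)])
            for a in range(A):
                x[a][m][t] = col[a]
    for a in range(A):            # mode axis
        for t in range(T):
            col = attend([x[a][m][t] for m in range(M)])
            for m in range(M):
                x[a][m][t] = col[m]
    for a in range(A):            # historical-prediction (step) axis
        for m in range(M):
            row = attend([x[a][m][t] for t in range(T)])
            for t in range(T):
                x[a][m][t] = row[t]
    return x
```

Factorizing this way keeps each attention call linear in one axis length rather than quadratic in the full agents × modes × steps product.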
no code implementations • 7 Mar 2023 • Xiaolong Tang, Tianheng Hu, Yufeng Shi
We introduce information capacity, a metric that represents the amount of information contained in a filter.