Humans in 4D: Reconstructing and Tracking Humans with Transformers

We present an approach to reconstruct humans and track them over time. At the core of our approach, we propose a fully "transformerized" version of a network for human mesh recovery. This network, HMR 2.0, advances the state of the art and shows the capability to analyze unusual poses that have in the past been difficult to reconstruct from single images. To analyze video, we use 3D reconstructions from HMR 2.0 as input to a tracking system that operates in 3D. This enables us to deal with multiple people and maintain identities through occlusion events. Our complete approach, 4DHumans, achieves state-of-the-art results for tracking people from monocular video. Furthermore, we demonstrate the effectiveness of HMR 2.0 on the downstream task of action recognition, achieving significant improvements over previous pose-based action recognition approaches. Our code and models are available on the project website: https://shubham-goel.github.io/4dhumans/.
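To make the architecture described above concrete, the sketch below shows the general shape of a "fully transformerized" mesh-recovery network: a ViT backbone produces image tokens, and a transformer decoder cross-attends to them from a single query token to regress SMPL pose, shape, and camera parameters. This is an illustrative sketch, not the released HMR 2.0 code; the layer widths, the 6D rotation output, and the single-query decoder layout are assumptions about the design the abstract describes.

```python
# Minimal sketch (PyTorch) of a transformer decoder head for human mesh recovery.
# Assumes a ViT backbone has already produced a sequence of image tokens.
import torch
import torch.nn as nn

class MeshRecoveryHead(nn.Module):
    def __init__(self, token_dim=1280, hidden_dim=1024, n_layers=6, n_heads=8):
        super().__init__()
        # Single learnable query token that gathers evidence from the image tokens.
        self.query = nn.Parameter(torch.zeros(1, 1, hidden_dim))
        self.in_proj = nn.Linear(token_dim, hidden_dim)
        layer = nn.TransformerDecoderLayer(
            d_model=hidden_dim, nhead=n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=n_layers)
        # Illustrative output heads: 24 joint rotations (6D representation),
        # 10 SMPL shape coefficients, 3 weak-perspective camera parameters.
        self.pose_out = nn.Linear(hidden_dim, 24 * 6)
        self.shape_out = nn.Linear(hidden_dim, 10)
        self.cam_out = nn.Linear(hidden_dim, 3)

    def forward(self, vit_tokens):               # vit_tokens: (B, N, token_dim)
        memory = self.in_proj(vit_tokens)        # project ViT tokens to decoder width
        q = self.query.expand(vit_tokens.shape[0], -1, -1)
        feat = self.decoder(q, memory)[:, 0]     # cross-attend, keep the single output token
        return self.pose_out(feat), self.shape_out(feat), self.cam_out(feat)
```

The predicted parameters can then be passed through the SMPL body model to obtain a 3D mesh; for video, those per-frame 3D reconstructions are what the tracking stage links over time.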

Published at ICCV 2023.

Results from the Paper


Task                      Dataset        Model              Metric          Value   Global Rank
3D Human Pose Estimation  3DPW           HMR 2.0            PA-MPJPE (mm)   44.4    #29
                                                            MPJPE (mm)      69.8    #22
                                                            MPVPE (mm)      82.2    #19
3D Human Pose Estimation  Human3.6M      HMR 2.0a           MPJPE (mm)      44.8    #105
                                                            PA-MPJPE (mm)   33.6    #17
Pose Tracking             PoseTrack2018  4DHumans + ViTDet  MOTA            61.9    #3
                                                            IDF1            79.3    #1
                                                            ID switches     367     #2
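For reference, the joint-error metrics above are mean Euclidean distances between predicted and ground-truth 3D joints (MPJPE), the same error after Procrustes alignment, which removes global scale, rotation, and translation (PA-MPJPE), and the analogous per-vertex error on the mesh surface (MPVPE). The sketch below is a generic implementation of the first two; the exact joint sets and evaluation protocol follow each benchmark, not this code.

```python
# Generic sketch of MPJPE and PA-MPJPE in NumPy (errors in millimetres).
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error. pred, gt: (J, 3) joint positions."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def pa_mpjpe(pred, gt):
    """MPJPE after Procrustes (similarity) alignment of pred onto gt."""
    mu_p, mu_g = pred.mean(0), gt.mean(0)
    p, g = pred - mu_p, gt - mu_g            # center both point sets
    U, S, Vt = np.linalg.svd(g.T @ p)        # optimal rotation via SVD
    R = U @ Vt
    if np.linalg.det(R) < 0:                 # avoid reflections
        U[:, -1] *= -1
        S[-1] *= -1
        R = U @ Vt
    scale = S.sum() / (p ** 2).sum()         # optimal isotropic scale
    aligned = scale * p @ R.T + mu_g         # aligned prediction
    return mpjpe(aligned, gt)
```

MOTA, IDF1, and ID switches are the standard multi-object tracking metrics; IDF1 and the low number of identity switches reflect the paper's claim that tracking in 3D helps maintain identities through occlusions.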
