Exploring Classification Equilibrium in Long-Tailed Object Detection

ICCV 2021 · Chengjian Feng, Yujie Zhong, Weilin Huang

Conventional detectors tend to produce imbalanced classification and suffer a performance drop when the distribution of the training data is severely skewed. In this paper, we propose to use the mean classification score to indicate the classification accuracy of each category during training. Based on this indicator, we balance the classification via an Equilibrium Loss (EBL) and a Memory-augmented Feature Sampling (MFS) method. Specifically, EBL strengthens the adjustment of the decision boundary for weak classes through a designed score-guided loss margin between any two classes. MFS, in turn, improves the frequency and accuracy of these boundary adjustments by over-sampling the instance features of the weak classes. EBL and MFS therefore work collaboratively to find the classification equilibrium in long-tailed detection, dramatically improving the performance of tail classes while maintaining or even improving that of head classes. We conduct experiments on LVIS using Mask R-CNN with various backbones, including ResNet-50-FPN and ResNet-101-FPN, to show the superiority of the proposed method. It improves the detection performance of tail classes by 15.6 AP and outperforms the most recent long-tailed object detectors by more than 1 AP. Code is available at https://github.com/fcjian/LOCE.
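As a rough illustration of the two mechanisms described in the abstract, the sketch below is a hypothetical PyTorch implementation, not the authors' released code: it tracks an exponential moving average of the mean classification score per class, uses that score as a margin in a cross-entropy loss so that stronger classes receive a larger push, and keeps a per-class feature memory bank from which weak-class instance features can be re-sampled. The names (MeanScoreTracker, equilibrium_margin_loss, FeatureMemory), the EMA momentum, and the exact margin formula are assumptions made for illustration only.

    # Hypothetical sketch of score tracking, a score-guided margin loss, and a
    # per-class feature memory bank. Not the authors' implementation.
    import torch
    import torch.nn.functional as F
    from collections import deque

    class MeanScoreTracker:
        """EMA of the mean classification score per (foreground) class."""
        def __init__(self, num_classes, momentum=0.99):
            self.scores = torch.zeros(num_classes)
            self.momentum = momentum

        def update(self, logits, labels):
            # logits: (N, C), labels: (N,) foreground class indices in [0, C)
            probs = logits.detach().softmax(dim=1)
            for c in labels.unique():
                mean_score = probs[labels == c, c].mean().item()
                self.scores[c] = self.momentum * self.scores[c] + (1 - self.momentum) * mean_score

    def equilibrium_margin_loss(logits, labels, class_scores, scale=1.0):
        """Cross-entropy with a score-guided margin: classes whose tracked mean
        score exceeds that of the ground-truth class get their logits raised,
        which strengthens the gradient pushing the boundary away from weak classes."""
        class_scores = class_scores.to(logits.device)
        gt_scores = class_scores[labels]                                     # (N,)
        margins = scale * (class_scores.unsqueeze(0) - gt_scores.unsqueeze(1)).clamp(min=0)
        return F.cross_entropy(logits + margins, labels)

    class FeatureMemory:
        """Per-class queue of instance features; weak classes can be re-sampled
        from it so their decision boundaries are adjusted more frequently."""
        def __init__(self, num_classes, capacity=100):
            self.banks = [deque(maxlen=capacity) for _ in range(num_classes)]

        def push(self, feats, labels):
            for f, c in zip(feats.detach().cpu(), labels.cpu()):
                self.banks[int(c)].append(f)

        def sample(self, class_ids, num_per_class=4):
            feats, labels = [], []
            for c in class_ids:
                bank = self.banks[c]
                if not bank:
                    continue
                idx = torch.randint(len(bank), (min(num_per_class, len(bank)),))
                feats += [bank[i] for i in idx.tolist()]
                labels += [c] * len(idx)
            if not feats:
                return None, None
            return torch.stack(feats), torch.tensor(labels)

In a detector's RoI head, one would update the tracker and the memory bank with each batch of instance features and labels, and feed re-sampled weak-class features through the classifier alongside the regular batch; the exact wiring into Mask R-CNN is left out of this sketch.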


Datasets

LVIS v1.0

Results from the Paper


Task                   Dataset         Model               Metric    Value   Global Rank
Object Detection       LVIS v1.0 val   R101-MaskRCNN-LOCE  box AP    29      #11
Instance Segmentation  LVIS v1.0 val   R101-MaskRCNN-LOCE  mask AP   28      #14
Instance Segmentation  LVIS v1.0 val   R50-MaskRCNN-LOCE   mask AP   26.6    #18
Object Detection       LVIS v1.0 val   R50-MaskRCNN-LOCE   box AP    27.4    #12
