OTFace: Hard Samples Guided Optimal Transport Loss for Deep Face Representation

Face representation in the wild is extremely hard due to large-scale face variations. To this end, deep convolutional neural networks (CNNs) have been developed to learn discriminative features by carefully designing margin-based losses, which perform well on easy samples but fail on hard samples. Building on this, some methods adjust the weights of hard samples during training to improve feature discrimination. However, these methods overlook the feature distribution property, which may lead to better results since misclassified hard samples could be corrected by using a distribution metric. This paper proposes a hard samples guided optimal transport (OT) loss for deep face representation, OTFace for short. OTFace aims to enhance the performance on hard samples by introducing a feature distribution discrepancy while maintaining the performance on easy samples. Specifically, we adopt a triplet scheme to indicate hard sample groups in one mini-batch during training. OT is then used to characterize the distribution differences of features from the high-level convolutional layer. Finally, we integrate the margin-based softmax (e.g., ArcFace or AM-Softmax) and OT to guide deep CNN learning. Extensive experiments are conducted on several benchmark databases. The quantitative results demonstrate the advantages of the proposed OTFace over state-of-the-art methods.
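The abstract describes combining a margin-based softmax objective with an OT discrepancy between feature distributions of hard and easy sample groups. Below is a minimal PyTorch sketch of that general idea, assuming a Sinkhorn-style entropic OT term and an ArcFace-style margin; the function names, hyperparameters, and hard/easy index selection are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def sinkhorn_distance(x, y, eps=0.1, n_iters=50):
    """Entropy-regularized OT distance between two feature sets with uniform weights."""
    cost = torch.cdist(x, y, p=2) ** 2                    # pairwise squared distances
    a = torch.full((x.size(0),), 1.0 / x.size(0), device=x.device)
    b = torch.full((y.size(0),), 1.0 / y.size(0), device=y.device)
    K = torch.exp(-cost / eps)                            # Gibbs kernel
    u = torch.ones_like(a)
    for _ in range(n_iters):                              # Sinkhorn iterations
        v = b / (K.t() @ u)
        u = a / (K @ v)
    transport = torch.diag(u) @ K @ torch.diag(v)         # transport plan
    return (transport * cost).sum()


class ArcMarginLoss(nn.Module):
    """ArcFace-style additive angular margin softmax (illustrative values for s, m)."""
    def __init__(self, feat_dim, num_classes, s=64.0, m=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.s, self.m = s, m

    def forward(self, feats, labels):
        cos = F.linear(F.normalize(feats), F.normalize(self.weight)).clamp(-1 + 1e-7, 1 - 1e-7)
        target = torch.cos(torch.acos(cos) + self.m)      # add angular margin on target class
        one_hot = F.one_hot(labels, self.weight.size(0)).float()
        logits = self.s * (one_hot * target + (1 - one_hot) * cos)
        return F.cross_entropy(logits, labels)


def otface_like_loss(feats, labels, margin_loss, hard_idx, easy_idx, lam=0.1):
    """Margin softmax on the whole mini-batch plus an OT term between the
    (assumed) hard-sample features and same-class easy-sample features."""
    loss = margin_loss(feats, labels)
    if len(hard_idx) > 0 and len(easy_idx) > 0:
        loss = loss + lam * sinkhorn_distance(feats[hard_idx], feats[easy_idx])
    return loss
```

In this sketch the OT term only affects the mini-batch groups flagged as hard (e.g., by triplet-style mining), so easy samples are still driven purely by the margin-based softmax, matching the stated goal of improving hard samples without degrading easy ones.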
