Multi-Class Data Description for Out-of-distribution Detection

The capability of reliably detecting out-of-distribution samples is one of the key factors in deploying a good classifier, as the test distribution does not always match the training distribution in most real-world applications. In this work, we present a deep multi-class data description, termed Deep-MCDD, which is effective both for detecting out-of-distribution (OOD) samples and for classifying in-distribution (ID) samples. Unlike the softmax classifier, which focuses only on the linear decision boundaries that partition its latent space into multiple regions, our Deep-MCDD aims to find a spherical decision boundary for each class that determines whether a test sample belongs to that class or not. By integrating the concept of Gaussian discriminant analysis into deep neural networks, we propose a deep learning objective that learns class-conditional distributions explicitly modeled as separable Gaussian distributions. Thereby, we can define the confidence score as the distance of a test sample from each class-conditional distribution and use it to identify OOD samples. Our empirical evaluation on multi-class tabular and image datasets demonstrates that Deep-MCDD achieves the best performance in distinguishing OOD samples while attaining classification accuracy as high as that of the other competitors.
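
To make the distance-based confidence score concrete, below is a minimal PyTorch sketch of the idea described in the abstract: each class is modeled as a spherical Gaussian in the latent space of an encoder, a test sample is classified by its nearest class-conditional distribution, and its OOD score is the (negated) distance to that nearest class. All names (`SphericalClassHeads`, `class_distances`) and the exact form of the score are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class SphericalClassHeads(nn.Module):
    """Per-class centers and radii on top of a feature encoder (illustrative sketch)."""

    def __init__(self, encoder: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.encoder = encoder
        # One spherical Gaussian per class: a center mu_k and a scalar log-radius.
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.log_sigma = nn.Parameter(torch.zeros(num_classes))

    def class_distances(self, x: torch.Tensor) -> torch.Tensor:
        """Return a (batch, num_classes) tensor of distances to each class distribution."""
        z = self.encoder(x)                                  # (batch, feat_dim)
        diff = z.unsqueeze(1) - self.centers.unsqueeze(0)    # (batch, K, feat_dim)
        sq_dist = diff.pow(2).sum(dim=-1)                    # squared Euclidean distance
        # Scale by per-class variance and add the Gaussian log-normalizer term,
        # so the distance behaves like a negative class-conditional log-density.
        return sq_dist / (2.0 * self.log_sigma.exp().pow(2)) + z.size(1) * self.log_sigma

    def forward(self, x: torch.Tensor):
        d = self.class_distances(x)
        pred = d.argmin(dim=1)              # ID prediction: nearest class
        confidence = -d.min(dim=1).values   # OOD score: closer to some class = more confident
        return pred, confidence
```

In use, a sample would be flagged as OOD when its confidence falls below a threshold chosen on held-out in-distribution data; the choice of a single scalar radius per class reflects the spherical (separable) Gaussian assumption mentioned in the abstract.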
