Outlier Exposure with Confidence Control for Out-of-Distribution Detection

Aristotelis-Angelos Papadopoulos, Mohammad Reza Rajati, Nazim Shaikh, Jiamian Wang

Deep neural networks have achieved great success in classification tasks in recent years. However, one major obstacle on the path towards artificial intelligence is the inability of neural networks to accurately detect samples from novel class distributions; as a result, most existing classification algorithms assume that all classes are known prior to the training stage. In this work, we propose a methodology for training a neural network that allows it to efficiently detect out-of-distribution (OOD) examples without significantly compromising its classification accuracy on test examples from known classes. Building on the Outlier Exposure (OE) technique, we propose a novel loss function, Outlier Exposure with Confidence Control (OECC), that achieves state-of-the-art results in OOD detection with OE on both image and text classification tasks, without requiring access to OOD samples. Additionally, we experimentally show that combining OECC with the Mahalanobis distance-based classifier achieves state-of-the-art results in the OOD detection task.
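
The abstract does not spell out the training objective, but OE-style methods generally add a regularizer on an auxiliary outlier dataset to the usual cross-entropy loss. The sketch below illustrates that general pattern; the L1 distance-to-uniform term, the weight lam, and the function name oe_style_loss are illustrative assumptions, not the exact OECC formulation.

import torch
import torch.nn.functional as F

def oe_style_loss(logits_in, labels_in, logits_out, lam=0.5):
    # Hypothetical OE-style objective: cross-entropy on in-distribution
    # samples plus a term that pushes the softmax output on auxiliary
    # outlier samples toward the uniform distribution.
    ce = F.cross_entropy(logits_in, labels_in)

    num_classes = logits_out.size(1)
    uniform = torch.full_like(logits_out, 1.0 / num_classes)
    # L1 distance of the outlier softmax output from uniform,
    # weighted by lam (both choices are assumptions for illustration).
    reg = lam * (F.softmax(logits_out, dim=1) - uniform).abs().sum(dim=1).mean()

    return ce + reg

During training, logits_in would come from a batch of labeled in-distribution data and logits_out from a batch drawn from an auxiliary outlier-exposure dataset; the combined loss is then backpropagated as usual.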
