Outlier Exposure with Confidence Control for Out-of-Distribution Detection

Overview

This repository contains the essential code for the paper Outlier Exposure with Confidence Control for Out-of-Distribution Detection, accepted as a journal article in Neurocomputing, 2021.

1. What is Outlier Exposure with Confidence Control (OECC)?

Outlier Exposure with Confidence Control (OECC) is a technique that helps a Deep Neural Network (DNN) learn to distinguish between in-distribution and out-of-distribution (OOD) data without requiring access to OOD samples from the test-time distribution. The technique has also been shown to generalize to new distributions. To learn this distinction, OECC trains the DNN to be highly uncertain on OOD samples by pushing the output of its softmax layer towards a uniform distribution. At the same time, it keeps the average confidence of the network's predictions on in-distribution samples close to its training accuracy, i.e., it controls its confidence.
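For intuition only, the following PyTorch sketch shows how an OECC-style objective with these two regularization terms could look. The function name `oecc_loss`, the hyperparameter values `lambda_1` and `lambda_2`, and the precomputed `train_acc` are illustrative assumptions; please refer to the paper and the per-folder code for the exact formulation.

```python
import torch
import torch.nn.functional as F

def oecc_loss(logits_in, targets_in, logits_out, train_acc,
              lambda_1=0.07, lambda_2=0.05):
    """Sketch of an OECC-style objective (names and values are illustrative).

    logits_in:  model outputs on an in-distribution batch
    targets_in: ground-truth labels for the in-distribution batch
    logits_out: model outputs on an outlier-exposure (OOD) batch
    train_acc:  training accuracy of the baseline classifier, in [0, 1]
    """
    num_classes = logits_in.size(1)

    # Standard cross-entropy on in-distribution data.
    ce = F.cross_entropy(logits_in, targets_in)

    # Confidence control: keep the average maximum softmax probability
    # on in-distribution data close to the training accuracy.
    probs_in = F.softmax(logits_in, dim=1)
    conf_term = (train_acc - probs_in.max(dim=1).values.mean()) ** 2

    # Push softmax outputs on OOD samples towards the uniform distribution.
    probs_out = F.softmax(logits_out, dim=1)
    uniform_term = (probs_out - 1.0 / num_classes).abs().sum(dim=1).mean()

    return ce + lambda_1 * conf_term + lambda_2 * uniform_term
```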

The overall OECC loss function outperforms the previous state-of-the-art (SOTA) Outlier Exposure (OE) results in OOD detection on both image and text classification tasks. Additionally, we show experimentally in the paper that combining OECC with SOTA post-training methods for OOD detection, such as the Mahalanobis Detector or Gramian Matrices, achieves SOTA results in the OOD detection task.

2. Visualize the idea behind OECC

Figure. Histogram of softmax probabilities with CIFAR-10 as in-distribution data (D_in) and Places365 as out-of-distribution (OOD) data (D_out). Note that D_in and D_out are disjoint. Left: standard maximum softmax probability detector. Right: maximum softmax probability detector using OECC.
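The histograms in both panels are built from maximum softmax probability (MSP) scores. A minimal sketch of how such scores could be computed is shown below; the helper name `msp_scores` and the assumption of a standard PyTorch classifier and data loaders are illustrative, not code from this repository.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def msp_scores(model, loader, device="cpu"):
    """Maximum softmax probability score for each example in a data loader.

    Higher scores are treated as more in-distribution; plotting the score
    histograms on D_in and D_out gives figures like the one above.
    """
    model.eval()
    scores = []
    for x, _ in loader:
        probs = F.softmax(model(x.to(device)), dim=1)
        scores.append(probs.max(dim=1).values.cpu())
    return torch.cat(scores)
```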

3. Download Datasets

Some of the less common datasets can be downloaded from the following links: 80 Million Tiny Images, Icons-50, Textures, Chars74K, and Places365. If the 80 Million Tiny Images link above does not work, please also try this alternative 80 Million Tiny Images link.

4. How to Run

Each folder has its own separate README file with full details describing how to run the provided code.

5. Citation

If you find this useful in your research, please consider citing:

@article{PAPADOPOULOS2021138,
    title = {Outlier exposure with confidence control for out-of-distribution detection},
    journal = {Neurocomputing},
    volume = {441},
    pages = {138-150},
    year = {2021},
    issn = {0925-2312},
    doi = {10.1016/j.neucom.2021.02.007},
    url = {https://www.sciencedirect.com/science/article/pii/S0925231221002393},
    author = {Aristotelis-Angelos Papadopoulos and Mohammad Reza Rajati and Nazim Shaikh and Jiamian Wang},
    keywords = {Out-of-distribution detection, Regularization, Anomaly detection, Deep neural networks, Outlier exposure, Calibration}
}

6. Code References

Part of the code is based on the publicly available code of Outlier Exposure and Mahalanobis.
