Official PyTorch implementation of "Unbiased Classification Through Bias-Contrastive and Bias-Balanced Learning" (NeurIPS 2021)

Overview

This repository is the official PyTorch implementation of "Unbiased Classification Through Bias-Contrastive and Bias-Balanced Learning" (NeurIPS 2021).

Setup

The provided environment requires CUDA 11. However, you can still use your own environment by installing the requirements, including PyTorch and torchvision.

  1. Install the conda environment and activate it:
conda env create -f environment.yml
conda activate biascon
  2. Prepare the datasets.
  • Biased MNIST
    By default, we set download=True for convenience.
    Thus, you only need to create the empty dataset directory with mkdir -p data/biased_mnist and run the code.

  • CelebA
    Download the CelebA dataset under data/celeba.

  • UTKFace
    Download the UTKFace dataset under data/utk_face.

  • ImageNet & ImageNet-A
    We use the ILSVRC 2015 ImageNet dataset.
    Download ImageNet under ./data/imagenet and ImageNet-A under ./data/imagenet-a.

Biased MNIST (w/ bias labels)

We use correlation ratios {0.999, 0.997, 0.995, 0.99, 0.95, 0.9}.

Bias-contrastive loss (BiasCon)

python train_biased_mnist_bc.py --corr 0.999 --seed 1
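
For intuition, here is a minimal sketch of a bias-contrastive loss in PyTorch. It assumes a batch of L2-normalized features together with target labels and bias labels, and treats samples that share the target label but differ in the bias label as positives; the function name and details are illustrative and are not taken from the repository's code.

import torch

def bias_contrastive_loss(features, targets, bias_labels, temperature=0.07):
    """Illustrative bias-contrastive loss (a sketch, not the repository's exact code).

    features:    (N, D) L2-normalized embeddings
    targets:     (N,)   target class labels
    bias_labels: (N,)   bias attribute labels
    Positives are pairs with the same target label but a different bias label.
    """
    n = features.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=features.device)

    sim = features @ features.t() / temperature
    logits = sim.masked_fill(eye, -1e9)            # exclude self-similarity
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    same_target = targets.unsqueeze(0) == targets.unsqueeze(1)
    diff_bias = bias_labels.unsqueeze(0) != bias_labels.unsqueeze(1)
    pos_mask = same_target & diff_bias & ~eye      # bias-conflicting positives

    pos_count = pos_mask.sum(dim=1)
    valid = pos_count > 0                          # anchors with at least one positive
    loss = -(log_prob * pos_mask).sum(dim=1)
    return (loss[valid] / pos_count[valid]).mean()

In practice, a contrastive term of this kind would be used together with a standard cross-entropy loss on the classifier output.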

Bias-balancing loss (BiasBal)

python train_biased_mnist_bb.py --corr 0.999 --seed 1
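
Below is a minimal sketch of a bias-balancing loss in the spirit of logit adjustment. It assumes a table of training-set (target, bias) counts; the function name, arguments, and exact adjustment are illustrative and may differ from the repository's implementation.

import torch
import torch.nn.functional as F

def bias_balanced_loss(logits, targets, bias_labels, joint_counts):
    """Illustrative bias-balancing loss (a sketch, not the repository's exact code).

    logits:       (N, C) classifier outputs
    targets:      (N,)   target class labels
    bias_labels:  (N,)   bias attribute labels
    joint_counts: (C, B) training-set counts of (target, bias) pairs
    """
    # P_train(y | b) estimated from the joint counts
    prior = joint_counts / joint_counts.sum(dim=0, keepdim=True)   # (C, B)
    log_prior = torch.log(prior + 1e-12)

    # add each sample's bias-conditional class prior to its logits
    adjusted = logits + log_prior[:, bias_labels].t()              # (N, C)
    return F.cross_entropy(adjusted, targets)

The adjustment down-weights classes that co-occur frequently with a sample's bias value, so the classifier is not rewarded for simply exploiting the bias-target correlation.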

Joint use of BiasCon and BiasBal losses (BC+BB)

python train_biased_mnist_bc.py --bb 1 --corr 0.999 --seed 1
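
Conceptually, --bb 1 optimizes both objectives together. The snippet below reuses the two hypothetical functions sketched above; lambda_bc is an illustrative weighting factor, not a flag or constant from the repository.

# illustrative combination of the two sketched losses; the actual weighting may differ
loss = bias_balanced_loss(logits, targets, bias_labels, joint_counts) \
       + lambda_bc * bias_contrastive_loss(features, targets, bias_labels)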

CelebA

We evaluate on the CelebA dataset with the target attributes HeavyMakeup (--task makeup) and Blonde (--task blonde).

Bias-contrastive loss (BiasCon)

python train_celeba_bc.py --task makeup --seed 1

Bias-balancing loss (BiasBal)

python train_celeba_bb.py --task makeup --seed 1

Joint use of BiasCon and BiasBal losses (BC+BB)

python train_celeba_bc.py --bb 1 --task makeup --seed 1

UTKFace

We evaluate on the UTKFace dataset, which is biased toward the Race (--task race) and Age (--task age) attributes.

Bias-contrastive loss (BiasCon)

python train_utk_face_bc.py --task race --seed 1

Bias-balancing loss (BiasBal)

python train_utk_face_bb.py --task race --seed 1

Joint use of BiasCon and BiasBal losses (BC+BB)

python train_utk_face_bc.py --bb 1 --task race --seed 1

Biased MNIST (w/o bias labels)

We use correlation ratios {0.999, 0.997, 0.995, 0.99, 0.95, 0.9}.

Soft Bias-contrastive loss (SoftCon)

  1. Train a bias-capturing model and get bias features.
python get_biased_mnist_bias_features.py --corr 0.999 --seed 1
  2. Train a model with the bias features (a conceptual sketch of the soft loss follows below).
python train_biased_mnist_softcon.py --corr 0.999 --seed 1
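
For intuition, here is a minimal sketch of how a soft bias-contrastive loss could use the precomputed bias features: instead of hard bias labels, each same-class pair is weighted by how dissimilar the two samples look to the bias-capturing model. The weighting scheme and function name are illustrative, not the repository's exact code.

import torch

def soft_bias_contrastive_loss(features, targets, bias_features, temperature=0.07):
    """Illustrative soft bias-contrastive loss (a sketch, not the repository's exact code).

    features:      (N, D) L2-normalized embeddings of the model being trained
    targets:       (N,)   target class labels
    bias_features: (N, K) L2-normalized features from the bias-capturing model
    """
    n = features.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=features.device)

    sim = features @ features.t() / temperature
    logits = sim.masked_fill(eye, -1e9)            # exclude self-similarity
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    same_target = (targets.unsqueeze(0) == targets.unsqueeze(1)) & ~eye
    bias_sim = bias_features @ bias_features.t()   # cosine similarity in bias space
    # soft positive weights: same class, weighted by bias-feature dissimilarity
    weights = (1.0 - bias_sim).clamp(min=0.0) * same_target

    weight_sum = weights.sum(dim=1)
    valid = weight_sum > 0
    loss = -(log_prob * weights).sum(dim=1)
    return (loss[valid] / weight_sum[valid]).mean()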

ImageNet

We use texture cluster information from ReBias (Bahng et al., 2020).

Soft Bias-contrastive loss (SoftCon)

  1. Train a bias-capturing model and get bias features.
python get_imagenet_bias_features.py --seed 1
  2. Train a model with the bias features.
python train_imagenet_softcon.py --seed 1