The implementation of "Bootstrapping Semantic Segmentation with Regional Contrast".

Overview

ReCo - Regional Contrast

This repository contains the source code of ReCo and baselines from the paper, Bootstrapping Semantic Segmentation with Regional Contrast, introduced by Shikun Liu, Shuaifeng Zhi, Edward Johns, and Andrew Davison.

Check out our project page for more qualitative results.

Datasets

ReCo is evaluated on three datasets: CityScapes, PASCAL VOC and SUN RGB-D in the full label mode; CityScapes and PASCAL VOC are additionally evaluated in the partial label mode.

  • For CityScapes, please download the original dataset from the official CityScapes site: leftImg8bit_trainvaltest.zip and gtFine_trainvaltest.zip. Extract them into the corresponding dataset/cityscapes folder, creating it if necessary.
  • For Pascal VOC, please download the original training images from the official PASCAL site: VOCtrainval_11-May-2012.tar, and the augmented labels here: SegmentationClassAug.zip. Extract the folders JPEGImages and SegmentationClassAug into the corresponding dataset/pascal folder.
  • For SUN RGB-D, please download the training set here: SUNRGBD-train_images.tgz, the test set here: SUNRGBD-test_images.tgz, and the labels here: sunrgbd_train_test_labels.tar.gz. Extract and place them into the corresponding dataset/sun folder.

After all datasets have been downloaded and placed correctly, run python dataset/{DATASET}_preprocess.py to pre-process each dataset for the experiments. The preprocessing scripts also generate partial labels for the CityScapes and Pascal datasets with three random seeds. Feel free to modify the partial label size and random seeds to suit your own research setting.
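
For example, assuming {DATASET} matches the dataset folder names above:

python dataset/cityscapes_preprocess.py
python dataset/pascal_preprocess.py
python dataset/sun_preprocess.py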

For the lazy ones: just download the off-the-shelf pre-processed datasets here: CityScapes, Pascal VOC and SUN RGB-D.

Training Supervised and Semi-supervised Models

In this paper, we introduce two novel training modes for semi-supervised learning.

  1. Full Labels Partial Dataset: A sparse subset of training images has full ground-truth labels, with the remaining data unlabelled.
  2. Partial Labels Full Dataset: All images have some labels, but covering only a sparse subset of pixels.

Run one of the following four scripts to train the corresponding mode with supervised or semi-supervised methods:

python train_sup.py             # Supervised learning with full labels.
python train_semisup.py         # Semi-supervised learning with full labels.
python train_sup_partial.py     # Supervised learning with partial labels.
python train_semisup_partial.py # Semi-supervised learning with partial labels.

Important Flags

All supervised and semi-supervised methods can be trained with different flags (hyper-parameters) when running each training script. We briefly introduce some important flags for the experiments below.

  • num_labels: number of labelled images in the training set; choose 0 to train with all labelled images. Only available in the full label mode.
  • partial: percentage of labelled pixels for each class in the training set; choose p0, p1, p5, p25 to train with 1, 1%, 5%, 25% labelled pixel(s) respectively. Only available in the partial label mode.
  • num_negatives: number of negative keys sampled for each class in each mini-batch. Only applied when training with the ReCo loss.
  • num_queries: number of queries sampled for each class in each mini-batch. Only applied when training with the ReCo loss.
  • output_dim: dimensionality of the pixel-level representation. Only applied when training with the ReCo loss.
  • temp: temperature used in contrastive learning. Only applied when training with the ReCo loss.
  • apply_aug: data augmentation for semi-supervised methods; choose cutout, cutmix or classmix (our implementations of CutOut, CutMix and ClassMix). Only available in the semi-supervised methods.
  • weak_threshold: weak threshold delta_w in active sampling. Only applied when training with the ReCo loss.
  • strong_threshold: strong threshold delta_s in active sampling. Only applied when training with the ReCo loss.
  • apply_reco: toggle our proposed ReCo loss on or off.
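
To make the ReCo-specific flags concrete, below is a minimal, self-contained sketch of a pixel-level InfoNCE-style contrastive loss, showing how num_queries, num_negatives, output_dim and temp interact. This is an illustration under our own assumptions (tensor shapes, illustrative values, a single positive key per query), not the repository's exact implementation; in the actual method, weak_threshold and strong_threshold additionally gate which pixels are sampled as queries and keys.

import torch
import torch.nn.functional as F

def pixel_contrastive_loss(queries, positive_keys, negative_keys, temp=0.5):
    # queries:       [num_queries, output_dim] sampled pixel embeddings
    # positive_keys: [num_queries, output_dim] one positive key per query
    # negative_keys: [num_queries, num_negatives, output_dim]
    queries = F.normalize(queries, dim=1)
    positive_keys = F.normalize(positive_keys, dim=1)
    negative_keys = F.normalize(negative_keys, dim=2)
    # Positive logit: similarity between each query and its positive key.
    pos_logit = (queries * positive_keys).sum(dim=1, keepdim=True)   # [Q, 1]
    # Negative logits: similarity between each query and its negative keys.
    neg_logits = torch.einsum('qd,qnd->qn', queries, negative_keys)  # [Q, N]
    # Temperature-scaled softmax cross-entropy; the positive sits at index 0.
    logits = torch.cat([pos_logit, neg_logits], dim=1) / temp        # [Q, 1 + N]
    labels = torch.zeros(logits.size(0), dtype=torch.long)
    return F.cross_entropy(logits, labels)

# Hypothetical per-mini-batch sampling budget (illustrative values only):
Q, N, D = 256, 256, 256  # num_queries, num_negatives, output_dim
loss = pixel_contrastive_loss(torch.randn(Q, D), torch.randn(Q, D),
                              torch.randn(Q, N, D))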

Training ReCo + ClassMix with the fewest full labels in each dataset (the rarest classes in each dataset appear in only 5 training images):

python train_semisup.py --dataset pascal --num_labels 60 --apply_aug classmix --apply_reco
python train_semisup.py --dataset cityscapes --num_labels 20 --apply_aug classmix --apply_reco
python train_semisup.py --dataset sun --num_labels 50 --apply_aug classmix --apply_reco

Training ReCo + ClassMix with the fewest partial labels in each dataset (each class in each training image has only 1 labelled pixel):

python train_semisup_partial.py --dataset pascal --partial p0 --apply_aug classmix --apply_reco
python train_semisup_partial.py --dataset cityscapes --partial p0 --apply_aug classmix --apply_reco
python train_semisup_partial.py --dataset sun --partial p0 --apply_aug classmix --apply_reco

Training ReCo + Supervised with all labelled data:

python train_sup.py --dataset {DATASET} --num_labels 0 --apply_reco

Training with ReCo is expected to require 12-16GB of GPU memory in a single-GPU setting. All other baselines can be trained with under 12GB on a single GPU.

Visualisation on Pre-trained Models

We additionally provide pre-trained baselines and our method for 20-label CityScapes and 60-label Pascal VOC as examples for visualisation. The precise mIoU performance of each model is listed in the following table. The pre-trained models will produce the exact same qualitative results presented in the original paper.

Dataset | Supervised | ClassMix | ReCo + ClassMix
CityScapes (20 Labels) | 38.10 [link] | 45.13 [link] | 50.14 [link]
Pascal VOC (60 Labels) | 36.06 [link] | 53.71 [link] | 57.12 [link]
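
As a reminder of how the mIoU numbers above are typically computed (a generic sketch, not the repository's evaluation code): per-class IoU is derived from a confusion matrix and averaged over classes.

import numpy as np

def mean_iou(pred, target, num_classes):
    # pred, target: integer label maps of identical shape.
    conf = np.zeros((num_classes, num_classes), dtype=np.int64)
    np.add.at(conf, (target.ravel(), pred.ravel()), 1)
    intersection = np.diag(conf)
    union = conf.sum(axis=0) + conf.sum(axis=1) - intersection
    iou = intersection / np.maximum(union, 1)  # avoid division by zero for absent classes
    return iou.mean()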

Download the pre-trained models with the links above, then create the folder model_weights in this repository and place them inside it. Run python visual.py to visualise the results.

Other Notices

  1. We observe that performance in the full label semi-supervised setting on the CityScapes dataset is not stable across different machines: all methods may drop 2-5% in performance, though the ranking remains the same. Different GPUs in the same machine do not affect performance. Performance for the other datasets in the full label mode, and for all datasets in the partial label mode, is consistent.
  2. Please use --seed 0, 1, 2 to accurately reproduce/compare our results with the exact same labelled and unlabelled splits we used in our experiments, as in the example below.
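
For example, to train ReCo + ClassMix on Pascal VOC with the second labelled/unlabelled split (the flag combination mirrors the examples above and is illustrative only):

python train_semisup.py --dataset pascal --num_labels 60 --apply_aug classmix --apply_reco --seed 1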

Citation

If you found this code/work useful in your own research, please consider citing the following:

@article{liu2021reco,
    title={Bootstrapping Semantic Segmentation with Regional Contrast},
    author={Liu, Shikun and Zhi, Shuaifeng and Johns, Edward and Davison, Andrew J},
    journal={arXiv preprint arXiv:2104.04465},
    year={2021}
}

Contact

If you have any questions, please contact [email protected].
