Generative vs Discriminative: Rethinking The Meta-Continual Learning (NeurIPS 2021)

Overview

In this repository we provide a PyTorch implementation of GeMCL, a generative approach to meta-continual learning. The directory outline is as follows:

root
 ├── code                 # The folder containing all PyTorch implementations
       ├── datasets           # The path containing Dataset classes and train/test parameters for each dataset
             ├── omniglot
                  ├── TrainParams.py  # omniglot training parameters configuration
                  ├── TestParams.py   # omniglot testing parameters configuration

            ├── mini-imagenet
                  ├── TrainParams.py  # mini-imagenet training parameters configuration
                  ├── TestParams.py   # mini-imagenet testing parameters configuration
            ├── cifar
                  ├── TrainParams.py  # cifar 100 training parameters configuration
                  ├── TestParams.py   # cifar 100 testing parameters configuration

       ├── model              # The path containing proposed models
       ├── train.py           # The main script for training
       ├── test.py            # The main script for testing
       ├── pretrain.py        # The main script for pre-training

 ├── datasets             # The location in which datasets are placed
       ├── omniglot
       ├── miniimagenet
       ├── cifar

 ├── experiments          # The location in which completed experiments (logs and checkpoints) are stored
       ├── omniglot
       ├── miniimagenet
       ├── cifar

In the following sections we first provide details on how to set up the datasets; then the instructions for installing package dependencies, training, and testing are provided.

Configuring the Dataset

In this paper we use the Omniglot, CIFAR-100 and Mini-ImageNet datasets. Omniglot and CIFAR-100 are lightweight datasets and are automatically downloaded into datasets/omniglot/ or datasets/cifar/ whenever needed. However, the Mini-ImageNet dataset needs to be downloaded manually and placed in datasets/miniimagenet/. The following instructions show how to set up this dataset properly:

  • First download the images from this link (provided by the owners) and the train.csv, val.csv, test.csv splits from this link.

  • Extract and place the downloaded files directly under datasets/miniimagenet/. (We expect train.csv, val.csv, test.csv and the images folder to be under this path.)

Reading directly from disk every time this dataset is needed is extremely slow. To avoid this, we use a preprocessing step in which the images are first shrunk to 100 pixels on the smaller dimension (without changing the aspect ratio) and then converted to NumPy npy format. The code for this preprocessing is provided in the code directory and should be executed as follows:

cd code
python genrate_img.py ../datasets/miniimagenet ../datasets/miniimagenet

Wait until the success messages for test, train and validation appear; the dataset is then ready.
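
For reference, the following is a minimal sketch of what this preprocessing amounts to, assuming OpenCV is used for reading and resizing (it is among the dependencies); the actual genrate_img.py additionally organizes images by the train/val/test csv splits, so treat this only as an illustration:

# Hedged sketch of the preprocessing described above, NOT the actual
# genrate_img.py: resize each image so its smaller side becomes 100 px
# (preserving the aspect ratio) and store the result as a single .npy file.
import os
import sys

import cv2
import numpy as np

def resize_smaller_side(img, target=100):
    # cv2.resize takes (width, height); scale so min(h, w) == target.
    h, w = img.shape[:2]
    scale = target / min(h, w)
    return cv2.resize(img, (round(w * scale), round(h * scale)))

def preprocess(src_dir, dst_file):
    arrays = []
    for name in sorted(os.listdir(src_dir)):
        img = cv2.imread(os.path.join(src_dir, name))
        if img is None:  # skip files OpenCV cannot decode
            continue
        arrays.append(resize_smaller_side(img))
    out = np.empty(len(arrays), dtype=object)  # images may differ in shape
    out[:] = arrays
    np.save(dst_file, out)

if __name__ == "__main__":
    # e.g. python preprocess_sketch.py ../datasets/miniimagenet/images out.npy
    preprocess(sys.argv[1], sys.argv[2])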

Installing Prerequisites

The following packages are required:

  • opencv-python==4.5.1
  • torch==1.7.1+cu101
  • tensorboard==2.4.1
  • pynvml==8.0.4
  • matplotlib==3.3.2
  • tqdm==4.55.1
  • scipy==1.6.0
  • torchvision==0.8.2+cu101
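
One way to install these is via pip; the +cu101 suffixes indicate CUDA 10.1 builds, which are available from the PyTorch wheel index:

pip install opencv-python==4.5.1 tensorboard==2.4.1 pynvml==8.0.4 matplotlib==3.3.2 tqdm==4.55.1 scipy==1.6.0
pip install torch==1.7.1+cu101 torchvision==0.8.2+cu101 -f https://download.pytorch.org/whl/torch_stable.html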

Training and Testing

The first step for training or testing is to configure the desired parameters. We have separated the training/testing parameters for each dataset and placed them under code/datasets/omniglot and code/datasets/miniimagenet. For example, to change the number of meta-training episodes on the omniglot dataset, one may do the following:

  • Open code/datasets/omniglot/TrainParams.py

  • Find the line self.meta_train_steps and change its value.

Setting the training model is done in the same way, by changing the self.modelClass value. We provide the following models in the code/model/ path:

File path               Model name in the paper
code/model/Bayesian.py  GeMCL predictive
code/model/MAP.py       GeMCL MAP
code/model/LR.py        MTLR
code/model/PGLR.py      PGLR
code/model/ProtoNet.py  Prototypical
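
As an illustration, the relevant lines inside a TrainParams.py would look roughly like the following; the attribute names come from this README, while the surrounding class layout, the import, and the concrete values are assumptions:

# Hypothetical excerpt of code/datasets/omniglot/TrainParams.py; only the
# attribute names (meta_train_steps, modelClass) are taken from this README.
from model.Bayesian import Bayesian  # assumed class name in code/model/Bayesian.py

class TrainParams:
    def __init__(self):
        self.experiment_name = "omniglot_bayesian"  # run/folder name for logs
        self.meta_train_steps = 30000               # number of meta-training episodes (illustrative value)
        self.modelClass = Bayesian                  # which model from the table above to train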

Training Instructions

To perform training, first configure the training parameters in code/datasets/omniglot/TrainParams.py or code/datasets/miniimagenet/TrainParams.py for the omniglot and mini-imagenet datasets respectively. In these files, the self.experiment_name variable, together with a date prefix, determines the folder name in which training logs are stored.

Now to start training, run the following command for omniglot (in all our code, the O and M flags select the omniglot and mini-imagenet datasets respectively):

cd code
python train.py O

and the following for mini-imagenet:

cd code
python train.py M
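
Since tensorboard is among the dependencies, training progress can presumably be monitored by pointing TensorBoard at the experiment folder (assuming the logs are written in TensorBoard format), e.g. from the code directory:

tensorboard --logdir ../experiments/omniglot/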

The training logs and checkpoints are stored in a folder under experiments/omniglot/ or experiments/miniimagenet/ with the name specified in self.experiment_name. We have also attached some models trained with the same settings reported in the paper. The paths and details of these models are as follows:

Model path                                              Details
experiments/miniimagenet/imagenet_bayesian_final        GeMCL predictive trained on mini-imagenet
experiments/miniimagenet/imagenet_map_final             GeMCL MAP trained on mini-imagenet
experiments/miniimagenet/imagenet_PGLR_final            PGLR trained on mini-imagenet
experiments/miniimagenet/imagenet_MTLR_final            MTLR trained on mini-imagenet
experiments/miniimagenet/imagenet_protonet_final        Prototypical trained on mini-imagenet
experiments/miniimagenet/imagenet_pretrain_final        pretrained model on mini-imagenet
experiments/miniimagenet/imagenet_Bayesian_OMLBackbone  GeMCL predictive trained on mini-imagenet with OML backbone
experiments/miniimagenet/imagenet_random                random (untrained) model compatible with mini-imagenet
experiments/omniglot/omniglot_Bayesian_final            GeMCL predictive trained on omniglot
experiments/omniglot/omniglot_MAP_final                 GeMCL MAP trained on omniglot
experiments/omniglot/omniglot_PGLR_final                PGLR trained on omniglot
experiments/omniglot/omniglot_MTLR_final                MTLR trained on omniglot
experiments/omniglot/omniglot_Protonet_final            Prototypical trained on omniglot
experiments/omniglot/omniglot_Pretrain_final            pretrained model on omniglot
experiments/omniglot/Omniglot_Bayesian_OMLBackbone      GeMCL predictive trained on omniglot with OML backbone
experiments/omniglot/omniglot_random                    random (untrained) model compatible with omniglot
experiments/omniglot/omniglot_bayesian_28               GeMCL predictive trained on omniglot with 28x28 input

Testing Instructions

To evaluate a previously trained model, use test.py and pass the path in which the model was stored. As an example, consider the following structure for omniglot experiments:

root
 ├── experiments
       ├── omniglot
            ├── omniglot_Bayesian_final

Now to test this model run:

cd code
python test.py O ../experiments/omniglot/omniglot_Bayesian_final/
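
Similarly, to evaluate one of the attached mini-imagenet models, pass the M flag together with the corresponding experiment path:

cd code
python test.py M ../experiments/miniimagenet/imagenet_bayesian_final/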

At the end of testing, the mean accuracy and standard deviation over test episodes will be printed.

Note: Both test.py and train.py use TrainParams.py to configure the model class. Thus, before executing test.py, make sure that TrainParams.py is configured correctly.

Pre-training Instructions

To perform pre-training, you can use:

cd code
python pretrain.py O

The pre-training configurations are also available in TrainParams.py.
