Generative vs Discriminative: Rethinking The Meta-Continual Learning (NeurIPS 2021)

In this repository we provide the PyTorch implementation of GeMCL, a generative approach to meta-continual learning. The directory outline is as follows:

root
 ├── code                 # The folder containing all pytorch implementations
       ├── datasets           # The path containing Dataset classes and train/test parameters for each dataset
            ├── omniglot
                  ├── TrainParams.py  # omniglot training parameters configuration
                  ├── TestParams.py   # omniglot testing parameters configuration

            ├── mini-imagenet
                  ├── TrainParams.py  # mini-imagenet training parameters configuration
                  ├── TestParams.py   # mini-imagenet testing parameters configuration
            ├── cifar
                  ├── TrainParams.py  # cifar 100 training parameters configuration
                  ├── TestParams.py   # cifar 100 testing parameters configuration

       ├── model              # The path containing proposed models
       ├── train.py           # The main script for training
       ├── test.py            # The main script for testing
       ├── pretrain.py        # The main script for pre-training

 ├── datasets             # The location in which datasets are placed
       ├── omniglot
       ├── miniimagenet
       ├── cifar

 ├── experiments          # The location in which accomplished experiments are stored
       ├── omniglot
       ├── miniimagenet
       ├── cifar

The following sections first describe how to set up the datasets, then give instructions for installing the package dependencies, training, and testing.

Configuring the Dataset

In this paper we use the Omniglot, CIFAR-100, and Mini-Imagenet datasets. Omniglot and CIFAR-100 are lightweight and are downloaded automatically into datasets/omniglot/ or datasets/cifar/ whenever needed. However, the Mini-Imagenet dataset must be downloaded manually and placed in datasets/miniimagenet/. The following instructions show how to set it up properly:

  • First download the images from this link (provided by the owners) and the train.csv, val.csv, test.csv splits from this link.

  • Extract and place the downloaded files directly under datasets/miniimagenet/. (We expect to find train.csv, val.csv, test.csv, and an images folder under this path.)

Reading directly from disk every time we need this dataset is extremely slow. To avoid this, we use a preprocessing step in which the images are first shrunk to 100 pixels in the smaller dimension (without changing the aspect ratio) and then converted to the numpy npy format. The code for this preprocessing is provided in the code directory and should be executed as follows:

cd code
python genrate_img.py ../datasets/miniimagenet ../datasets/miniimagenet
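
For reference, here is a minimal sketch of what this preprocessing step does, assuming one .npy file per image; the actual genrate_img.py may organize its outputs differently:

import os
import cv2
import numpy as np

def shrink_and_convert(src_dir, dst_dir):
    # Resize every image so its smaller side becomes 100 px (keeping the
    # aspect ratio) and store it as a .npy file for fast loading later.
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        img = cv2.imread(os.path.join(src_dir, name))  # BGR uint8 array
        if img is None:                                # skip non-image files
            continue
        h, w = img.shape[:2]
        scale = 100.0 / min(h, w)
        img = cv2.resize(img, (round(w * scale), round(h * scale)))
        stem = os.path.splitext(name)[0]
        np.save(os.path.join(dst_dir, stem + ".npy"), img)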

Wait until the success messages for test, train, and validation appear; then you are ready to go.

Installing Prerequisites

The following packages are required:

  • opencv-python==4.5.1
  • torch==1.7.1+cu101
  • tensorboard==2.4.1
  • pynvml==8.0.4
  • matplotlib==3.3.2
  • tqdm==4.55.1
  • scipy==1.6.0
  • torchvision==0.8.2+cu101
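
Assuming a CUDA 10.1 environment (matching the +cu101 wheels above), one way to install them is:

pip install opencv-python==4.5.1 tensorboard==2.4.1 pynvml==8.0.4 matplotlib==3.3.2 tqdm==4.55.1 scipy==1.6.0
pip install torch==1.7.1+cu101 torchvision==0.8.2+cu101 -f https://download.pytorch.org/whl/torch_stable.html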

Training and Testing

The first step for training or testing is to configure the desired parameters. We have separated the training/testing parameters for each dataset and placed them under code/datasets/omniglot, code/datasets/miniimagenet, and code/datasets/cifar. For example, to change the number of meta-training episodes on the omniglot dataset:

  • Open code/datasets/omniglot/TrainParams.py

  • Find the line self.meta_train_steps and change its value.

Setting the training model works the same way: change the self.modelClass value. We provide the following models under code/model/ (see the configuration sketch after the table):

File path                Model name in the paper
code/model/Bayesian.py   GeMCL predictive
code/model/MAP.py        GeMCL MAP
code/model/LR.py         MTLR
code/model/PGLR.py       PGLR
code/model/ProtoNet.py   Prototypical
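
As an illustration, a hypothetical excerpt of such a configuration file (the attribute names self.meta_train_steps and self.modelClass come from this README; the class layout, the value 30000, and the imported class name Bayesian are assumptions):

# Hypothetical excerpt from code/datasets/omniglot/TrainParams.py.
from model.Bayesian import Bayesian    # GeMCL predictive (see table above)

class TrainParams:
    def __init__(self):
        self.meta_train_steps = 30000   # number of meta-training episodes
        self.modelClass = Bayesian      # model class instantiated by train.py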

Training Instructions

To perform training, first configure the training parameters in code/datasets/omniglot/TrainParams.py or code/datasets/miniimagenet/TrainParams.py for the omniglot and mini-imagenet datasets respectively. In these files, the self.experiment_name variable, together with a date prefix, determines the folder name in which training logs are stored.
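
For example, the resulting folder name might be built roughly as follows (the exact date format is an assumption, not taken from the code):

from datetime import datetime

experiment_name = "omniglot_Bayesian_final"   # value of self.experiment_name
# Assumed naming scheme: a date prefix followed by the experiment name.
folder_name = datetime.now().strftime("%Y-%m-%d") + "_" + experiment_name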

Now, to start training, run the following command for omniglot (throughout the code, the M and O flags select the mini-imagenet and omniglot datasets respectively):

cd code
python train.py O

and the following for mini-imagenet:

cd code
python train.py M

The training logs and checkpoints are stored in a folder under experiments/omniglot/ or experiments/miniimagenet/ with the name specified by self.experiment_name. We have also included some trained models with the same settings reported in the paper. The paths and details of these models are as follows:

Model path                                              Details
experiments/miniimagenet/imagenet_bayesian_final        GeMCL predictive trained on mini-imagenet
experiments/miniimagenet/imagenet_map_final             GeMCL MAP trained on mini-imagenet
experiments/miniimagenet/imagenet_PGLR_final            PGLR trained on mini-imagenet
experiments/miniimagenet/imagenet_MTLR_final            MTLR trained on mini-imagenet
experiments/miniimagenet/imagenet_protonet_final        Prototypical trained on mini-imagenet
experiments/miniimagenet/imagenet_pretrain_final        Pretrained model on mini-imagenet
experiments/miniimagenet/imagenet_Bayesian_OMLBackbone  GeMCL predictive trained on mini-imagenet with the OML backbone
experiments/miniimagenet/imagenet_random                Randomly initialized model compatible with mini-imagenet (not trained)
experiments/omniglot/omniglot_Bayesian_final            GeMCL predictive trained on omniglot
experiments/omniglot/omniglot_MAP_final                 GeMCL MAP trained on omniglot
experiments/omniglot/omniglot_PGLR_final                PGLR trained on omniglot
experiments/omniglot/omniglot_MTLR_final                MTLR trained on omniglot
experiments/omniglot/omniglot_Protonet_final            Prototypical trained on omniglot
experiments/omniglot/omniglot_Pretrain_final            Pretrained model on omniglot
experiments/omniglot/Omniglot_Bayesian_OMLBackbone      GeMCL predictive trained on omniglot with the OML backbone
experiments/omniglot/omniglot_random                    Randomly initialized model compatible with omniglot (not trained)
experiments/omniglot/omniglot_bayesian_28               GeMCL predictive trained on omniglot with 28x28 input

Testing Instructions

To evaluate a previously trained model, use test.py and specify the path where the model is stored. As an example, consider the following structure for omniglot experiments:

root
 ├── experiments
       ├── omniglot
            ├── omniglot_Bayesian_final

Now to test this model run:

cd code
python test.py O ../experiments/omniglot/omniglot_Bayesian_final/

At the end of testing, the mean accuracy and standard deviation over test episodes will be printed.

Note: Both test.py and train.py use TrainParams.py to configure the model class. Thus, before executing test.py, make sure that TrainParams.py is configured correctly.

Pre-training Instructions

To perform pre-training, run:

cd code
python pretrain.py O

The pre-training configurations are also available in TrainParams.py.
