DominoSearch

This is the repository for the code and models of the NeurIPS 2021 paper "DominoSearch: Find layer-wise fine-grained N:M sparse schemes from dense neural networks".

Instructions and other materials will be released soon.

Search:

git clone https://github.com/NM-sparsity/DominoSearch.git
cd DominoSearch/DominoSearch/search/script_resnet_ImageNet

We provide several search scripts for different sparse-ratio targets; you can specify your own target and change the parameters accordingly. Note that you first need to set your ImageNet dataset path.

The search phase takes about 2-3 hours and stores the searched schemes in a .txt file, which is then needed as input for mixed-sparsity training.

Below is an example of the output format. Each key names a layer (encoded as input channels-output channels-(kernel size)) and maps to its searched [N, M] scheme, i.e. N non-zero weights are kept in every group of M consecutive weights.

{'SparseConv0_3-64-(7, 7)': [16, 16], 'SparseConv1_64-64-(1, 1)': [16, 16], 'SparseConv2_64-64-(3, 3)': [4, 16], 'SparseConv3_64-256-(1, 1)': [8, 16], 'SparseConv4_64-256-(1, 1)': [8, 16], 'SparseConv5_256-64-(1, 1)': [8, 16], 'SparseConv6_64-64-(3, 3)': [4, 16], 'SparseConv7_64-256-(1, 1)': [8, 16], 'SparseConv8_256-64-(1, 1)': [8, 16], 'SparseConv9_64-64-(3, 3)': [4, 16], 'SparseConv10_64-256-(1, 1)': [8, 16], 'SparseConv11_256-128-(1, 1)': [8, 16], 'SparseConv12_128-128-(3, 3)': [2, 16], 'SparseConv13_128-512-(1, 1)': [8, 16], 'SparseConv14_256-512-(1, 1)': [4, 16], 'SparseConv15_512-128-(1, 1)': [8, 16], 'SparseConv16_128-128-(3, 3)': [4, 16], 'SparseConv17_128-512-(1, 1)': [8, 16], 'SparseConv18_512-128-(1, 1)': [8, 16], 'SparseConv19_128-128-(3, 3)': [4, 16], 'SparseConv20_128-512-(1, 1)': [8, 16], 'SparseConv21_512-128-(1, 1)': [8, 16], 'SparseConv22_128-128-(3, 3)': [2, 16], 'SparseConv23_128-512-(1, 1)': [8, 16], 'SparseConv24_512-256-(1, 1)': [4, 16], 'SparseConv25_256-256-(3, 3)': [2, 16], 'SparseConv26_256-1024-(1, 1)': [4, 16], 'SparseConv27_512-1024-(1, 1)': [4, 16], 'SparseConv28_1024-256-(1, 1)': [4, 16], 'SparseConv29_256-256-(3, 3)': [2, 16], 'SparseConv30_256-1024-(1, 1)': [4, 16], 'SparseConv31_1024-256-(1, 1)': [4, 16], 'SparseConv32_256-256-(3, 3)': [2, 16], 'SparseConv33_256-1024-(1, 1)': [4, 16], 'SparseConv34_1024-256-(1, 1)': [4, 16], 'SparseConv35_256-256-(3, 3)': [2, 16], 'SparseConv36_256-1024-(1, 1)': [4, 16], 'SparseConv37_1024-256-(1, 1)': [4, 16], 'SparseConv38_256-256-(3, 3)': [2, 16], 'SparseConv39_256-1024-(1, 1)': [4, 16], 'SparseConv40_1024-256-(1, 1)': [4, 16], 'SparseConv41_256-256-(3, 3)': [2, 16], 'SparseConv42_256-1024-(1, 1)': [4, 16], 'SparseConv43_1024-512-(1, 1)': [4, 16], 'SparseConv44_512-512-(3, 3)': [2, 16], 'SparseConv45_512-2048-(1, 1)': [4, 16], 'SparseConv46_1024-2048-(1, 1)': [2, 16], 'SparseConv47_2048-512-(1, 1)': [4, 16], 'SparseConv48_512-512-(3, 3)': [2, 16], 'SparseConv49_512-2048-(1, 1)': [4, 16], 'SparseConv50_2048-512-(1, 1)': [4, 16], 'SparseConv51_512-512-(3, 3)': [2, 16], 'SparseConv52_512-2048-(1, 1)': [4, 16], 'Linear0_2048-1000': [4, 16]}
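Since the schemes file is plain text holding a Python dict literal like the one above, it can be loaded directly. Below is a minimal sketch, assuming the dict-literal format shown above (the file path is taken from the training command in the next section):

```python
import ast

# Load a searched schemes file produced by the search phase.
# It stores a Python dict literal mapping layer names to [N, M] pairs.
with open("./schemes/resnet50_M16_0.80.txt") as f:
    schemes = ast.literal_eval(f.read())

for layer, (n, m) in schemes.items():
    # An [N, M] scheme keeps N non-zero weights per group of M consecutive
    # weights, i.e. a layer-wise sparsity of 1 - N/M.
    print(f"{layer}: {n}:{m} (layer sparsity {1 - n / m:.2%})")
```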

Train:

After obtaining the layer-wise sparse schemes, we fine-tune the network with these schemes to recover accuracy. The training code is based on NM-sparsity, with some changes to support flexible N:M schemes.

Below is an example of training a layer-wise sparse ResNet-50 with 80% overall sparsity.

cd DominoSearch/DominoSearch/train/classification_sparsity_level/train_imagenet
python -m torch.distributed.launch --nproc_per_node=8 ../train_imagenet.py --config ./configs/config_resnet50.yaml --base_lr 0.01 --decay 0.0005 --epochs 120 --schemes_file ./schemes/resnet50_M16_0.80.txt --model_dir ./resnet50/resnet50_0.80_M16
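During training, an N:M scheme keeps the N largest-magnitude weights in every group of M consecutive weights. The following PyTorch sketch illustrates the masking idea in the spirit of NM-sparsity; it is not the repository's exact implementation, and the grouping dimension is an assumption for illustration:

```python
import torch

def nm_mask(weight: torch.Tensor, n: int, m: int) -> torch.Tensor:
    """Binary mask keeping the n largest-magnitude weights per group of m.

    The tensor is viewed as consecutive groups of m elements over its
    flattened dimensions (an assumption for illustration).
    """
    flat = weight.detach().abs().reshape(-1, m)           # one row per group
    idx = flat.topk(n, dim=1).indices                     # n largest per group
    mask = torch.zeros_like(flat).scatter_(1, idx, 1.0)   # 1 = keep, 0 = prune
    return mask.reshape(weight.shape)

# Example: the 4:16 scheme searched for 'SparseConv2_64-64-(3, 3)' above.
w = torch.randn(64, 64, 3, 3)
sparse_w = w * nm_mask(w, n=4, m=16)
print(f"sparsity: {(sparse_w == 0).float().mean():.2%}")  # 75%, i.e. 1 - 4/16
```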

Experiments

We provide the trained models from our experiments. Please check our paper for details and interpretations of the experiments.

ResNet50 experiments in Section 4.1

| Model Name | Top-1 Accuracy | Trained Model | Searched Schemes |
| --- | --- | --- | --- |
| resnet50 - 0.80 model size | 76.7 | google drive | google drive |
| resnet50 - 0.875 model size | 75.7 | google drive | google drive |
| resnet50 - 0.9375 model size | 73.5 | google drive | google drive |
| resnet50 - 8x FLOPs | 75.4 | google drive | google drive |
| resnet50 - 16x FLOPs | 73.4 | google drive | google drive |

Ablation experiments of ResNet50 in Section 5.3

| Model Name | Top-1 Accuracy | Trained Model | Train Log |
| --- | --- | --- | --- |
| Ablation E3 | 76.1 | google drive | google drive |
| Ablation E4 | 76.4 | google drive | google drive |
| Ablation E6 | 76.6 | google drive | google drive |
| Ablation E7 | 75.6 | google drive | google drive |

Citation

@inproceedings{sun2021dominosearch,
  title={DominoSearch: Find layer-wise fine-grained N:M sparse schemes from dense neural networks},
  author={Wei Sun and Aojun Zhou and Sander Stuijk and Rob G. J. Wijnhoven and Andrew Nelson and Hongsheng Li and Henk Corporaal},
  booktitle={Thirty-Fifth Conference on Neural Information Processing Systems},
  year={2021},
  url={https://openreview.net/forum?id=IGrC6koW_g}
}