Official PyTorch implementation of Learning Intra-Batch Connections for Deep Metric Learning, published at the International Conference on Machine Learning (ICML 2021).

Overview

About

This repository contains the official PyTorch implementation of Learning Intra-Batch Connections for Deep Metric Learning. The config files contain the same parameters as used in the paper.

We use torch 1.7.1 and torchvision 0.6.0. While training and inference should also work with newer versions of these libraries, be aware that networks trained and tested with other versions may diverge or reach lower results. We provide an environment.yml file to create a corresponding conda environment.

We also support mixed-precision training via Nvidia Apex and describe how to use it in the Usage section.

As in the paper, we support training on four datasets: CUB-200-2011, Cars196, Stanford Online Products and In-Shop.

The majority of experiments are done using ResNet50. We also support the entire ResNet and DenseNet families as well as BN-Inception.

Set up

  1. Clone and enter this repository:

     git clone https://github.com/dvl-tum/intra_batch.git
    
     cd intra_batch
    
  2. Create an Anaconda environment for this project: To set up a conda environment containing all used packages, please first install Anaconda and then run

    1.   conda env create -f environment.yml
      
    2.  conda activate intra_batch_dml
      
    3.  pip install torch-scatter==2.0.5 -f https://pytorch-geometric.com/whl/torch-1.5.0+cu102.html
      
    4. If you want to use Apex, please follow the installation instructions on https://github.com/NVIDIA/apex
  3. Download datasets: Make a data directory by typing

     mkdir data
    

    Then download the datasets using the following links and unzip them in the data directory:

    We also provide parsers for the Stanford Online Products and In-Shop datasets. You can find them in the dataset/ directory. The datasets are expected to be structured as dataset/images/class/, where dataset is either CUB_200_2011, CARS, Stanford_Online_Products or In_shop and class are the classes of a given dataset (see the layout-check sketch after this list). Example for CUB-200-2011:

         CUB_200_2011/images/001
         CUB_200_2011/images/002
         CUB_200_2011/images/003
         ...
         CUB_200_2011/images/200
    
  4. Download our models: Please download the pretrained weights by using

     wget https://vision.in.tum.de/webshare/u/seidensc/intra_batch_connections/best_weights.zip
    

    and unzip them.
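
If you want to double-check that the unzipped data matches the layout described in step 3, a quick check along the following lines can help. This is only a sketch and not part of the repository; it assumes the data directory and folder names used above.

    import os

    # Hypothetical layout check (not part of this repository): verify that each
    # dataset follows the expected data/<dataset>/images/<class>/ structure.
    DATA_ROOT = "data"
    DATASETS = ["CUB_200_2011", "CARS", "Stanford_Online_Products", "In_shop"]

    for name in DATASETS:
        images_dir = os.path.join(DATA_ROOT, name, "images")
        if not os.path.isdir(images_dir):
            print(f"{name}: missing {images_dir}")
            continue
        classes = [d for d in os.listdir(images_dir)
                   if os.path.isdir(os.path.join(images_dir, d))]
        n_files = sum(len(files) for _, _, files in os.walk(images_dir))
        print(f"{name}: {len(classes)} classes, {n_files} image files")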

Usage

You can find config files for training and testing on each of the datasets in the config/ directory. For training and testing, you have to specify which config file you want to use (see below). Only some basic variables can be adapted via the command line; for all others, please refer to the yaml file directly.
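
If you want to inspect which parameters a given config defines before starting a run, you can load it with PyYAML. This is a minimal sketch under the assumption that the files in config/ are plain YAML; the file path below is the Cars196 training config mentioned in the sections that follow.

    import yaml  # provided by the PyYAML package

    # Minimal sketch: print the top-level parameter groups of one config file.
    with open("config/config_cars_train.yaml") as f:
        cfg = yaml.safe_load(f)

    for key, value in cfg.items():
        print(key, ":", value)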

Testing

To test the networks, choose one of the config files for testing, e.g., config_cars_test.yaml to evaluate the performance on Cars196, and run:

python train.py --config_path config_cars_test.yaml --dataset_path <path to dataset> 

The default dataset path is data.

Training

To train a network, choose one of the config files for training, e.g., config_cars_train.yaml to train on Cars196, and run:

python train.py --config_path config_cars_train.yaml --dataset_path <path to dataset> --net_type <net type you want to use>

Again, if you don't specify anything, the default settings will be used. For the net type, you have the following options:

resnet18, resnet34, resnet50, resnet101, resnet152, densenet121, densenet161, densenet169, densenet201, bn_inception

If you want to use Apex, add --is_apex 1 to the command.
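
For example, to train on Cars196 with a ResNet50 backbone and mixed precision enabled, using the default data directory, the full call would look like this (assuming the config file naming used above):

python train.py --config_path config_cars_train.yaml --dataset_path data --net_type resnet50 --is_apex 1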

Results

                          R@1    R@2    R@4    R@8    NMI
CUB-200-2011              70.3   80.3   87.6   92.7   73.2
Cars196                   88.1   93.3   96.2   98.2   74.8

                          R@1    R@10   R@100  NMI
Stanford Online Products  81.4   91.3   95.9   92.6

                          R@1    R@10   R@20   R@40
In-Shop                   92.8   98.5   99.1   99.2

Citation

If you find this code useful, please consider citing the following paper:

@inproceedings{DBLP:conf/icml/SeidenschwarzEL21,
  author    = {Jenny Seidenschwarz and
               Ismail Elezi and
               Laura Leal{-}Taix{\'{e}}},
  title     = {Learning Intra-Batch Connections for Deep Metric Learning},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning,
               {ICML} 2021, 18-24 July 2021, Virtual Event},
  series    = {Proceedings of Machine Learning Research},
  volume    = {139},
  pages     = {9410--9421},
  publisher = {{PMLR}},
  year      = {2021},
}