NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem

Liang Xin, Wen Song, Zhiguang Cao, Jie Zhang. NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem, 35th Conference on Neural Information Processing Systems (NeurIPS), 2021. [pdf]

Please cite our paper if this code is useful for your work.

@inproceedings{xin2021neurolkh,
    author = {Xin, Liang and Song, Wen and Cao, Zhiguang and Zhang, Jie},
    booktitle = {Advances in Neural Information Processing Systems},
    title = {NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem},
    volume = {34},
    year = {2021}
}

Quick start

To connect the deep learning model, the Sparse Graph Network (written in Python), with the Lin-Kernighan-Helsgaun heuristic (written in C), we implement two versions.

  • subprocess version. This version connects the two languages by writing and reading data files (see the sketch after these commands). To compile and test with our pretrained models for TSP instances with 100 nodes:
make
python data_generate.py -test
python test.py --dataset test/100.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000
  • Swig (http://www.swig.org) version. The C code is wrapped for Python. To compile and test with our pretrained models for TSP instances with 100 nodes:
bash setup.sh
python data_generate.py -test
python swig_test.py --dataset test/100.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000
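
For reference, the file-based coupling used by the subprocess version can be illustrated with the following hypothetical sketch. It is not the code in test.py: the binary path ./LKH, the file names, and the parameter values are placeholders; only the standard TSPLIB EUC_2D instance format and common LKH parameter keys (PROBLEM_FILE, MAX_TRIALS, RUNS, TOUR_FILE) are assumed.

# Hypothetical sketch: Python writes a TSPLIB instance and an LKH parameter
# file, runs the compiled C solver as a subprocess, and reads the tour back
# from disk. Paths and parameters are placeholders, not those used by test.py.
import subprocess
import numpy as np

def run_lkh_subprocess(coords, trials=1000, scale=1000000):
    n = len(coords)
    # Write the instance in TSPLIB EUC_2D format (coordinates scaled to integers).
    with open("instance.tsp", "w") as f:
        f.write(f"NAME : instance\nTYPE : TSP\nDIMENSION : {n}\n")
        f.write("EDGE_WEIGHT_TYPE : EUC_2D\nNODE_COORD_SECTION\n")
        for i, (x, y) in enumerate(coords, start=1):
            f.write(f"{i} {x * scale:.0f} {y * scale:.0f}\n")
        f.write("EOF\n")
    # Write the LKH parameter file.
    with open("instance.par", "w") as f:
        f.write(f"PROBLEM_FILE = instance.tsp\nMAX_TRIALS = {trials}\n")
        f.write("RUNS = 1\nTOUR_FILE = instance.tour\n")
    # Run the compiled C solver, then parse the tour it wrote to disk.
    subprocess.run(["./LKH", "instance.par"], check=True)
    with open("instance.tour") as f:
        lines = [line.strip() for line in f]
    tour = []
    for line in lines[lines.index("TOUR_SECTION") + 1:]:
        if line in ("-1", "EOF"):
            break
        tour.append(int(line) - 1)  # back to 0-based node indices
    return tour

# Example: solve a random 100-node instance in the unit square.
tour = run_lkh_subprocess(np.random.rand(100, 2))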

Usage

Generate the training dataset

Because training the edge scores requires edge labels, generating the training dataset takes a relatively long time (a couple of days).

python data_generate.py -train

Train the NeuroLKH Model

Training the node penalties in the Sparse Graph Network requires Swig; the subprocess version is currently not supported for training. With one RTX 2080Ti GPU, the model converges in approximately 4 days.

CUDA_VISIBLE_DEVICES="0" python train.py --file_path train --eval_file_path val --eval_batch_size=100 --save_dir=saved/tsp_neurolkh --learning_rate=0.0001

Finetune the node decoder for large sizes

The finetuning process takes less than 1 minute for each size.

CUDA_VISIBLE_DEVICES="0" python finetune_node.py

Testing

Test with the pretrained model on TSP with 500 nodes:

python test.py --dataset test/500.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000
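
If you want to inspect the generated test data before running the solver, a quick sanity check along the following lines may help. It is a sketch that assumes the pickle simply stores node coordinates with shape (n_instances, 500, 2); the actual format written by data_generate.py may differ.

# Sanity-check the test pickle (assumed to hold raw node coordinates).
import pickle
import numpy as np

with open("test/500.pkl", "rb") as f:
    instances = np.array(pickle.load(f))
print(instances.shape)                   # expected: (n_instances, 500, 2)
print(instances.min(), instances.max())  # coordinates should lie in the unit square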

We test on the TSPLIB instances with two NeuroLKH models: NeuroLKH, trained on uniformly distributed TSP instances, and NeuroLKH_M, trained on a mixture of uniform, clustered, and uniform-clustered instances (please refer to the paper for details).

python tsplib_test.py
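
TSPLIB instances come in their own text format. The minimal reader below for EUC_2D instances is a sketch, not the parser used by tsplib_test.py, and kroA100.tsp is only an example instance name; it can be handy for inspecting the benchmark files.

def read_tsplib_euc2d(path):
    # Collect node coordinates from the NODE_COORD_SECTION of a TSPLIB file.
    coords = []
    with open(path) as f:
        lines = [line.strip() for line in f]
    for line in lines[lines.index("NODE_COORD_SECTION") + 1:]:
        if line in ("EOF", "-1", ""):
            break
        _, x, y = line.split()[:3]
        coords.append((float(x), float(y)))
    return coords

coords = read_tsplib_euc2d("kroA100.tsp")  # a classic 100-city TSPLIB instance
print(len(coords))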

Other Routing Problems (CVRP, PDP, CVRPTW)

Testing with pretrained models

Test for CVRP with 100 customers, and for PDP and CVRPTW with 40 customers (a small helper for re-evaluating CVRP solutions follows these commands):

# Capacitated Vehicle Routing Problem (CVRP)
python CVRPdata_generate.py -test
python CVRP_test.py --dataset CVRP_test/cvrp_100.pkl --model_path pretrained/cvrp_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
# Pickup and Delivery Problem (PDP)
python PDPdata_generate.py -test
python PDP_test.py --dataset PDP_test/pdp_40.pkl --model_path pretrained/pdp_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
# CVRP with Time Windows (CVRPTW)
python CVRPTWdata_generate.py -test
python CVRPTw_test.py --dataset CVRPTW_test/cvrptw_40.pkl --model_path pretrained/cvrptw_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
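
When comparing solvers on CVRP it is often convenient to re-evaluate a solution independently. The helper below computes the total Euclidean length of a set of routes; it is a sketch that assumes a solution is given as a list of customer-index lists, which may differ from the exact output format of CVRP_test.py.

import numpy as np

def cvrp_cost(depot, customers, routes):
    # depot: (2,) array, customers: (n, 2) array, routes: list of customer-index lists.
    total = 0.0
    for route in routes:
        # Each route starts and ends at the depot.
        points = np.vstack([depot, customers[route], depot])
        total += np.linalg.norm(np.diff(points, axis=0), axis=1).sum()
    return total

# Example with a random 100-customer instance and two arbitrary routes.
depot = np.random.rand(2)
customers = np.random.rand(100, 2)
print(cvrp_cost(depot, customers, [list(range(50)), list(range(50, 100))]))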

Training

Train for CVRP with 100-500 customers, and for PDP and CVRPTW with 40-200 customers:

# Capacitated Vehicle Routing Problem (CVRP)
python CVRPdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python CVRP_train.py --save_dir=saved/cvrp_neurolkh
# Pickup and Delivery Problem (PDP)
python PDPdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python PDP_train.py --save_dir=saved/pdp_neurolkh
# CVRP with Time Windows (CVRPTW)
python CVRPTWdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python CVRPTW_train.py --save_dir=saved/cvrptw_neurolkh

Dependencies

  • Python >= 3.6
  • PyTorch
  • scikit-learn
  • NumPy
  • tqdm
  • Swig (optional)

Acknowledgements
