snc4onnx

Overview

Simple tool to combine (merge) ONNX models. Simple Network Combine Tool for ONNX.

https://github.com/PINTO0309/simple-onnx-processing-tools


1. Setup

1-1. HostPC

### option
$ echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& pip install -U onnx-simplifier \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U snc4onnx
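
As a quick check after installation, the packages above can be imported from Python. A minimal sketch; the module names below are the import names these pip packages provide:

$ python3
>>> import onnx
>>> import onnxsim              # provided by onnx-simplifier
>>> import onnx_graphsurgeon
>>> import snc4onnx
>>> print(onnx.__version__)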

1-2. Docker

### docker pull
$ docker pull pinto0309/snc4onnx:latest

### docker build
$ docker build -t pinto0309/snc4onnx:latest .

### docker run
$ docker run --rm -it -v `pwd`:/workdir pinto0309/snc4onnx:latest
$ cd /workdir

2. CLI Usage

$ snc4onnx -h

usage:
  snc4onnx [-h]
    --input_onnx_file_paths INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
    --srcop_destop SRCOP_DESTOP [SRCOP_DESTOP ...]
    [--op_prefixes_after_merging OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]]
    [--output_onnx_file_path OUTPUT_ONNX_FILE_PATH]
    [--output_of_onnx_file_in_the_process_of_fusion]
    [--non_verbose]

optional arguments:
  -h, --help
    show this help message and exit

  --input_onnx_file_paths INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
    Input onnx file paths. At least two onnx files must be specified.

  --srcop_destop SRCOP_DESTOP [SRCOP_DESTOP ...]
    The output OP names to join from and the input OP names to join to,
    in the format: out1 in1 out2 in2 out3 in3 ...
    In other words, to combine model1 and model2:
    --srcop_destop model1_out1 model2_in1 model1_out2 model2_in2
    --srcop_destop can also be specified multiple times.
    The first --srcop_destop specifies the correspondence between model1 and model2,
    and the second --srcop_destop specifies the correspondence
    between the combined model1+model2 and model3.
    Note that the prefix specified in op_prefixes_after_merging is
    prepended to each OP name.
    e.g. To combine model1 with model2 and model3:
    --srcop_destop model1_src_op1 model2_dest_op1 model1_src_op2 model2_dest_op2 ...
    --srcop_destop comb_model12_src_op1 model3_dest_op1 comb_model12_src_op2 model3_dest_op2 ...

  --op_prefixes_after_merging OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]
    Since a single ONNX file cannot contain multiple OPs with the same name,
    a prefix is added to all OPs in each input ONNX model to avoid duplication.
    Specify the same number of prefixes as input_onnx_file_paths.
    e.g. --op_prefixes_after_merging model1_prefix model2_prefix model3_prefix ...

  --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
    Output onnx file path.

  --output_of_onnx_file_in_the_process_of_fusion
    Outputs the intermediate onnx files generated during the fusion process.

  --non_verbose
    Do not show all information logs. Only error logs are displayed.

3. In-script Usage

$ python
>>> from snc4onnx import combine
>>> help(combine)

Help on function combine in module snc4onnx.onnx_network_combine:

combine(
  srcop_destop: List[str],
  op_prefixes_after_merging: Union[List[str], NoneType] = [],
  input_onnx_file_paths: Union[List[str], NoneType] = [],
  onnx_graphs: Union[List[onnx.onnx_ml_pb2.ModelProto], NoneType] = [],
  output_onnx_file_path: Union[str, NoneType] = '',
  output_of_onnx_file_in_the_process_of_fusion: Union[bool, NoneType] = False,
  non_verbose: Union[bool, NoneType] = False
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    srcop_destop: List[str]
        The output OP names to join from and the input OP names to join to,
        in the format [["out1","in1"], ["out2","in2"], ["out3","in3"]].

        In other words, to combine model1 and model2,
        srcop_destop =
            [
                ["model1_out1_opname","model2_in1_opname"],
                ["model1_out2_opname","model2_in2_opname"]
            ]

        The first srcop_destop specifies the correspondence between model1 and model2,
        and the second srcop_destop specifies the correspondence
        between the combined model1+model2 and model3.
        Note that the prefix specified in op_prefixes_after_merging is
        prepended to each OP name.

        e.g. To combine model1 with model2 and model3.
        srcop_destop =
            [
                [
                    ["model1_src_op1","model2_dest_op1"],
                    ["model1_src_op2","model2_dest_op2"]
                ],
                [
                    ["combined_model1.2_src_op1","model3_dest_op1"],
                    ["combined_model1.2_src_op2","model3_dest_op2"]
                ],
                ...
            ]

    op_prefixes_after_merging: List[str]
        Since a single ONNX file cannot contain multiple OPs with the same name,
        a prefix is added to all OPs in each input ONNX model to avoid duplication.
        Specify the same number of prefixes as input_onnx_file_paths.
        e.g. op_prefixes_after_merging = ["model1_prefix","model2_prefix","model3_prefix", ...]

    input_onnx_file_paths: Optional[List[str]]
        Input onnx file paths. At least two onnx files must be specified.
        Either input_onnx_file_paths or onnx_graphs must be specified.
        If onnx_graphs is specified, input_onnx_file_paths is ignored and onnx_graphs is processed.
        e.g. input_onnx_file_paths = ["model1.onnx", "model2.onnx", "model3.onnx", ...]

    onnx_graphs: Optional[List[onnx.ModelProto]]
        List of onnx.ModelProto. At least two onnx graphs must be specified.
        Either input_onnx_file_paths or onnx_graphs must be specified.
        If onnx_graphs is specified, input_onnx_file_paths is ignored and onnx_graphs is processed.
        e.g. onnx_graphs = [graph1, graph2, graph3, ...]

    output_onnx_file_path: Optional[str]
        Output onnx file path.
        If not specified, no .onnx file is output.
        Default: ''

    output_of_onnx_file_in_the_process_of_fusion: Optional[bool]
        Outputs the intermediate onnx files generated during the fusion process.
        Default: False

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    combined_graph: onnx.ModelProto
        Combined onnx ModelProto
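
As a concrete illustration of the three-model case, the following is a minimal sketch that mirrors the docstring's own example; all file names and OP names are hypothetical placeholders:

from snc4onnx import combine

# First group of pairs: model1 outputs -> model2 inputs.
# Second group of pairs: outputs of the combined model1+model2 -> model3 inputs.
# Note that the prefixes given in op_prefixes_after_merging are prepended to the OP names.
combined_graph = combine(
    srcop_destop = [
        [
            ['model1_src_op1', 'model2_dest_op1'],
        ],
        [
            ['combined_model1.2_src_op1', 'model3_dest_op1'],
        ],
    ],
    op_prefixes_after_merging = [
        'model1_prefix',
        'model2_prefix',
        'model3_prefix',
    ],
    input_onnx_file_paths = [
        'model1.onnx',
        'model2.onnx',
        'model3.onnx',
    ],
    output_onnx_file_path = 'model123_merged.onnx',
)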

4. CLI Execution

$ snc4onnx \
--input_onnx_file_paths crestereo_init_iter2_120x160.onnx crestereo_next_iter2_240x320.onnx \
--srcop_destop output flow_init \
--op_prefixes_after_merging init next
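
If an output path is also given with --output_onnx_file_path (e.g. a hypothetical merged.onnx), the result can be sanity-checked with the standard onnx package:

import onnx

# Load the merged model written by snc4onnx and run ONNX's structural checker on it.
model = onnx.load('merged.onnx')  # hypothetical path passed via --output_onnx_file_path
onnx.checker.check_model(model)

# Quick look at the merged graph's inputs and outputs.
print([i.name for i in model.graph.input])
print([o.name for o in model.graph.output])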

5. In-script Execution

5-1. ONNX files

from snc4onnx import combine

combined_graph = combine(
    srcop_destop = [
        ['output', 'flow_init']
    ],
    op_prefixes_after_merging = [
        'init',
        'next',
    ],
    input_onnx_file_paths = [
        'crestereo_init_iter2_120x160.onnx',
        'crestereo_next_iter2_240x320.onnx',
    ],
    non_verbose = True,
)
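
The returned combined_graph is an ordinary onnx.ModelProto, so the standard onnx API can be used to save or inspect it. A minimal sketch; the output file name is hypothetical:

import onnx

# Persist the merged model and print a human-readable summary of its graph.
onnx.save(combined_graph, 'crestereo_combined_iter2.onnx')
print(onnx.helper.printable_graph(combined_graph.graph))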

5-2. onnx.ModelProtos

from snc4onnx import combine

combined_graph = combine(
    srcop_destop = [
        ['output', 'flow_init']
    ],
    op_prefixes_after_merging = [
        'init',
        'next',
    ],
    onnx_graphs = [
        graph1,
        graph2,
        graph3,
    ],
    non_verbose = True,
)
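
Each entry passed via onnx_graphs is a plain onnx.ModelProto, typically obtained with onnx.load. A minimal sketch; the file names are hypothetical:

import onnx

# Load the models to be merged as in-memory ModelProto objects.
graph1 = onnx.load('model1.onnx')
graph2 = onnx.load('model2.onnx')
graph3 = onnx.load('model3.onnx')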

6. Sample

6-1 INPUT <-> OUTPUT

  • Summary

    image

  • Model.1

    image

  • Model.2

    image

  • Merge

    $ snc4onnx \
    --input_onnx_file_paths crestereo_init_iter2_120x160.onnx crestereo_next_iter2_240x320.onnx \
    --op_prefixes_after_merging init next \
    --srcop_destop output flow_init
  • Result

    image image

6-2 INPUT + INPUT

  • Summary

    image

  • Model.1

    image

  • Model.2

    image

  • Merge

    $ snc4onnx \
    --input_onnx_file_paths objectron_camera_224x224.onnx objectron_chair_224x224.onnx \
    --srcop_destop input_1 input_1 \
    --op_prefixes_after_merging camera chair \
    --output_onnx_file_path objectron_camera_chair_224x224.onnx
  • Result

    image image
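
  • In-script equivalent

    For reference, a sketch that simply mirrors the CLI flags of this sample; it joins the two
    models at their shared image input so the merged graph exposes a single input_1:

    from snc4onnx import combine

    combined_graph = combine(
        srcop_destop = [
            ['input_1', 'input_1'],
        ],
        op_prefixes_after_merging = [
            'camera',
            'chair',
        ],
        input_onnx_file_paths = [
            'objectron_camera_224x224.onnx',
            'objectron_chair_224x224.onnx',
        ],
        output_onnx_file_path = 'objectron_camera_chair_224x224.onnx',
    )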

7. Reference

  1. https://github.com/onnx/onnx/blob/main/docs/PythonAPIOverview.md
  2. https://github.com/PINTO0309/sne4onnx
  3. https://github.com/PINTO0309/snd4onnx
  4. https://github.com/PINTO0309/scs4onnx
  5. https://github.com/PINTO0309/sog4onnx
  6. https://github.com/PINTO0309/PINTO_model_zoo

8. Issues

https://github.com/PINTO0309/simple-onnx-processing-tools/issues

Releases
  • 1.0.11(Jan 2, 2023)

  • 1.0.10(Jan 2, 2023)

  • 1.0.9(Sep 7, 2022)

    • Added short-form parameters

      $ snc4onnx -h
      
      usage:
        snc4onnx [-h]
          -if INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
          -sd SRCOP_DESTOP [SRCOP_DESTOP ...]
          [-opam OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]]
          [-of OUTPUT_ONNX_FILE_PATH]
          [-f]
          [-n]
      
      optional arguments:
        -h, --help
          show this help message and exit.
      
        -if INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...], --input_onnx_file_paths INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
            Input onnx file paths. At least two onnx files must be specified.
      
        -sd SRCOP_DESTOP [SRCOP_DESTOP ...], --srcop_destop SRCOP_DESTOP [SRCOP_DESTOP ...]
            The output OP names to join from and the input OP names to join to,
            in the format: out1 in1 out2 in2 out3 in3 ...
            In other words, to combine model1 and model2,
            --srcop_destop model1_out1 model2_in1 model1_out2 model2_in2
            Also, --srcop_destop can be specified multiple times.
            The first --srcop_destop specifies the correspondence between model1 and model2,
            and the second --srcop_destop specifies the correspondence between
            model1 and model2 combined and model3.
            It is necessary to take into account that the prefix specified
            in op_prefixes_after_merging is
            given at the beginning of each OP name.
            e.g. To combine model1 with model2 and model3.
            --srcop_destop model1_src_op1 model2_dest_op1 model1_src_op2 model2_dest_op2 ...
            --srcop_destop combined_model1.2_src_op1 model3_dest_op1 combined_model1.2_src_op2 model3_dest_op2 ...
      
        -opam OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...], --op_prefixes_after_merging OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]
            Since a single ONNX file cannot contain multiple OPs with the same name,
            a prefix is added to all OPs in each input ONNX model to avoid duplication.
            Specify the same number of paths as input_onnx_file_paths.
            e.g. --op_prefixes_after_merging model1_prefix model2_prefix model3_prefix ...
      
        -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
            Output onnx file path.
      
        -f, --output_of_onnx_file_in_the_process_of_fusion
            Output of onnx files in the process of fusion.
      
        -n, --non_verbose
            Do not show all information logs. Only error logs are displayed.
      
  • 1.0.8(Sep 6, 2022)

    1. Fixed a bug that caused INPUT names to be corrupted. There was a problem with the removal of prefixes added during the model merging process.
      • before: main_input -> put (bug)
      • after: main_input -> input
      • Stopped using lstrip and switched to forward-matching logic with re.sub
    2. Added a process to clean up OUTPUT prefixes as much as possible
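
    The lstrip pitfall described above can be reproduced directly in Python; a minimal illustration:

      import re

      name = 'main_input'
      prefix = 'main_'

      # str.lstrip() removes any leading characters found in the given set, not the
      # literal prefix, so 'm', 'a', 'i', 'n', '_' (and the following 'i', 'n') are all
      # stripped and 'main_input' collapses to 'put'.
      print(name.lstrip(prefix))                          # 'put'   (the bug)

      # Anchored forward matching removes only the literal prefix at the start of the name.
      print(re.sub(f'^{re.escape(prefix)}', '', name))    # 'input' (the fix)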
  • 1.0.7(May 25, 2022)

  • 1.0.6(May 7, 2022)

  • 1.0.5(May 1, 2022)

  • 1.0.4(Apr 27, 2022)

    • Changed op_prefixes_after_merging to optional
    • Added duplicate OP name check
      • If there is a duplicate OP name, the model cannot be combined and the process is aborted with the following error message.
        ERROR: 
        There is a duplicate OP name after merging models.
        op_name:input count:2, op_name:output count:2
        Avoid duplicate OP names by specifying a prefix in op_prefixes_after_merging.
        
  • 1.0.3(Apr 24, 2022)

  • 1.0.2(Apr 11, 2022)

  • 1.0.1(Apr 10, 2022)

  • 1.0.0(Apr 10, 2022)

Owner
Katsuya Hyodo
Hobby programmer. Intel Software Innovator Program member.