Neural Surface Maps

Official implementation of Neural Surface Maps - Luca Morreale, Noam Aigerman, Vladimir Kim, Niloy J. Mitra

[Paper] [Project Page]

How-To

The results can be replicated by following these steps:

  1. Parametrize the surface
  2. Prepare surface sample
  3. Overfit the surface
  4. Neural parametrization of the surface
  5. Optimize surface-to-surface map
  6. Optimize a map between a collection

1. Surface Parametrization

This is a preprocessing step. You can use SLIM [1] from this repo for this step.

2. Sample preparation

Given a parametrized surface (previous step), we need to convert it into a sample. First, we need to oversample the surface with MeshLab. You can use the midpoint subdivision filter.
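
If you prefer a fully scripted pipeline, the same oversampling can be done headlessly with pymeshlab instead of the MeshLab GUI. This is only a sketch: the filter name follows recent pymeshlab releases and may differ in older versions, and the file names are placeholders.

import pymeshlab

# Load the SLIM-parametrized surface and densify it by midpoint subdivision.
ms = pymeshlab.MeshSet()
ms.load_new_mesh("surface_slim.obj")
ms.apply_filter("meshing_surface_subdivision_midpoint", iterations=3)  # filter name may vary across pymeshlab versions
ms.save_current_mesh("surface_slim_oversampled.obj")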

Once the oversampled surface is ready, you can convert it into a sample:

python -m preprocessing.convert_sample surface_slim.obj surface_slim_oversampled.obj output_sample.pth

The file output_sample.pth is the sample ready to be over-fitted.
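
As a quick sanity check (not part of the official pipeline), you can load the generated sample with torch and inspect its contents before overfitting:

import torch

sample = torch.load("output_sample.pth", map_location="cpu")
if isinstance(sample, dict):
    # Print each stored entry and its tensor shape (or type, if not a tensor).
    for key, value in sample.items():
        print(key, getattr(value, "shape", type(value)))
else:
    print(type(sample))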

3. Overfit surface

A surface representation is generated with:

python -m training_surface_map dataset.sample_path=output_sample.pth

This will save a surface map inside the outputs/neural_maps folder. The folder name follows the pattern overfit_[timestamp]. Inside that folder, the map is saved under the sample folder as a pth file.

The overfitted surface can be generated with:

python -m show_surface_map

Please set the path to the pth file just created inside the script.
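
If you want to query the overfitted map directly, a minimal sketch is shown below. It assumes the checkpoint stores a full serialized torch module mapping 2D parametric coordinates to 3D surface points; the path, the parametric domain, and the storage format are assumptions and depend on your run.

import torch

# Hypothetical checkpoint path; replace with the pth file created above.
model = torch.load("outputs/neural_maps/overfit_TIMESTAMP/sample/surface_map.pth",
                   map_location="cpu")
model.eval()

# Evaluate the map on a regular grid over an assumed [-1, 1]^2 parametric domain.
u, v = torch.meshgrid(torch.linspace(-1, 1, 64),
                      torch.linspace(-1, 1, 64), indexing="ij")
uv = torch.stack([u.reshape(-1), v.reshape(-1)], dim=-1)

with torch.no_grad():
    xyz = model(uv)  # expected shape (N, 3): points on the reconstructed surface
print(xyz.shape)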

4. Neural parametrization

To generate a neural parametrization, run:

python -m training_parametrization_map dataset.sample_path=your_surface_map.pth

As for the overfitting, this saves the map inside the outputs/neural_maps folder. The folder name follows the pattern parametrization_[timestamp].

To display the parametrization obtained, run:

python -m show_parametrization_map

Please set the path to the pth file just created inside the script.

5. Optimize surface-to-surface map

To generate an inter-surface map, run:

python -m training_intersurface_map dataset.sample_path_g=your_surface_map_a.pth dataset.sample_path_f=your_surface_map_b.pth

Note: this step requires two surface maps, a source (sample_path_g) and a target (sample_path_f).

As for the overfitting, the map is saved inside outputs/neural_maps. The inter-surface map folder follows the pattern intersurface_[timestamp]; the pth file is inside the models folder.

To display the inter-surface map run:

python -m show_intersurface_map

Remember to set the paths of the maps inside the script.
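
Conceptually, the optimized inter-surface map sends a point with parametric coordinates uv on the source to f(m(uv)) on the target, where m is the learned 2D map and f is the target surface map. The sketch below only illustrates that composition; the file names, storage format, and call signatures are assumptions, not the repo's API.

import torch

# Hypothetical paths; both files are assumed to be full serialized torch modules.
f = torch.load("your_surface_map_b.pth", map_location="cpu")                 # target surface map
m = torch.load("outputs/neural_maps/intersurface_TIMESTAMP/models/map.pth",  # learned 2D map
               map_location="cpu")
f.eval()
m.eval()

uv_source = torch.rand(1024, 2)    # parametric points on the source domain
with torch.no_grad():
    xyz_target = f(m(uv_source))   # corresponding 3D points on the target surface
print(xyz_target.shape)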

6. Optimize collection map

A map between a collection of surfaces can be optimized with:

python -m training_intersurface_map dataset.sample_path_g=your_surface_map_g.pth dataset.sample_path_f=your_surface_map_f.pth dataset.sample_path_q=your_surface_map_q.pth

Note: this step requires three surface maps, a source (sample_path_g) and two targets (sample_path_f and sample_path_q).

This will save two maps inside the outputs/neural_maps folder. The folder name follows the pattern collection_[timestamp]; the two *.pth files are inside the models folder.

To display the collection map run:

python -m show_collection_map

Remember to set the paths of the maps inside the script.


Dependencies

Dependencies are listed in environment.yml. Using conda, all the packages can be installed with conda env create -f environment.yml.

In addition to the packages above, please also install the pytorch svd on gpu package.


Data

Any mesh can be used for this process. A data example can be downloaded here.


Citation

@misc{morreale2021neural,
      title={Neural Surface Maps},
      author={Luca Morreale and Noam Aigerman and Vladimir Kim and Niloy J. Mitra},
      year={2021},
      eprint={2103.16942},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

References

[1] Scalable Locally Injective Mappings - Michael Rabinovich et al. - ACM Transactions on Graphics (TOG), 2017
