
Overview

Fast Training of Neural Lumigraph Representations using Meta Learning

Project Page | Paper | Data

Alexander W. Bergman, Petr Kellnhofer, Gordon Wetzstein, Stanford University.
Official Implementation for Fast Training of Neural Lumigraph Representations using Meta Learning.

Usage

To get started, create a conda environment with all dependencies:

conda env create -f environment.yml
conda activate metanlrpp
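
A quick sanity check (a minimal sketch, not part of the repository) confirms that the environment provides a working PyTorch install and can see a CUDA device:

import torch

print("PyTorch version:", torch.__version__)
print("CUDA available: ", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:            ", torch.cuda.get_device_name(0))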

Code Structure

The code is organized as follows:

  • experiment_scripts: directory containing scripts for training and testing MetaNLR++ models.
    • pretrain_features.py: pre-train encoder and decoder networks
    • train_sdf_ibr_meta.py: train meta-learned initialization for encoder, decoder, aggregation fn, and neural SDF
    • test_sdf_ibr_meta.py: specialize meta-learned initialization to a specific scene
    • train_sdf_ibr.py: train NLR++ model from scratch without meta-learned initialization
    • test_sdf_ibr.py: evaluate performance on withheld views
  • configs: directory containing configs to reproduce experiments in the paper
    • nlrpp_nlr.txt: configuration for training NLR++ on the NLR dataset
    • nlrpp_dtu.txt: configuration for training NLR++ on the DTU dataset
    • nlrpp_nlr_meta.txt: configuration for training the MetaNLR++ initialization on the NLR dataset
    • nlrpp_dtu_meta.txt: configuration for training the MetaNLR++ initialization on the DTU dataset
    • nlrpp_nlr_metaspec.txt: configuration for training MetaNLR++ on the NLR dataset using the learned initialization
    • nlrpp_dtu_metaspec.txt: configuration for training MetaNLR++ on the DTU dataset using the learned initialization
  • data_processing: directory containing utility functions for processing data
  • torchmeta: torchmeta library for meta-learning
  • utils: directory containing various utility functions for rendering and visualization
  • loss_functions.py: file containing loss functions for evaluation
  • meta_modules.py: contains meta learning wrappers around standard modules using torchmeta
  • modules.py: contains standard modules for coordinate-based networks
  • modules_sdf.py: extends standard modules for coordinate-based network representations of signed distance functions
  • modules_unet.py: contains encoder and decoder modules used for image-space feature processing
  • scheduler.py: utilities for training schedule
  • training.py: training script
  • sdf_rendering.py: functions for rendering the SDF (an illustrative sphere-tracing sketch follows this list)
  • sdf_meshing.py: functions for meshing SDF
  • checkpoints: contains checkpoints for some pre-trained models (additional/ablation models available by request)
  • assets: contains paths to checkpoints used as assets, along with pre-computed buffers shared across multiple runs (if necessary)
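
For orientation, the following is a minimal, self-contained sketch of sphere tracing, the standard procedure for rendering a neural SDF. It only illustrates the idea behind sdf_rendering.py; the repository's actual interfaces, ray handling, and hyperparameters differ, and the sdf callable here is a placeholder.

import torch

def sphere_trace(sdf, origins, dirs, num_steps=64, eps=1e-4, max_dist=5.0):
    """March each ray forward by the local SDF value until the surface is reached.

    sdf:     callable mapping (N, 3) points to (N, 1) signed distances (placeholder)
    origins: (N, 3) ray origins; dirs: (N, 3) unit ray directions
    Returns surface points (N, 3) and a boolean hit mask (N,).
    """
    t = torch.zeros(origins.shape[0], 1, device=origins.device)
    hit = torch.zeros(origins.shape[0], dtype=torch.bool, device=origins.device)
    for _ in range(num_steps):
        points = origins + t * dirs
        dist = sdf(points)                                   # signed distance at current samples
        hit = hit | (dist.squeeze(-1).abs() < eps)           # rays within eps of the surface
        t = torch.where(hit.unsqueeze(-1), t, t + dist)      # advance only rays that have not hit
        t = t.clamp(max=max_dist)                            # stop rays that leave the scene bounds
    return origins + t * dirs, hit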

Getting Started

Pre-training Encoder and Decoder

Pre-train the encoder and decoder using the FlyingChairsV2 training dataset as follows:

python experiment_scripts/pretrain_features.py --experiment_name XXX --batch_size X --dataset_path /path/to/FlyingChairs2/train

Alternatively, use the checkpoint in the checkpoints directory.

Training NLR++

Train an NLR++ model using the following command:

python experiment_scripts/train_sdf_ibr.py --config_filepath configs/nlrpp_dtu.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --checkpoint_img_encoder /path/to/pretrained/encdec

Note that we have uploaded our processed version of the DTU data here, and the NLR data can be found here.

Meta-learned Initialization (MetaNLR++)

Meta-learn the initialization for the encoder, decoder, aggregation function, and neural SDF using the following command:

python experiment_scripts/train_sdf_ibr_meta.py --config_filepath configs/nlrpp_dtu_meta.txt --experiment_name XXX --dataset_path /path/to/dtu/meta/training --reference_view 24 --checkpoint_img_encoder /path/to/pretrained/encdec

Some optimized initializations for the DTU and NLR datasets can be found in the data directory. Additional models can be provided upon request.
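
For intuition, below is a generic Reptile-style outer loop for meta-learning an initialization over multiple scenes. This is only a sketch of the general technique, not the algorithm or interfaces used by train_sdf_ibr_meta.py; the scenes list and its loss(model) method are hypothetical placeholders.

import copy
import torch

def reptile_outer_loop(model, scenes, inner_steps=32, inner_lr=1e-4,
                       outer_lr=1e-1, meta_epochs=100):
    """Meta-learn an initialization by specializing to sampled scenes and
    nudging the shared weights toward each specialized solution (Reptile)."""
    for _ in range(meta_epochs):
        scene = scenes[torch.randint(len(scenes), (1,)).item()]  # sample a training scene
        task_model = copy.deepcopy(model)                         # start from the current init
        opt = torch.optim.Adam(task_model.parameters(), lr=inner_lr)
        for _ in range(inner_steps):                              # inner-loop specialization
            loss = scene.loss(task_model)                         # hypothetical per-scene loss
            opt.zero_grad()
            loss.backward()
            opt.step()
        with torch.no_grad():                                     # outer update toward task weights
            for p, q in zip(model.parameters(), task_model.parameters()):
                p.add_(q - p, alpha=outer_lr)
    return model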

Training MetaNLR++ from Initialization

Use the meta-learned initialization to specialize to a specific scene using the following command:

python experiment_scripts/test_sdf_ibr_meta.py --config_filepath configs/nlrpp_dtu_metaspec.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --reference_view 24 --meta_initialization /path/to/learned/meta/initialization

Evaluation

Test the converged scene on withheld views using the following command:

python experiment_scripts/test_sdf_ibr.py --config_filepath configs/nlrpp_dtu.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --checkpoint_path_test /path/to/checkpoint/to/evaluate
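
As a reference for the kind of image metric typically reported on withheld views, here is a minimal PSNR computation (a generic sketch; the repository's evaluation may compute additional metrics and handle masking differently).

import torch

def psnr(pred, target, max_val=1.0):
    """Peak signal-to-noise ratio between two images with values in [0, max_val]."""
    mse = torch.mean((pred - target) ** 2)
    return 10.0 * torch.log10(max_val ** 2 / mse)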

Citation & Contact

If you find our work useful in your research, please cite

@inproceedings{bergman2021metanlr,
author = {Bergman, Alexander W. and Kellnhofer, Petr and Wetzstein, Gordon},
title = {Fast Training of Neural Lumigraph Representations using Meta Learning},
booktitle = {NeurIPS},
year = {2021},
}

If you have any questions or would like access to specific ablations or baselines presented in the paper or supplement (the code released here is only a subset of the source code used to generate the results), please feel free to contact the authors. Alex can be reached via e-mail at [email protected].
