Experiments for Neural Flows paper

Overview

Neural Flows: Efficient Alternative to Neural ODEs [arxiv]

TL;DR: We model the solutions of neural ODEs directly with neural flows. This avoids expensive numerical solvers, making the models much faster while achieving better results on time series applications.


Marin Biloš, Johanna Sommer, Syama Sundar Rangapuram, Tim Januschowski, Stephan Günnemann

Abstract: Neural ordinary differential equations describe how values change in time. This is the reason why they gained importance in modeling sequential data, especially when the observations are made at irregular intervals. In this paper we propose an alternative by directly modeling the solution curves - the flow of an ODE - with a neural network. This immediately eliminates the need for expensive numerical solvers while still maintaining the modeling capability of neural ODEs. We propose several flow architectures suitable for different applications by establishing precise conditions on when a function defines a valid flow. Apart from computational efficiency, we also provide empirical evidence of favorable generalization performance via applications in time series modeling, forecasting, and density estimation.

This repository acts as supplementary material, implementing the models and experiments described in the main paper. The model definitions rely on the stribor package for normalizing and neural flows. The baselines use the torchdiffeq package for differentiable ODE solvers.

Installation

Install the local package nfe (which will also install all the dependencies):

pip install -e .

Download data

Download and preprocess real-world data and generate synthetic data (or run commands in download_all.sh manually):

. scripts/download_all.sh

Many experiments automatically download the data if it is not already present, so this step is optional.

Note: MIMIC-III and MIMIC-IV have to be downloaded manually. Use the notebooks in data_preproc to preprocess the data.

After downloading everything, your directory tree should look like this:

├── nfe
│   ├── experiments
│   │   ├── base_experiment.py
│   │   ├── data
│   │   │   ├── activity
│   │   │   ├── hopper
│   │   │   ├── mimic3
│   │   │   ├── mimic4
│   │   │   ├── physionet
│   │   │   ├── stpp
│   │   │   ├── synth
│   │   │   └── tpp
│   │   ├── gru_ode_bayes
│   │   ├── latent_ode
│   │   ├── stpp
│   │   ├── synthetic
│   │   └── tpp
│   ├── models
│   └── train.py
├── scripts
│   ├── download_all.sh
│   └── run_all.sh
└── setup.py

Models

Models are located in nfe/models, which contains the implementations of CouplingFlow and ResNetFlow. The ODE models and the continuous (ODE- or flow-based) GRU and LSTM layers can be found in the same directory.

Example: Coupling flow

import torch
from nfe import CouplingFlow

dim = 4
model = CouplingFlow(
    dim,
    n_layers=2, # Number of flow layers
    hidden_dims=[32, 32], # Hidden layers in single flow
    time_net='TimeLinear', # Time embedding network
)

t = torch.rand(3, 10, 1) # Time points at which IVP is evaluated
x0 = torch.randn(3, 1, dim) # Initial conditions at t=0

xt = model(x0, t) # IVP solutions at t given x0
xt.shape # torch.Size([3, 10, 4])
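
A defining property of a flow, as established in the paper, is the initial condition F(x, 0) = x. A minimal sanity check, reusing model and x0 from the example above (a sketch, not part of the repository):

t0 = torch.zeros(3, 1, 1) # Evaluate the flow at t=0
assert torch.allclose(model(x0, t0), x0, atol=1e-5) # F(x0, 0) = x0, up to numerical precision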

Example: GRU flow

import torch
from nfe import GRUFlow

dim = 4
model = GRUFlow(
    dim,
    n_layers=2, # Number of flow layers
    hidden_dims=[32, 32], # Hidden layers in single flow
    time_net='TimeTanh', # Time embedding network
)

t = torch.rand(3, 10, 1) # Time points at which IVP is evaluated
x = torch.randn(3, 10, dim) # Initial conditions, RNN inputs

xt = model(x, t) # IVP solutions at t_i given x_{1:i}
xt.shape # torch.Size([3, 10, 4])
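
For comparison, the ODE baselines in this repository rely on torchdiffeq's odeint, which invokes a numerical solver at every evaluation; this is the cost neural flows avoid. A minimal sketch of such a baseline, where the vector field ODEFunc is a hypothetical stand-in rather than the repository's model:

import torch
import torch.nn as nn
from torchdiffeq import odeint

class ODEFunc(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.Tanh(), nn.Linear(32, dim))

    def forward(self, t, x):
        return self.net(x) # dx/dt = f(x)

dim = 4
func = ODEFunc(dim)
x0 = torch.randn(3, dim) # Initial conditions at t=0
t = torch.linspace(0, 1, 10) # Solver time grid, shared across the batch

xt = odeint(func, x0, t, method='dopri5') # IVP solutions, shape (10, 3, 4)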

Experiments

Run all experiments with . scripts/run_all.sh, or run the individual commands below manually.

Synthetic

Example:

python -m nfe.train --experiment synthetic --data [ellipse|sawtooth|sink|square|triangle] --model [ode|flow] --flow-model [coupling|resnet] --solver [rk4|dopri5]
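
For instance, to fit a coupling flow on the sink dataset:

python -m nfe.train --experiment synthetic --data sink --model flow --flow-model coupling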

Smoothing

Example:

python -m nfe.train --experiment latent_ode --data [activity|hopper|physionet] --classify [0|1] --model [ode|flow] --flow-model [coupling|resnet]
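
For instance, to train a coupling flow with classification on PhysioNet:

python -m nfe.train --experiment latent_ode --data physionet --classify 1 --model flow --flow-model coupling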

Reference:

  • Yulia Rubanova, Ricky Chen, David Duvenaud. "Latent ODEs for Irregularly-Sampled Time Series" (2019) [paper]. We adapted the code from here.

Filtering

Request access to MIMIC-III and MIMIC-IV, and download the data locally. Use the notebooks in data_preproc to preprocess it.

Example:

python -m nfe.train --experiment gru_ode_bayes --data [mimic3|mimic4] --model [ode|flow] --odenet gru --flow-model [gru|resnet]
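
For instance, to train a GRU flow on MIMIC-III, omitting the ODE-specific --odenet flag:

python -m nfe.train --experiment gru_ode_bayes --data mimic3 --model flow --flow-model gru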

Reference:

  • Edward De Brouwer, Jaak Simm, Adam Arany, Yves Moreau. "GRU-ODE-Bayes: Continuous modeling of sporadically-observed time series" (2019) [paper]. We adapted the code from here.

Temporal point process

Example:

python -m nfe.train --experiment tpp --data [poisson|renewal|hawkes1|hawkes2|mooc|reddit|wiki] --model [rnn|ode|flow] --flow-model [coupling|resnet] --decoder [continuous|mixture] --rnn [gru|lstm] --marks [0|1]
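
For instance, a coupling flow with the mixture decoder on the first Hawkes dataset, without marks:

python -m nfe.train --experiment tpp --data hawkes1 --model flow --flow-model coupling --decoder mixture --rnn gru --marks 0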

Reference:

  • Junteng Jia, Austin R. Benson. "Neural Jump Stochastic Differential Equations" (2019) [paper]. We adapted the code from here.

Spatio-temporal

Example:

python -m nfe.train --experiment stpp --data [bike|covid|earthquake] --model [ode|flow] --density-model [independent|attention]
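
For instance, a flow model with the attention-based density on the earthquake data:

python -m nfe.train --experiment stpp --data earthquake --model flow --density-model attention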

Reference:

  • Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel. "Neural Spatio-Temporal Point Processes" (2021) [paper]. We adapted the code from here.

Citation

@article{bilos2021neuralflows,
  title={{N}eural Flows: {E}fficient Alternative to Neural {ODE}s},
  author={Bilo{\v{s}}, Marin and Sommer, Johanna and Rangapuram, Syama Sundar and Januschowski, Tim and G{\"u}nnemann, Stephan},
  journal={Advances in Neural Information Processing Systems},
  year={2021}
}