Physics-Informed Neural Operator for Learning Partial Differential Equations

Overview

Figure: PINO diagram

Figure: Results on the Navier-Stokes equation

Abstract: Machine learning methods have recently shown promise in solving partial differential equations (PDEs). They can be classified into two broad categories: solution-function approximation and operator learning. The Physics-Informed Neural Network (PINN) is an example of the former, while the Fourier Neural Operator (FNO) is an example of the latter. Both approaches have shortcomings. The optimization in PINN is challenging and prone to failure, especially on multi-scale dynamic systems. FNO does not suffer from this optimization issue since it carries out supervised learning on a given dataset, but obtaining such data may be too expensive or infeasible. In this work, we propose the Physics-Informed Neural Operator (PINO), which combines the operator-learning and function-optimization frameworks and thereby improves convergence rates and accuracy over both PINN and FNO models. In the operator-learning phase, PINO learns the solution operator over multiple instances of the parametric PDE family. In the test-time optimization phase, PINO optimizes the pre-trained operator ansatz for the querying instance of the PDE. Experiments show that PINO outperforms previous ML methods on many popular PDE families while retaining the extraordinary speed-up of FNO compared to numerical solvers. In particular, PINO accurately solves long temporal transient flows and Kolmogorov flows, where PINN and other methods fail to converge.
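
As a rough sketch of how the two phases fit together (hypothetical function and argument names, not the repository's actual API): the operator-learning phase minimizes a data loss plus a PDE-residual loss over many instances, and the test-time optimization phase minimizes the same residual loss for a single queried instance.

import torch

def pino_loss(model, a, u_ref, pde_residual, w_data=1.0, w_pde=1.0):
    # model: neural operator mapping a PDE instance a (initial condition or
    #        coefficient field) to a predicted solution u
    # u_ref: reference solutions; None in the purely physics-informed setting
    # pde_residual: callable returning the residual of the governing equation
    u_pred = model(a)
    loss = w_pde * (pde_residual(u_pred, a) ** 2).mean()
    if u_ref is not None:
        loss = loss + w_data * ((u_pred - u_ref) ** 2).mean()
    return loss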

Requirements

  • PyTorch 1.8.0 or later
  • wandb
  • tqdm
  • scipy
  • h5py
  • numpy
  • DeepXDE (latest)
  • TensorFlow 2.4.0
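
A possible one-shot installation via pip, assuming the standard PyPI package names (pick a CUDA-specific PyTorch build separately if needed):

pip install torch wandb tqdm scipy h5py numpy deepxde tensorflow==2.4.0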

Data description

Burgers equation

  • Data file: burgers_pino.mat

Darcy flow

  • spatial domain: $x\in (0,1)^2$
  • Data file: piececonst_r421_N1024_smooth1.mat, piececonst_r421_N1024_smooth2.mat
  • Raw data shape: 1024x421x421
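
A minimal loading sketch for the Darcy files; the field names 'coeff' (input coefficient) and 'sol' (solution) are assumed from the convention of the standard FNO Darcy datasets, and the raw shape 1024x421x421 is taken from the description above (h5py may return the transposed layout), so check both against the actual file:

import scipy.io
import h5py

def load_field(path, key):
    # Older MATLAB files open with scipy.io; v7.3 files need h5py.
    try:
        return scipy.io.loadmat(path)[key]
    except NotImplementedError:
        with h5py.File(path, 'r') as f:
            return f[key][()]

a = load_field('piececonst_r421_N1024_smooth1.mat', 'coeff')  # input coefficient fields
u = load_field('piececonst_r421_N1024_smooth1.mat', 'sol')    # corresponding solutions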

Long roll out of Navier Stokes equation

  • spatial domain: $x\in (0, 1)^2$
  • temporal domain: $t\in [0, 49]$
  • forcing: $0.1(\sin(2\pi(x_1+x_2)) + \cos(2\pi(x_1+x_2)))$
  • viscosity = 0.001

Data file: nv_V1e-3_N5000_T50.mat, with shape 50 x 64 x 64 x 5000

  • train set: instances -1 to 4799
  • test set: instances 4799 to 4999
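
A minimal loading sketch for the long-rollout data, assuming a MATLAB v7.3 file whose single field is named 'u' (the convention of the FNO Navier-Stokes datasets), together with the forcing term evaluated on the 64 x 64 grid:

import h5py
import numpy as np

with h5py.File('nv_V1e-3_N5000_T50.mat', 'r') as f:
    u = f['u'][()]   # vorticity, shape 50 x 64 x 64 x 5000; the last axis indexes instances

# Forcing 0.1*(sin(2*pi*(x1+x2)) + cos(2*pi*(x1+x2))) on the unit square
x = np.linspace(0, 1, 64, endpoint=False)
x1, x2 = np.meshgrid(x, x, indexing='ij')
forcing = 0.1 * (np.sin(2 * np.pi * (x1 + x2)) + np.cos(2 * np.pi * (x1 + x2)))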

Navier Stokes with Reynolds number 500

  • spatial domain: $x\in (0, 2\pi)^2$
  • temporal domain: $t \in [0, 0.5]$
  • forcing: $-4\cos(4x_2)$
  • Reynolds number: 500

Train set: data of shape (N, T, X, Y), where N is the number of instances, T is the temporal resolution, and X, Y are the spatial resolutions.

  1. NS_fft_Re500_T4000.npy: 4000x64x64x65
  2. NS_fine_Re500_T128_part0.npy: 100x129x128x128
  3. NS_fine_Re500_T128_part1.npy: 100x129x128x128

Test set: data of shape (N, T, X, Y), where N is the number of instances, T is the temporal resolution, and X, Y are the spatial resolutions.

  1. NS_Re500_s256_T100_test.npy: 100x129x256x256
  2. NS_fine_Re500_T128_part2.npy: 100x129x128x128
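
Each .npy file stores a single array that can be read directly with NumPy; a minimal sketch using the shapes listed above (the forcing comes from the problem description):

import numpy as np

data = np.load('NS_fine_Re500_T128_part0.npy')   # shape 100 x 129 x 128 x 128, i.e. (N, T, X, Y)
u0 = data[:, 0]                                  # initial vorticity of each instance

# Forcing -4*cos(4*x2) on the (0, 2*pi)^2 domain
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
_, x2 = np.meshgrid(x, x, indexing='ij')
forcing = -4 * np.cos(4 * x2)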

Configuration file format: see the .yaml files under the configs folder for details.

Code for Burgers equation

Train PINO

To train PINO for the Burgers equation, use, e.g.,

python3 train_burgers.py --config_path configs/pretrain/burgers-pretrain.yaml --mode train

To test PINO for the Burgers equation, use, e.g.,

python3 train_burgers.py --config_path configs/test/burgers.yaml --mode test

Code for Darcy Flow

Operator learning

To run PINO for Darcy Flow, use, e.g.,

python3 train_operator.py --config_path configs/pretrain/Darcy-pretrain.yaml

To evaluate the trained operator for Darcy Flow, use, e.g.,

python3 eval_operator.py --config_path configs/test/darcy.yaml

Test-time optimization

To run test-time optimization for Darcy Flow, use, e.g.,

python3 run_pino2d.py --config_path configs/finetune/Darcy-finetune.yaml --start [starting index] --stop [stopping index]

Baseline

To train DeepONet, use, e.g.,

python3 deeponet.py --config_path configs/pretrain/Darcy-pretrain-deeponet.yaml --mode train 

To test DeepONet, use, e.g.,

python3 deeponet.py --config_path configs/test/darcy.yaml --mode test

Code for Navier Stokes equation

Train PINO for short time period

To run operator learning, use, e.g.,

python3 train_operator.py --config_path configs/pretrain/Re500-pretrain-05s-4C0.yaml

To evaluate the trained operator, use

python3 eval_operator.py --config_path configs/test/Re500-05s.yaml

To run test-time optimization, use

python3 train_PINO3d.py --config_path configs/***.yaml 

To run test-time optimization on multiple Navier-Stokes instances sequentially, without invoking train_PINO3d.py multiple times, use

python3 run_pino3d.py --config_path configs/[configuration file name].yaml --start [starting index] --stop [stopping index]

Baseline for short time period

To train DeepONet, use

python3 deeponet.py --config_path configs/[configuration file].yaml --mode train

To test DeepONet, use

python3 deeponet.py --config_path configs/[configuration file].yaml --mode test

To train and test PINNs, use, e.g.,

python3 nsfnet.py --config_path configs/Re500-pinns-05s.yaml --start [starting index] --stop [stopping index]

Baseline for long roll out

To train and test PINNs, use

python3 nsfnet.py --config_path configs/scratch/NS-50s.yaml --long --start [starting index] --stop [stopping index]

Pseudospectral solver for Navier Stokes equation

To run solver, use

python3 run_solver.py --config_path configs/Re500-0.5s.yaml