Multi-Objective Loss Balancing for Physics-Informed Deep Learning

Overview

Code for ReLoBRaLo (Relative Loss Balancing with Random Lookback).

Abstract

Physics-Informed Neural Networks (PINNs) are deep learning algorithms that leverage physical laws by including partial differential equations (PDEs), together with a respective set of boundary and initial conditions (BCs/ICs), as penalty terms in their loss function. Since the PDE, BC and IC parts of the loss can differ significantly in magnitude, due to their underlying physical units or the stochasticity of initialisation, training of PINNs may suffer from severe convergence and efficiency problems, causing PINNs to remain below a desirable approximation quality. In this work, we observe the significant role of correctly weighting the combination of multiple competitive loss functions for training PINNs effectively. To that end, we implement and evaluate different methods aiming at balancing the contributions of the multiple terms of the PINN loss function and their gradients. After reviewing three existing loss scaling approaches (Learning Rate Annealing, GradNorm and SoftAdapt), we propose a novel self-adaptive loss balancing scheme for PINNs called ReLoBRaLo (Relative Loss Balancing with Random Lookback). Finally, the performance of ReLoBRaLo is compared and verified against these approaches by solving both forward and inverse problems on three benchmark PDEs for PINNs: Burgers' equation, Kirchhoff's plate bending equation and Helmholtz's equation. Our simulation studies show that ReLoBRaLo training is much faster and achieves higher accuracy than training PINNs with other balancing methods, and is hence very effective and increases the sustainability of PINN algorithms. The adaptability of ReLoBRaLo illustrates its robustness across different PDE problem settings. The proposed method can also be employed in the wider class of penalised optimisation problems, including PDE-constrained and Sobolev training, beyond the studied PINN examples.
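
For intuition, the snippet below sketches how such a relative, self-adaptive balancing step can look: the current loss terms are compared to reference losses through a temperature-scaled softmax, smoothed with an exponential moving average, and occasionally reset against the first epoch's losses via a random lookback. This is a minimal NumPy illustration written in the spirit of ReLoBRaLo, not the repository's implementation; the names softmax_ratios, relobralo_weights and rho_expect are invented here for clarity, while alpha, T and rho mirror the CLI flags listed below.

import numpy as np

def softmax_ratios(losses_now, losses_ref, T=1.0):
    # Relative weights: softmax of current losses scaled by reference losses.
    ratios = np.asarray(losses_now, dtype=float) / (T * np.asarray(losses_ref, dtype=float) + 1e-12)
    ratios = ratios - ratios.max()           # numerical stability
    w = np.exp(ratios)
    return len(losses_now) * w / w.sum()     # scaled so the weights sum to the number of terms

def relobralo_weights(losses_now, losses_prev, losses_init, lam_prev,
                      alpha=0.999, T=1.0, rho_expect=1.0, rng=np.random):
    # One illustrative balancing step (sketch only, not the repo's API).
    lam_bal = softmax_ratios(losses_now, losses_prev, T)    # relative to the previous epoch
    lam_init = softmax_ratios(losses_now, losses_init, T)   # relative to the first epoch
    rho = float(rng.random() < rho_expect)                  # Bernoulli "random lookback" draw
    lam_hist = rho * np.asarray(lam_prev, dtype=float) + (1.0 - rho) * lam_init
    return alpha * lam_hist + (1.0 - alpha) * lam_bal       # exponential moving average

# Usage: weight e.g. the PDE, BC and IC losses before summing them (values are made up).
lam = np.ones(3)
losses_init = [1.0, 0.5, 2.0]
losses_prev, losses_now = [0.8, 0.4, 1.5], [0.6, 0.45, 1.2]
lam = relobralo_weights(losses_now, losses_prev, losses_init, lam)
total_loss = float(np.dot(lam, losses_now))

In a training loop, the weights would be recomputed once per epoch from the scalar loss values and used to form the weighted total loss that is backpropagated.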

Launch Training

Example:

python train.py --verbose --layers 2 --nodes 32 --task helmholtz --update_rule relobralo --resample

The available options are the following:

  • --path, default: experiments, type: str, path where to store the results

  • --layers, default: 1, type: int, number of layers

  • --nodes, default: 32, type: int, number of nodes

  • --network, default: fc, type: str, type of network

  • --optimizer, default: adam, type: str, type of optimizer

  • --lr, default: 0.001, type: float, learning rate

  • --patience, default: 3, type: int, number of evaluations without improvement to wait before reducing the learning rate

  • --factor, default: .1, type: float, multiplicative factor by which to reduce the learning rate

  • --task, default: helmholtz, type: str, type of task to fit

  • --inverse, action: store_true, solve inverse problem

  • --inverse_var, default: None, type: float, target inverse variable

  • --update_rule, default: manual, type: str, type of balancing

  • --T, default: 1., type: float, temperature parameter for softmax

  • --alpha, default: .999, type: float, rate for exponential decay

  • --rho, default: 1., type: float, expected value of the Bernoulli variable used for the random lookback

  • --aggregate_boundaries, action: store_true, aggregate all boundary terms into one before balancing

  • --epochs, default: 100000, type: int, number of epochs

  • --resample, action: store_true, resample the datapoints every epoch instead of keeping them fixed

  • --batch_size, default: 1024, type: int, number of sampled points in a batch

  • --verbose, action: store_true, print progress to terminal
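
A second, purely illustrative invocation targets an inverse problem. It only combines flags from the list above; the task identifier burgers is assumed from the benchmarks named in the abstract, and the --inverse_var value is a placeholder rather than a recommended target:

python train.py --verbose --layers 2 --nodes 32 --task burgers --update_rule relobralo --inverse --inverse_var 1.0 --resample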

Owner
Rafael Bischof