PyTorch implementation of probabilistic deep forecast applied to air quality.

Overview

Probabilistic Deep Forecast

PyTorch implementation of the paper "Probabilistic Deep Learning to Quantify Uncertainty in Air Quality Forecasting" (arXiv).

Introduction

In this work, we develop a set of deep probabilistic models for air quality forecasting that quantify both aleatoric and epistemic uncertainties and study how to represent and manipulate their predictive uncertainties. In particular:

* We conduct a broad empirical comparison and exploratory assessment of state-of-the-art techniques in deep probabilistic learning applied to air quality forecasting. Through exhaustive experiments, we describe training these models and evaluating their predictive uncertainties using various metrics for regression and classification tasks.
* We improve uncertainty estimation using adversarial training to smooth the conditional output distribution locally around training data points.
* We apply uncertainty-aware models that exploit the temporal and spatial correlation inherent in air quality data using recurrent and graph neural networks.
* We introduce a new state-of-the-art example for air quality forecasting by defining the problem setup and selecting proper input features and models.
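As an illustration of the general idea (not this repository's exact API), the following sketch shows how MC dropout can separate epistemic from aleatoric uncertainty in a heteroscedastic regression model; the model interface, tensor shapes, and number of samples are assumptions.

```python
import torch

def mc_dropout_predict(model, x, n_samples=100):
    """Illustrative sketch: predictive mean and decomposed uncertainty via MC dropout.

    Assumes `model(x)` returns a (mean, log_variance) pair and that dropout
    stays active at inference time (e.g. by calling model.train()).
    """
    model.train()  # keep dropout layers stochastic during prediction
    means, variances = [], []
    with torch.no_grad():
        for _ in range(n_samples):
            mu, log_var = model(x)
            means.append(mu)
            variances.append(log_var.exp())
    means = torch.stack(means)         # (n_samples, batch, output_dim)
    variances = torch.stack(variances)

    pred_mean = means.mean(dim=0)
    aleatoric = variances.mean(dim=0)  # average predicted observation noise
    epistemic = means.var(dim=0)       # spread of the MC predictions
    return pred_mean, aleatoric, epistemic
```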

Figure: Decision score as a function of normalized aleatoric and epistemic confidence thresholds. See the animation video here.

Installation

Install probabilistic_forecast locally in "editable" mode (any changes to the original package will be reflected directly in your environment, so you don't have to re-install the package every time you make changes):

pip install -e .

Use the requirements file requirements.txt to install the packages required to run this project.
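With a standard pip workflow this amounts to (assuming requirements.txt sits in the repository root):

pip install -r requirements.txt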

File Structure

.
├── probabilistic_forecast/
│   ├── bnn.py (class definition for the Bayesian neural networks model)
│   ├── ensemble.py (class definition for the deep ensemble model)
│   ├── gnn_mc.py (class definition for the graph neural network model with MC dropout)
│   ├── lstm_mc.py (class definition for the LSTM model with MC dropout)
│   ├── nn_mc.py (class definition for the standard neural network model with MC dropout)
│   ├── nn_standard.py (class definition for the standard neural network model without MC dropout)
│   ├── swag.py (class definition for the SWAG model)
│   └── utils/
│       ├── data_utils.py (utility functions for data loading and pre-processing)
│       ├── gnn_utils.py (utility functions for GNN)
│       ├── plot_utils.py (utility functions for plotting training and evaluation results)
│       ├── swag_utils.py  (utility functions for SWAG)
│       └── torch_utils.py (utility functions for torch dataloader, checking if CUDA is available)
├── dataset/
│   ├── air_quality_measurements.csv (dataset of air quality measurements)
│   ├── street_cleaning.csv  (dataset of street cleaning records)
│   ├── traffic.csv (dataset of traffic volumes)
│   ├── weather.csv  (dataset of weather observations)
│   └── visualize_data.py  (script to visualize all datasets)
├── main.py (main function with argument parsing to load data, build a model and evaluate (or train))
├── tests/
│   ├── confidence_reliability.py (script to evaluate the reliability of confidence estimates of pretrained models)
│   └── epistemic_vs_aleatoric.py (script to show the impact of quantifying both epistemic and aleatoric uncertainties)
├── plots/ (folder containing all evaluation plots)
├── pretrained/ (folder containing pretrained models and training-curve plots)
├── evaluate_all_models.sh (bash script for evaluating all models at once)
└── train_all_models.sh (bash script for training all models at once)

Evaluating Pretrained Models

Evaluate a pretrained model, for example:

python main.py --model=SWAG --task=regression --mode=evaluate  --adversarial_training

or evaluate all models:

bash evaluate_all_models.sh
Figure: PM-value regression using a graph neural network with MC dropout.

Threshold-exceedance prediction

Figure: Threshold-exceedance prediction using a Bayesian neural network (BNN).
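For the threshold-exceedance task, the exceedance probability can be read off the predictive distribution. A minimal sketch assuming a Gaussian predictive distribution (the threshold value is only an example, not the one used in the paper):

```python
import torch

def exceedance_probability(pred_mean, pred_var, threshold=50.0):
    """Illustrative sketch: P(PM > threshold) under a Gaussian predictive distribution."""
    dist = torch.distributions.Normal(pred_mean, pred_var.sqrt())
    return 1.0 - dist.cdf(torch.as_tensor(threshold))
```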

Confidence Reliability

To evaluate the confidence reliability of the considered probabilistic models, run the following command:

python tests/confidence_reliability.py

It will generate the following plots:

Figure: Confidence reliability of the probabilistic models in the PM-value regression task, across all monitoring stations.
Figure: Confidence reliability of the probabilistic models in the threshold-exceedance prediction task, across all monitoring stations.
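The reliability evaluation follows the standard calibration idea: for each confidence level, compare the nominal coverage of the corresponding prediction interval with the coverage actually observed on held-out data. A minimal sketch under a Gaussian predictive distribution (the function and variable names are illustrative, not the script's actual code):

```python
import numpy as np
from scipy import stats

def empirical_coverage(y_true, pred_mean, pred_std, confidence_levels):
    """Illustrative sketch: observed vs. nominal coverage of Gaussian prediction intervals."""
    coverages = []
    for p in confidence_levels:
        z = stats.norm.ppf(0.5 + p / 2.0)  # half-width multiplier of the central interval
        lower, upper = pred_mean - z * pred_std, pred_mean + z * pred_std
        coverages.append(np.mean((y_true >= lower) & (y_true <= upper)))
    return np.array(coverages)             # well calibrated if close to confidence_levels
```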

Epistemic and aleatoric uncertainties in decision making

To evaluate the impact of quantifying both epistemic and aleatoric uncertainties in decision making, run the following command:

python tests/epistemic_vs_aleatoric.py

It will generate the following plots:

Figure (left): Decision score in a non-probabilistic model as a function of only aleatoric confidence.
Figure (right): Decision score in a probabilistic model as a function of both epistemic and aleatoric confidences.

It will also generate a .vtp file, which can be used to produce a 3D plot with detailed rendering and lighting in ParaView.
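Conceptually, the decision score in these plots corresponds to evaluating the model only on predictions whose normalized confidences exceed the chosen thresholds, while the remaining predictions are referred elsewhere. A rough sketch of that logic, with illustrative names and an arbitrary scoring metric:

```python
import numpy as np

def decision_score(y_true, y_pred, aleatoric_conf, epistemic_conf,
                   aleatoric_threshold, epistemic_threshold, metric):
    """Illustrative sketch: score only the predictions that pass both confidence thresholds."""
    keep = (aleatoric_conf >= aleatoric_threshold) & (epistemic_conf >= epistemic_threshold)
    if not np.any(keep):
        return np.nan  # everything referred to a fallback (e.g. a human expert)
    return metric(y_true[keep], y_pred[keep])
```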

Training Models

Train a single model, for example:

python main.py --model=SWAG --task=regression --mode=train --n_epochs=3000 --adversarial_training

or train all models:

bash train_all_models.sh
Figure: Learning curves from training a BNN model to forecast PM-values. Left: negative log-likelihood loss; center: KL loss estimated using MC sampling; right: exponentially decaying learning rate.
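The two loss terms in the learning curve correspond to the usual variational objective: a Gaussian negative log-likelihood for the data fit plus a KL term, estimated by Monte Carlo sampling of the weights, that pulls the weight posterior towards the prior. A schematic sketch of one training step (the interface of `bnn`, the number of MC samples, and the KL weighting are assumptions, not the repository's exact implementation):

```python
import torch.nn.functional as F

def bnn_training_step(bnn, optimizer, x, y, n_mc_samples=5, kl_weight=1e-3):
    """Illustrative sketch: NLL + MC-estimated KL objective for a Bayesian neural network.

    Assumes `bnn(x)` samples weights internally and returns (mean, log_variance),
    and that `bnn.kl_divergence()` returns the KL between posterior and prior.
    """
    optimizer.zero_grad()
    nll = 0.0
    for _ in range(n_mc_samples):
        mu, log_var = bnn(x)
        nll = nll + F.gaussian_nll_loss(mu, y, log_var.exp())
    nll = nll / n_mc_samples
    loss = nll + kl_weight * bnn.kl_divergence()
    loss.backward()
    optimizer.step()
    return loss.item()
```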

Dataset

Run the following command to visualize all datasets:

python dataset/visualize_data.py

It will generate plots in the "dataset" folder. For example:

Figure: Air quality level over two years at one representative monitoring station (Elgeseter) in Trondheim, Norway.

Attribution

Owner: Abdulmajid Murad, PhD Student, Faculty of Information Technology and Electrical Engineering, NTNU