Implementation of the Paper: "Parameterized Hypercomplex Graph Neural Networks for Graph Classification" by Tuan Le, Marco Bertolini, Frank Noé and Djork-Arné Clevert

Parameterized Hypercomplex Graph Neural Networks (PHC-GNNs)

PHC-GNNs (Le et al., 2021): https://arxiv.org/abs/2103.16584

Figures: PHM linear layer illustration and PHC-GNN layer computation diagram.

Overview

Here we provide the implementation of Parameterized Hypercomplex Graph Neural Networks (PHC-GNNs) in PyTorch Geometric, along with 6 minimal execution examples in the benchmarks/ directory.

This repository is organised as follows:

  • phc/hypercomplex/ contains the implementation of the PHC-GNN with all its submodules. This directory largely mirrors phc/quaternion/, but generalizes it to a user-defined phm-dimension n. For more details, check the subdirectory README.md
  • phc/quaternion/ contains the implementation for quaternion GNN with all its submodules. For more details, check the subdirectory README.md
  • benchmarks/ contains the Python training scripts for 3 datasets from the Open Graph Benchmark (OGB) and 3 datasets from Benchmarking-GNNs. Additionally, we provide 6 bash scripts with default arguments to run our models. A sketch of how the OGB datasets are typically loaded follows this list.
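
Below is a minimal sketch, not the repository's actual training script, of how an OGB graph classification dataset is typically loaded with PyTorch Geometric; the dataset name ogbg-molhiv and the batch size are used purely for illustration:

import torch
from ogb.graphproppred import PygGraphPropPredDataset, Evaluator
from torch_geometric.data import DataLoader

# download/load one of the OGB graph classification datasets (illustrative choice)
dataset = PygGraphPropPredDataset(name="ogbg-molhiv")
split_idx = dataset.get_idx_split()

# standard OGB train/valid/test splits wrapped in a PyG DataLoader
train_loader = DataLoader(dataset[split_idx["train"]], batch_size=32, shuffle=True)
evaluator = Evaluator(name="ogbg-molhiv")

for batch in train_loader:
    # batch.x: node features, batch.edge_index: connectivity, batch.y: graph labels
    print(batch)
    break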

Generally speaking, the phc/hypercomplex/ subdirectory also covers the quaternion-valued GNN, modified to operate directly on torch.Tensor objects. The phc/quaternion/ subdirectory was implemented first, with the fixed rules of the quaternion algebra, i.e., how addition and multiplication are performed, which can be summarized in the quaternion-valued affine transformation. The phc/hypercomplex/ directory generalizes these operations to work directly on torch.Tensor objects with a user-defined phm-dimension n, making it applicable to many existing projects.
For completeness, and to share the initial motivation for this project, we also provide the implementations in the phc/quaternion/ subdirectory. A minimal sketch of the underlying PHM linear layer is given below.
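
To make the generalization concrete, here is a minimal, self-contained sketch of a parameterized hypercomplex (PHM) linear layer in PyTorch. It is illustrative only and not the repository's actual implementation: the class and parameter names are made up. The weight is assembled as a sum of Kronecker products between learnable n x n contribution matrices and component weight matrices; for n = 4 with fixed contribution matrices this recovers the quaternion-valued affine transformation.

import torch
import torch.nn as nn


def kron(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Kronecker product via einsum (torch.kron only exists in newer PyTorch releases)
    return torch.einsum("ij,kl->ikjl", a, b).reshape(
        a.size(0) * b.size(0), a.size(1) * b.size(1)
    )


class PHMLinearSketch(nn.Module):
    # Illustrative PHM linear layer: W = sum_i kron(A_i, S_i), y = x W^T + b.
    def __init__(self, in_features: int, out_features: int, n: int = 4):
        super().__init__()
        assert in_features % n == 0 and out_features % n == 0
        self.n = n
        # learnable n x n "contribution" matrices defining the multiplication rule
        self.A = nn.Parameter(0.1 * torch.randn(n, n, n))
        # component weight matrices, each of size (out_features/n) x (in_features/n)
        self.S = nn.Parameter(0.1 * torch.randn(n, out_features // n, in_features // n))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # assemble the full (out_features x in_features) weight from Kronecker products
        W = sum(kron(self.A[i], self.S[i]) for i in range(self.n))
        return x @ W.t() + self.bias


# usage: a quaternion-like layer (n = 4) acting on plain torch.Tensor inputs
layer = PHMLinearSketch(in_features=64, out_features=64, n=4)
out = layer(torch.randn(8, 64))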

Installation

Requirements

To run our examples, the main requirements are listed in the environment_gpu.yml file. The key packages are the following:

python=3.8.5
pytest=6.2.1
cudatoolkit=10.1
cudnn=7.6.5
numpy=1.19.2
scipy=1.5.2
pytorch=1.7.1
torch-geometric=1.6.1
ogb=1.2.4

Conda

Create a new environment:

git clone https://github.com/bayer-science-for-a-better-life/phc-gnn.git
cd phc-gnn
conda env create -f environment_gpu.yml
conda activate phc-gnn

Install PyTorch Geometric and this module with pip by executing the bash script install_pyg.sh:

chmod +x install_pyg.sh
bash install_pyg.sh

# install this library
pip install -e .

Run the implemented pytests in the subdirectories by executing:

pytest .
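
As an additional sanity check (not part of the repository), you can verify that the pinned core dependencies are importable and report the expected versions:

import torch
import torch_geometric
import ogb

# print the installed versions to compare against environment_gpu.yml
print("torch:", torch.__version__, "| CUDA:", torch.version.cuda)
print("torch-geometric:", torch_geometric.__version__)
print("ogb:", ogb.__version__)
print("GPU available:", torch.cuda.is_available())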

Getting started

Run our example scripts in the benchmarks/ directory. Make sure to have the phc-gnn environment activated. For more details, please have a look at benchmarks/README.md.

Reference

If you make use of the quaternion or parameterized hypercomplex GNN implementations in your research, please cite our manuscript:

@misc{le2021parameterized,
      title={Parameterized Hypercomplex Graph Neural Networks for Graph Classification}, 
      author={Tuan Le and Marco Bertolini and Frank Noé and Djork-Arné Clevert},
      year={2021},
      eprint={2103.16584},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2103.16584}
}

License

GPL-3
