LitMatter

A template for rapid experimentation and scaling deep learning models on molecular and crystal graphs.

How to use

  1. Clone this repository and start editing, or save it and use it as a template for new projects.
  2. Edit lit_models/models.py with the PyTorch code for your model of interest.
  3. Edit lit_data/data.py to load and process your PyTorch datasets.
  4. Perform interactive experiments in prototyping.py.
  5. Scale network training to any number of GPUs using the example batch scripts. (A minimal end-to-end sketch of steps 2-5 follows this list.)
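
Steps 2-5 follow standard PyTorch Lightning conventions. The sketch below is a minimal, hypothetical illustration of that workflow; the class names, toy dataset, and Trainer settings are placeholders rather than the actual contents of lit_models/models.py or lit_data/data.py.

# Minimal sketch of steps 2-5. Class names and the random dataset are
# illustrative placeholders, not the actual LitMatter classes.
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

class LitModel(pl.LightningModule):           # would live in lit_models/models.py
    def __init__(self, in_dim=16, hidden=64, lr=1e-3):
        super().__init__()
        self.save_hyperparameters()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(in_dim, hidden),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self(x).squeeze(-1), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)

class LitDataModule(pl.LightningDataModule):  # would live in lit_data/data.py
    def __init__(self, batch_size=32):
        super().__init__()
        self.batch_size = batch_size

    def setup(self, stage=None):
        # Replace with your molecular/crystal graph dataset and featurization.
        x, y = torch.randn(1024, 16), torch.randn(1024)
        self.train_set = TensorDataset(x, y)

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size, num_workers=4)

if __name__ == "__main__":
    # Scaling to more GPUs or nodes is a Trainer flag change,
    # e.g. devices=8, num_nodes=4.
    trainer = pl.Trainer(max_epochs=10, accelerator="auto", devices="auto")
    trainer.fit(LitModel(), LitDataModule())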

Principles

LitMatter uses PyTorch Lightning to organize PyTorch code so scientists can rapidly experiment with geometric deep learning and scale up to hundreds of GPUs without difficulty. Many amazing applied ML methods (even those with open-source code) are never used by the wider community because the important details are buried in hundreds of lines of boilerplate code. It may require a significant engineering effort to get the method working on a new dataset and in a different computing environment, and it can be hard to justify this effort before verifying that the method will provide some advantage. Packaging your code with the LitMatter template makes it easy for other researchers to experiment with your models and scale them beyond common benchmark datasets.

Features

  • Maximum flexibility. LitMatter supports arbitrary PyTorch models and dataloaders.
  • Eliminate boilerplate. Engineering code is abstracted away, but still accessible if needed.
  • Full end-to-end pipeline. Data processing, model construction, training, and inference can be launched from the command line, in a Jupyter notebook, or through a SLURM job (an example submission script is sketched after this list).
  • Lightweight. Using the template is easier than not using it; it reduces infrastructure overhead for simple and complex deep learning projects.
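
For SLURM launches, a submission script typically requests one task per GPU and lets PyTorch Lightning detect the SLURM environment for distributed training. The script below is a hypothetical example; the resource requests and the train.py entry point are placeholders, and the repository's example batch scripts are the reference.

#!/bin/bash
# Hypothetical SLURM submission script; see the repository's example batch
# scripts for the actual versions. Resource requests and the train.py entry
# point are placeholders.
#SBATCH --job-name=litmatter
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=4
#SBATCH --gres=gpu:4
#SBATCH --cpus-per-task=8

# One srun task per GPU; PyTorch Lightning reads the SLURM environment to
# set up distributed data parallel training across nodes.
srun python train.py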

Examples

The example notebooks show how to use LitMatter to scale model training for different applications.

  • Prototyping GNNs - train an equivariant graph neural network to predict quantum properties of small molecules.
  • Neural Force Fields - train a neural force field on molecular dynamics trajectories of small molecules.
  • DeepChem - train a PyTorch model in DeepChem on a MoleculeNet dataset.
  • 🤗 - train a 🤗 language model to generate molecules.

Note that these examples have additional dependencies beyond the core dependencies of LitMatter.

References

If you use LitMatter for your own research and scaling experiments, please cite the following work: Frey, Nathan C., et al. "Scalable Geometric Deep Learning on Molecular Graphs." NeurIPS 2021 AI for Science Workshop. 2021.

@inproceedings{frey2021scalable,
  title={Scalable Geometric Deep Learning on Molecular Graphs},
  author={Frey, Nathan C and Samsi, Siddharth and McDonald, Joseph and Li, Lin and Coley, Connor W and Gadepally, Vijay},
  booktitle={NeurIPS 2021 AI for Science Workshop},
  year={2021}
}

Please also cite the relevant frameworks (PyG, PyTorch Distributed, PyTorch Lightning) and any extensions you use (🤗, DeepChem, NFFs, etc.).

Extensions

When you're ready to upgrade to fully configurable, reproducible, and scalable workflows, use hydra-zen. hydra-zen integrates seamlessly with LitMatter to self-document ML experiments and orchestrate multiple training runs for extensive hyperparameter sweeps.
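
As a rough illustration (not code from the LitMatter repository), the sketch below shows how hydra-zen's builds, instantiate, and launch utilities could wrap the illustrative LitModel and LitDataModule from the earlier sketch into a structured config and run a hyperparameter sweep; the import paths and override names are assumptions.

# Hypothetical hydra-zen sketch, not code from the LitMatter repository.
# LitModel and LitDataModule are the illustrative classes from the earlier
# sketch, assumed here to live in the template's module paths.
import pytorch_lightning as pl
from hydra_zen import builds, instantiate, make_config, launch

from lit_models.models import LitModel
from lit_data.data import LitDataModule

# Structured configs generated directly from the Python objects.
ModelConf = builds(LitModel, lr=1e-3, hidden=64)
DataConf = builds(LitDataModule, batch_size=32)
TrainerConf = builds(pl.Trainer, max_epochs=10, accelerator="auto", devices="auto")
Config = make_config(model=ModelConf, data=DataConf, trainer=TrainerConf)

def task(cfg):
    # Hydra saves the fully resolved config with each run, so every
    # experiment is self-documented.
    trainer = instantiate(cfg.trainer)
    trainer.fit(instantiate(cfg.model), datamodule=instantiate(cfg.data))

# A single run, then a multirun sweep over learning rates.
launch(Config, task)
launch(Config, task, overrides=["model.lr=1e-2,1e-3,1e-4"], multirun=True)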

Disclaimer

DISTRIBUTION STATEMENT A. Approved for public release. Distribution is unlimited.

© 2021 MASSACHUSETTS INSTITUTE OF TECHNOLOGY

Subject to FAR 52.227-11 – Patent Rights – Ownership by the Contractor (May 2014)
SPDX-License-Identifier: MIT

This material is based upon work supported by the Under Secretary of Defense for Research and Engineering under Air Force Contract No. FA8702-15-D-0001. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Under Secretary of Defense for Research and Engineering.

The software/firmware is provided to you on an As-Is basis.
