The hippynn python package - a modular library for atomistic machine learning with pytorch.

Overview

We aim to provide a powerful library for training atomistic (or physical point-cloud) machine learning models. Entry-level users should be able to train models efficiently on millions of datapoints, while the modular structure makes the library easy to extend or contribute to.

While hippynn's development so far has centered around the HIP-NN architecture, don't let that discourage you if you are performing research with another model. Get in touch, and let's work together to provide a high-quality implementation of your work, either as a contribution or as an interface extension to your own package.

Features:

Modular set of pytorch layers for atomistic operations

  • Atomistic operations can be tricky to write in native pytorch. Most operations provided here support linear-scaling models; a short pure-pytorch illustration follows this list.
  • Model energy, forces, charges & charge moments, bond orders, and more!
  • nn.Modules are written with minimal reference to the rest of the library; if you want to use them in your scripts without using the rest of the features provided here -- no problem!
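
As a taste of why such indexed operations are awkward in plain pytorch, here is a minimal sketch (generic pytorch, not hippynn code) of summing per-atom energies into per-molecule energies with a scatter-add, the kind of linear-scaling reduction these layers wrap:

    import torch

    # Generic pytorch illustration (not hippynn code): sum per-atom energies
    # into per-molecule totals with a scatter-add style reduction.
    atom_energies = torch.tensor([0.1, 0.2, 0.3, 0.4, 0.5])  # five atoms in the batch
    mol_index = torch.tensor([0, 0, 0, 1, 1])                 # molecule each atom belongs to
    n_molecules = 2

    mol_energies = torch.zeros(n_molecules)
    mol_energies.index_add_(0, mol_index, atom_energies)      # cost is linear in the number of atoms
    print(mol_energies)  # tensor([0.6000, 0.9000])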

Graph level API for simple and flexible construction of models from pytorch components.

  • Build models based on the abstract physics/mathematics of the problem, without having to think about implementation details (see the sketch after this list).
  • Graph nodes support native python syntax, for example different forms of loss can be directly added.
  • Link predicted values in the model with database entries to compare predicted and true values.
  • IndexType logic records metadata about tensor structure, and provides automatic conversion to compatible structures when possible.
  • Graph API is independent of module implementation.
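
A rough sketch of the graph-level workflow, modeled on the pattern in the project's example scripts; the module paths, node names, and keyword arguments below are assumptions and may differ between versions:

    from hippynn.graphs import inputs, networks, targets, loss

    # Input nodes, linked to database entries by name (the db_name values are placeholders).
    species = inputs.SpeciesNode(db_name="Z")
    positions = inputs.PositionsNode(db_name="R")

    # A HIP-NN network node and an energy target built on top of it.
    network_params = {
        "possible_species": [0, 1, 6, 7, 8],   # example species list
        "n_features": 20,
        "n_sensitivities": 20,
        "dist_soft_min": 1.6,
        "dist_soft_max": 10.0,
        "dist_hard_max": 12.5,
    }
    network = networks.Hipnn("hipnn_model", (species, positions), module_kwargs=network_params)
    henergy = targets.HEnergyNode("HEnergy", network)

    # Losses are composed with native python syntax on graph nodes.
    rmse_energy = loss.MSELoss.of_node(henergy) ** 0.5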

Plot level API for tracking your training.

  • Using the graph API, define quantities to evaluate before, during, or after training as figures using matplotlib.
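
For instance, a plot maker that compares predicted and true energies during training might look roughly like this (PlotMaker and Hist2D follow the documented names, but treat the exact signatures as assumptions):

    from hippynn import plotting

    # Hedged sketch: compare predicted vs. true energies every 10 epochs;
    # henergy is an energy node from a model graph, as in the sketch above.
    plot_maker = plotting.PlotMaker(
        plotting.Hist2D.compare(henergy, saved=True),
        plot_every=10,
    )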

Training & Experiment API

  • Integrated with the graph-level API (a condensed setup sketch follows this list)
  • Pretty-printing loss metrics, generating plots periodically
  • Callbacks and checkpointing
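
A condensed sketch of a typical training setup, assuming training_modules and a database object were assembled earlier with the graph-level API; the argument names follow the documented pattern but should be treated as assumptions:

    import torch
    from hippynn.experiment import SetupParams, setup_and_train

    # training_modules and database are assumed to come from earlier assembly steps.
    setup_params = SetupParams(
        stopping_key="RMSE-energy",     # placeholder name of a validation metric
        batch_size=64,
        optimizer=torch.optim.Adam,
        learning_rate=1e-3,
        max_epochs=100,
    )
    setup_and_train(
        training_modules=training_modules,
        database=database,
        setup_params=setup_params,
    )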

Custom Kernels for fast execution

  • Certain operations are not efficient when written in pure pytorch; we provide alternative implementations with numba (a toggle sketch follows this list).
  • These are directly linked in with pytorch Autograd -- use them like native pytorch functions.
  • These provide advantages in both memory footprint and speed.
  • Custom kernels support both CPU and GPU execution.
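
The kernels can be switched on or off at runtime; a minimal sketch, assuming the set_custom_kernels toggle described in the documentation (name and location are assumptions):

    from hippynn import custom_kernels

    # Enable the numba-backed kernels if available; passing False falls back
    # to the pure-pytorch implementations. (Assumed toggle; check the docs.)
    custom_kernels.set_custom_kernels(True)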

Interfaces

  • ASE: Define ASE calculators based on the graph-level API (sketched after this list).
  • PYSEQM: Use PYSEQM calculations as nodes in a graph.
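
Attaching a trained energy node to ASE might look roughly like the following; HippynnCalculator is the documented entry point, but the exact constructor signature here is an assumption:

    from ase import Atoms
    from hippynn.interfaces.ase_interface import HippynnCalculator

    # henergy is an energy node from a trained model graph, as in the sketches above.
    calc = HippynnCalculator(henergy)

    # A toy water molecule (atom order H, H, O); the geometry is illustrative only.
    atoms = Atoms("H2O", positions=[[0.0, 0.76, -0.48], [0.0, -0.76, -0.48], [0.0, 0.0, 0.12]])
    atoms.calc = calc
    print(atoms.get_potential_energy())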

Installation

  • Clone this repository and navigate into it.
  • Run pip install .

If you feel like tinkering, do an editable install: pip install -e .

You can install with all optional dependencies from pip using: pip install -e .[full]
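
After installation, a quick smoke test from python confirms which installed copy is being imported:

    import hippynn

    # Prints the path of the installed package, confirming the import works.
    print(hippynn.__file__)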

Notes

  • Install dependencies with pip from requirements.txt.
  • Install dependencies with conda from conda_requirements.txt.
  • If you don't want pip to install the dependencies, conda install them from file before installing hippynn. You may want to use -c pytorch for the pytorch channel, and -c conda-forge for ase.
  • Optional dependencies are listed in optional_dependencies.txt.

hippynn is currently under active development. At the moment you should be prepared for breaking changes -- keep track of which version you are using if you need to maintain consistency.

As we clean up the rough edges, we are preparing a manuscript. If, in the meantime, you are using hippynn in your work, please cite this repository and the HIP-NN paper:

Lubbers, N., Smith, J. S., & Barros, K. (2018). Hierarchical modeling of molecular energies using a deep neural network. The Journal of Chemical Physics, 148(24), 241715.

See AUTHORS.txt for information on authors.

See LICENSE.txt for licensing information. hippynn is licensed under the BSD-3 license.

Triad National Security, LLC (Triad) owns the copyright to hippynn, which it identifies as project number LA-CC-19-093.

Copyright 2019. Triad National Security, LLC. All rights reserved. This program was produced under U.S. Government contract 89233218CNA000001 for Los Alamos National Laboratory (LANL), which is operated by Triad National Security, LLC for the U.S. Department of Energy/National Nuclear Security Administration. All rights in the program are reserved by Triad National Security, LLC, and the U.S. Department of Energy/National Nuclear Security Administration. The Government is granted for itself and others acting on its behalf a nonexclusive, paid-up, irrevocable worldwide license in this material to reproduce, prepare derivative works, distribute copies to the public, perform publicly and display publicly, and to permit others to do so.
