OrthNet

TensorFlow, PyTorch and Numpy layers for generating multi-dimensional Orthogonal Polynomials

1. Installation
2. Usage
3. Polynomials
  4. Base Class (Poly)

Installation:

  1. the stable version:
    pip3 install orthnet

  2. the dev version:

git clone https://github.com/orcuslc/orthnet.git && cd orthnet
python3 setup.py build_ext --inplace && python3 setup.py install

Usage:

with TensorFlow

import tensorflow as tf
import numpy as np
from orthnet import Legendre

x_data = np.random.random((10, 2))
x = tf.placeholder(dtype = tf.float32, shape = [None, 2])
L = Legendre(x, 5)

with tf.Session() as sess:
    print(sess.run(L.tensor, feed_dict = {x: x_data}))
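
Note that this snippet uses the TensorFlow 1.x graph API (tf.placeholder and tf.Session). On TensorFlow 2.x those symbols are only available through the v1 compatibility module; a minimal sketch, assuming OrthNet's TensorFlow backend accepts graph-mode tensors built this way:

import tensorflow.compat.v1 as tf
tf.disable_eager_execution()  # restores the placeholder/Session workflow shown above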

with PyTorch

import torch
import numpy as np
from orthnet import Legendre

x = torch.DoubleTensor(np.random.random((10, 2)))
L = Legendre(x, 5)
print(L.tensor)

with Numpy

import numpy as np
from orthnet import Legendre

x = np.random.random((10, 2))
L = Legendre(x, 5)
print(L.tensor)
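
The resulting tensor has one row per input sample and one column per basis function; a minimal check, assuming the shape convention described in the Base class section below:

# rows = number of input samples, columns = number of basis functions (L.length)
assert L.tensor.shape == (10, L.length)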

Specify Backend

In some scenarios, users can specify the exact backend compatible with the input x (when no backend is given, it is inferred from the type of x). The backends provided are:

  • orthnet.TensorflowBackend()
  • orthnet.TorchBackend()
  • orthnet.NumpyBackend()

An example of specifying the backend is as follows.

import numpy as np
from orthnet import Legendre, NumpyBackend

x = np.random.random((10, 2))
L = Legendre(x, 5, backend = NumpyBackend())
print(L.tensor)

Specify tensor product combinations

In some scenarios, users may provide pre-computed tensor product combinations to save computing time. An example of providing combinations is as follows.

import numpy as np
from orthnet import Legendre, enum_dim

dim = 2
degree = 5
x = np.random.random((10, dim))
L = Legendre(x, degree, combinations = enum_dim(degree, dim))
print(L.tensor)
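
Since the combinations depend only on the degree and the dimension, they can be computed once with enum_dim and shared across constructions; a minimal sketch, assuming inputs of the same dimension:

combinations = enum_dim(degree, dim)
L_train = Legendre(np.random.random((100, dim)), degree, combinations = combinations)
L_test = Legendre(np.random.random((20, dim)), degree, combinations = combinations)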

Polynomials:

  • orthnet.Legendre(Poly): Legendre polynomial
  • orthnet.Legendre_Normalized(Poly): normalized Legendre polynomial
  • orthnet.Laguerre(Poly): Laguerre polynomial
  • orthnet.Hermite(Poly): Hermite polynomial of the first kind (used in probability theory)
  • orthnet.Hermite2(Poly): Hermite polynomial of the second kind (used in physics)
  • orthnet.Chebyshev(Poly): Chebyshev polynomial of the first kind
  • orthnet.Chebyshev2(Poly): Chebyshev polynomial of the second kind
  • orthnet.Jacobi(Poly, alpha, beta): Jacobi polynomial
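
Jacobi takes the extra parameters alpha and beta listed above; a minimal sketch, assuming they are accepted as keyword arguments:

import numpy as np
from orthnet import Jacobi

x = np.random.random((10, 2))
J = Jacobi(x, 5, alpha = 1, beta = 1)
print(J.tensor)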

Base class:

Class Poly(x, degree, combinations = None):

  • Inputs:
    • x: a tensor
    • degree: the highest degree of the target polynomials
    • combinations: optional, pre-computed tensor product combinations
  • Attributes (a usage sketch follows this list):
    • Poly.tensor: the tensor of function values (degrees 0 through Poly.degree inclusive)
    • Poly.length: the number of basis functions (columns) in Poly.tensor
    • Poly.index: the index of the first combination of each degree in Poly.combinations
    • Poly.combinations: all tensor product combinations
    • Poly.tensor_of_degree(degree): returns all polynomials of the given degree(s)
    • Poly.eval(coefficients): returns the function values for the given coefficients
    • Poly.quadrature(function, weight): returns the Gauss quadrature with the given function and weight
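
A minimal usage sketch of these attributes and methods, assuming the shapes implied by the descriptions above:

import numpy as np
from orthnet import Legendre

x = np.random.random((10, 2))
L = Legendre(x, 3)

print(L.length)                # number of basis functions (columns of L.tensor)
print(L.combinations)          # all tensor product combinations up to degree 3
print(L.tensor_of_degree(2))   # the basis functions of total degree 2
coefficients = np.ones(L.length)
print(L.eval(coefficients))    # values of the linear combination with these coefficients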
Comments
  • Cuda support

    Hi,

    First of all, thank you for developing this project. Is it possible to create the Jacobi.tensor on the GPU? Currently I am creating the tensor on the CPU and then moving it to the GPU, which is time consuming.

    Cheers

    opened by mariolinovIC 1
  • Jacobi polynomial incorrect evaluation

    Hi, I have noticed that when I evaluate the Jacobi polynomial with alpha=1 and beta=1 the results are not correct. In particular, I tried the range (-1, 1) and noticed the problem for n greater than 1 (i.e., 2, 3, 4). Thank you for your support.

    opened by mariolinovIC 0