EMGDecomp

Overview

Package for decomposing EMG signals into motor unit firings, created for Formento et al., 2021. Based heavily on Negro et al., 2016. Supports GPU acceleration via CUDA and distributed computation via Dask.

Installation

pip install emgdecomp

To use Dask and/or CUDA, install the corresponding extras instead:

pip install emgdecomp[dask]
pip install emgdecomp[cuda]

Usage

Basic

# Import locations reflect the package layout; EmgDecomposition lives in
# emgdecomp.decomposition (adjust the EmgDecompositionParams import if it resides elsewhere).
from emgdecomp.decomposition import EmgDecomposition
from emgdecomp.parameters import EmgDecompositionParams

# data should be a numpy array of n_channels x n_samples
sampling_rate, data = fetch_data(...)

decomp = EmgDecomposition(
  params=EmgDecompositionParams(
    sampling_rate=sampling_rate
  ))

firings = decomp.decompose(data)
print(firings)

The resulting firings object is a NumPy structured array containing the columns source_idx, discharge_samples, and discharge_seconds. source_idx is a 0-indexed ID for each "source" learned from the data; each source is a putative motor unit. discharge_samples indicates the sample at which the source was detected as "firing"; note that the algorithm can only detect sources up to a delay. discharge_seconds is the conversion of discharge_samples into seconds via the passed-in sampling rate.

As a structured NumPy array, the resulting firings object is suitable for conversion into a Pandas DataFrame:

import pandas as pd
print(pd.DataFrame(firings))
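
From there, standard pandas operations apply; for example, a brief sketch (reusing the firings array from above) of per-source discharge counts and mean firing rates:

import pandas as pd

df = pd.DataFrame(firings)
# Number of detected discharges per source (putative motor unit)
print(df.groupby('source_idx')['discharge_samples'].count())
# Mean firing rate per source, estimated from inter-discharge intervals (in Hz)
rates = (
    df.sort_values('discharge_seconds')
      .groupby('source_idx')['discharge_seconds']
      .apply(lambda t: 1.0 / t.diff().mean())
)
print(rates)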

And the "sources" (i.e. components corresponding to motor units) can be interrogated as needed via the decomp.model property:

model = decomp.model
print(model.components)

Advanced

Given an already-fit EmgDecomposition object, you can then decompose a new batch of EMG data with its existing sources via transform:

# Assumes decomp is already fit
new_data = fetch_more_data(...)
new_firings = decomp.transform(new_data)
print(new_firings)

Alternatively, you can add new sources (i.e. new putative motor units) while retaining the existing sources with decompose_batch:

# Assumes decomp is already fit

more_data = fetch_even_more_data(...)
# Firings corresponding to both the existing and any newly added sources
firings2 = decomp.decompose_batch(more_data)
# Should contain at least as many components as before decompose_batch()
print(decomp.model.components)

Finally, basic plotting capabilities are included as well:

from emgdecomp.plots import plot_firings, plot_muaps
plot_muaps(decomp, data, firings)
plot_firings(decomp, data, firings)

File I/O

The EmgDecomposition class is equipped with save and load methods for persisting parameters to disk as needed; for example:

with open('/path/to/decomp.pkl', 'wb') as f:
  decomp.save(f)

with open('/path/to/decomp.pkl', 'rb') as f:
  decomp_reloaded = EmgDecomposition.load(f)
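
Once reloaded, the object behaves like the original. As a quick sanity check (a sketch, assuming the new_data from the Advanced example above and that transform is deterministic for a fixed model), the reloaded decomposition should reproduce the same firings:

# Sketch: compare firings detected by the original and the reloaded decomposition.
firings_original = decomp.transform(new_data)
firings_reloaded = decomp_reloaded.transform(new_data)
assert len(firings_original) == len(firings_reloaded)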

Dask and/or CUDA

Both Dask and CUDA are supported within EmgDecomposition, enabling distributed computation across workers and/or GPU acceleration. Each is controlled via the use_dask and use_cuda boolean flags in the EmgDecomposition constructor.
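
As a rough sketch of how the flags fit together (whether EmgDecomposition picks up the default Dask client automatically is an assumption here, as are the data and sampling_rate variables reused from the Basic example):

from dask.distributed import Client

from emgdecomp.decomposition import EmgDecomposition
from emgdecomp.parameters import EmgDecompositionParams

# Start (or connect to) a Dask scheduler; requires the emgdecomp[dask] extra.
client = Client()

decomp = EmgDecomposition(
    params=EmgDecompositionParams(sampling_rate=sampling_rate),
    use_dask=True,   # distribute computation across Dask workers
    use_cuda=False,  # set True for GPU acceleration (requires the emgdecomp[cuda] extra)
)
firings = decomp.decompose(data)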

Parameter Tuning

See the list of parameters in EmgDecompositionParams. The defaults on master are set as they were used for Formento et al., 2021 and should be reasonable defaults for others.
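
For reference, here is a small sketch for listing the tunable fields and their defaults, under the assumption that EmgDecompositionParams is a dataclass:

import dataclasses

from emgdecomp.parameters import EmgDecompositionParams

# Assumption: EmgDecompositionParams is a dataclass; if not, inspect vars(params) instead.
params = EmgDecompositionParams(sampling_rate=2000.0)
for field in dataclasses.fields(params):
    print(f'{field.name} = {getattr(params, field.name)}')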

Documentation

See the documentation for the EmgDecomposition and EmgDecompositionParams classes for more details.

Acknowledgements

If you enjoy this package and use it for your research, you can:

  • cite the Journal of Neural Engineering paper, Formento et al., 2021, for which this package was developed: TODO
  • cite this GitHub repo using its DOI: 10.5281/zenodo.5641426
  • star this repo using the top-right star button.

Contributing / Questions

Feel free to open issues in this project for questions or feature requests. Pull requests are very much encouraged, but consider creating an issue before implementation to make sure the desired change sounds appropriate.
