PyCEbox: Python Individual Conditional Expectation Plot Toolbox

Overview

[Figure: an example individual conditional expectation plot]

A Python implementation of individual conditional expectation plots inspired by R's ICEbox. Individual conditional expectation plots were introduced in Peeking Inside the Black Box: Visualizing Statistical Learning with Plots of Individual Conditional Expectation (arXiv:1309.6392).

Quickstart

pycebox is available on PyPI and can be installed with pip install pycebox.

The tutorial recreates the first example in the above paper using pycebox.
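
A minimal usage sketch is shown below. The synthetic data, the random-forest model, and the feature names are hypothetical, and the ice/ice_plot signatures are assumed from pycebox's public API and may differ across versions, so treat this as an illustration rather than the tutorial's canonical example.

    import matplotlib.pyplot as plt
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    from pycebox.ice import ice, ice_plot

    # Hypothetical data: three features, response driven by x1 and x2
    rng = np.random.default_rng(0)
    X = pd.DataFrame(rng.normal(size=(500, 3)), columns=['x1', 'x2', 'x3'])
    y = X['x1'] + X['x2'] ** 2 + rng.normal(scale=0.1, size=500)

    model = RandomForestRegressor(n_estimators=100).fit(X, y)

    # One ICE curve per row of X, varying 'x2' over a grid of its values
    ice_df = ice(X, 'x2', model.predict, num_grid_points=100)

    fig, ice_ax = plt.subplots()
    ice_plot(ice_df, plot_pdp=True, ax=ice_ax)
    ice_ax.set_xlabel('$x_2$')
    ice_ax.set_ylabel('predicted $y$')
    plt.show()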

Development

For easy development and prototyping using IPython notebooks, a Docker environment is included. To run an IPython notebook with access to your development version of pycebox, run PORT=8889 sh ./start_container.sh. A Jupyter notebook server with access to your development version of pycebox should be available at http://localhost:8889/tree.

To run pycebox's tests in your development container:

  1. Access a bash shell on the container with docker exec -it pycebox bash.
  2. Change to the pycebox directory with cd ../pycebox.
  3. Run the tests with pytest test/test.py.

Documentation

For details of pycebox's API, consult the documentation.

License

This library is distributed under the MIT License.

Comments
  • Typo in ice_plot() regarding _get_quantiles()

    There is a typo in the ice_plot() function when calling the _get_quantiles() function. On lines 124 and 137, ice_plot() calls __get_quantiles() (which is undefined) instead of _get_quantiles(), so trying to use quantiles or center the ICE curves results in an error.

    opened by savvastj 6
  • Using predicted probabilities for binary classification

    Is there any way to pass some form of predict_proba function to the ice() function in order to see the predicted probability rather than the class prediction? (One possible workaround is sketched after this list.)

    Thanks! Nema

    opened by nemasobhani 1
  • Plot mistake

    There is a problem in the visualization. When I try to plot the graph from the example, I get the following error:


    TypeError                                 Traceback (most recent call last)
    <ipython-input> in <module>
         12 ice_plot(ice_df, frac_to_plot=0.1,
         13          color_by='x3', cmap=PuOr,
    ---> 14          ax=ice_ax);
         15
         16 ice_ax.set_xlabel('$X_2$');

    C:\ProgramData\Anaconda3\lib\site-packages\pycebox\ice.py in ice_plot(ice_data, frac_to_plot, plot_points, point_kwargs, x_quantile, plot_pdp, centered, centered_quantile, color_by, cmap, ax, pdp_kwargs, **kwargs)
        128     if frac_to_plot < 1.:
        129         n_cols = ice_data.shape[1]
    --> 130         icols = np.random.choice(n_cols, size=frac_to_plot * n_cols, replace=False)
        131         plot_ice_data = ice_data.iloc[:, icols]
        132     else:

    mtrand.pyx in mtrand.RandomState.choice()

    TypeError: 'float' object cannot be interpreted as an integer

    opened by karakol15 4
  • "frac_to_plot" parameter in ice_plot

    Hey Austin,

    This package rocks, thanks for publishing it!

    I have a question and a potential small bug in the ice_plot method, specifically on the "frac_to_plot" parameter.

    It is my understanding that you simply take the fraction, multiply it by the number of columns, and pass the result to the "size" parameter of np.random.choice(). I think we should make sure that the number being passed is an integer, not a float, since np.random.choice() does not accept a float for "size".

    Current: icols = np.random.choice(n_cols, size=frac_to_plot * n_cols, replace=False)

    Fix: icols = np.random.choice(n_cols, size=int(frac_to_plot * n_cols), replace=False)

    Best, Andrew

    opened by andrew-cho 1
  • Extended use to classification models, fixed typecast bug

    • Extended use to classification models by allowing predict_proba to be passed to the ice_plot function.
    • Fixed the TypeError raised when a non-integer size is passed to np.random.choice
    opened by sanjifr3 0
  • Averaging ICE plots across multiple runs/folds of a model

    Hi Austin,

    I was wondering if it is possible to average across multiple runs/folds of the same model.

    I am trying at the moment, but the resulting ICE plots do not make sense. The per-run plots look right, but when I average them across both runs and folds the result is garbled. (One possible cause and a workaround are sketched after this list.)

    Cheers,

    Dan

    opened by danieltudosiu 0
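
Regarding the predicted-probabilities question above: assuming ice() accepts any callable that maps a feature table to a one-dimensional array of predictions (the classifier and data below are hypothetical), a thin wrapper around predict_proba should yield ICE curves on the probability scale.

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    from pycebox.ice import ice, ice_plot

    # Hypothetical binary-classification data
    rng = np.random.default_rng(1)
    X = pd.DataFrame(rng.normal(size=(300, 3)), columns=['x1', 'x2', 'x3'])
    y = (X['x1'] + X['x2'] > 0).astype(int)

    clf = RandomForestClassifier(n_estimators=100).fit(X, y)

    # Assuming ice() simply evaluates the callable it is given, wrapping
    # predict_proba produces curves for the positive-class probability.
    def predict_pos_proba(data):
        return clf.predict_proba(data)[:, 1]

    ice_df = ice(X, 'x2', predict_pos_proba, num_grid_points=50)
    ice_plot(ice_df, plot_pdp=True)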
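
Regarding averaging ICE plots across runs/folds: one plausible cause of garbled averages is that each run's ICE DataFrame is built on a different grid of feature values (and, if the folds contain different rows, different columns), so element-wise averaging misaligns the curves. Below is a sketch of one workaround, assuming each ICE DataFrame has the grid values as its index and one column per observation (as the traceback above suggests), and that all runs are evaluated on the same rows with the same num_grid_points.

    # Hypothetical: ice_dfs is a list of ICE DataFrames, one per run/fold, all
    # computed with ice() on the same rows and the same num_grid_points so that
    # their indexes (grid values) and columns line up exactly.
    def average_ice(ice_dfs):
        # pandas aligns element-wise on index and columns; mismatched grids
        # show up as NaNs, a quick signal that the runs are not comparable.
        return sum(ice_dfs) / len(ice_dfs)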

Releases
  • 0.0.1

Owner

Austin Rochford, Chief Data Scientist @ Kibo Commerce, recovering mathematician, enthusiastic Bayesian