An implementation of Deep Forest 2021.2.1.

Overview

Deep Forest (DF) 21


DF21 is an implementation of Deep Forest 2021.2.1. It is designed to have the following advantages:

  • Powerful: Better accuracy than existing tree-based ensemble methods.
  • Easy to Use: Less effort on tuning parameters.
  • Efficient: Fast training speed and high efficiency.
  • Scalable: Capable of handling large-scale data.

Whenever one uses tree-based machine learning approaches such as Random Forest or GBDT, DF21 may offer a powerful new option.

For a quick start, please refer to How to Get Started. For detailed guidance on parameter tuning, please refer to Parameters Tuning.

Installation

The package is available via PyPI using:

pip install deep-forest

Quickstart

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

from deepforest import CascadeForestClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = CascadeForestClassifier(random_state=1)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
acc = accuracy_score(y_test, y_pred) * 100
print("\nTesting Accuracy: {:.3f} %".format(acc))
>>> Testing Accuracy: 98.667 %

Resources

Reference

@article{zhou2019deep,
    title={Deep forest},
    author={Zhi-Hua Zhou and Ji Feng},
    journal={National Science Review},
    volume={6},
    number={1},
    pages={74--86},
    year={2019}}

@inproceedings{zhou2017deep,
    title={{Deep Forest:} Towards an alternative to deep neural networks},
    author={Zhi-Hua Zhou and Ji Feng},
    booktitle={IJCAI},
    pages={3553--3559},
    year={2017}}

Acknowledgement

The lead developer and maintainer of DF21 is Mr. Yi-Xuan Xu. Before its release, DF21 had been used internally within the LAMDA Group, Nanjing University, China.

Comments
  • Custom CascadeForestClassifier

    Custom CascadeForestClassifier

    Hey,

    Thanks for your awesome repo.

    I have a question, if you don't mind: could you please give me an example of how to change the RandomForestClassifier and ExtraTreesClassifier used inside the CascadeForestClassifier?
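
    Not an official answer, but based on the `set_estimator` API shown in the [ENH] pull request further down this page, a sketch along these lines should work; the scikit-learn forests here stand in for whatever estimators you prefer, and `X_train`/`y_train` are assumed to be prepared as in the quickstart above:

    from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
    from deepforest import CascadeForestClassifier

    model = CascadeForestClassifier(random_state=1)

    # Two forests of each kind, so every cascade layer still mixes two estimator types.
    estimators = [
        RandomForestClassifier(n_estimators=100, random_state=1),
        RandomForestClassifier(n_estimators=100, random_state=2),
        ExtraTreesClassifier(n_estimators=100, random_state=3),
        ExtraTreesClassifier(n_estimators=100, random_state=4),
    ]
    model.set_estimator(estimators)

    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)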

    opened by Maryom 31
  • Starting the interpretability of the Deep Forest using SHAP

    Starting the interpretability of the Deep Forest using SHAP

    Hey,

    This is an initial implementation; however, I'm not sure it will work. I see that we get the following error:

    AttributeError: 'CascadeForestClassifier' object has no attribute 'estimators_'

    What do you think @xuyxu ?
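
    As a hedged, model-agnostic workaround (not DF21's own API), `shap.KernelExplainer` only needs a callable prediction function, so it sidesteps the missing `estimators_` attribute at the cost of being slow. A minimal sketch, assuming a fitted `model` and `X_train`/`X_test` as in the quickstart:

    import shap

    # KernelExplainer works from model.predict_proba alone, so it does not
    # rely on the internal `estimators_` attribute.
    background = X_train[:100]                        # small background sample
    explainer = shap.KernelExplainer(model.predict_proba, background)
    shap_values = explainer.shap_values(X_test[:10])  # explain a few test rows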

    opened by Maryom 25
  • Error: could not allocate 0 bytes

    Error: could not allocate 0 bytes

    When I was using this package, I ran into the following problem. From what I can see, there is still plenty of available memory, so what could be the problem?

      File "deepforest/tree/_tree.pyx", line 123, in deepforest.tree._tree.DepthFirstTreeBuilder.build
      File "deepforest/tree/_tree.pyx", line 256, in deepforest.tree._tree.DepthFirstTreeBuilder.build
      File "deepforest/tree/_tree.pyx", line 480, in deepforest.tree._tree.Tree._resize_node_c
      File "deepforest/tree/_utils.pyx", line 34, in deepforest.tree._utils.safe_realloc
    MemoryError: could not allocate 0 bytes
    
    bug 
    opened by hengzhe-zhang 23
  • Add support for pandas.DataFrame and list in `fit`

    Add support for pandas.DataFrame and list in `fit`

    Currently, the fit method only supports np.array input. However, most ML algorithms with a scikit-learn-compatible API (e.g. XGBoost, NGBoost) accept a DataFrame, a list, or a numpy array of predictors (n x p) in numeric format by using sklearn.utils.check_array. This PR is to make Deep-Forest consistent with other algorithms so that it is easier to use in other integrated machine learning frameworks (e.g. PyCaret).
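
    Until such support lands, a minimal sketch of the manual conversion users can apply themselves, using sklearn.utils.check_array on a DataFrame built from the quickstart data:

    import pandas as pd
    from sklearn.datasets import load_digits
    from sklearn.utils import check_array
    from deepforest import CascadeForestClassifier

    X, y = load_digits(return_X_y=True)
    X_df = pd.DataFrame(X)  # pretend the data arrives as a DataFrame

    # Convert/validate up front, the same way scikit-learn compatible
    # libraries do internally, before handing the data to DF21.
    X_np = check_array(X_df, dtype="numeric")

    model = CascadeForestClassifier(random_state=1)
    model.fit(X_np, y)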

    opened by IncubatorShokuhou 13
  • [BUG] `CascadeForestRegressor` somehow cannot be inserted into a DataFrame

    [BUG] `CascadeForestRegressor` somehow cannot be inserted into a DataFrame

    Describe the bug: CascadeForestRegressor somehow cannot be inserted into a DataFrame.

    To Reproduce

    import pandas as pd
    from deepforest import CascadeForestRegressor
    from ngboost import NGBRegressor
    
    ngr = NGBRegressor()  # ngboost regressor as an example; xgb, lgb should also be no problem
    cforest = CascadeForestRegressor()
    df = pd.DataFrame()
    
    # somehow OK
    df.insert(0, "ngr", [ngr])
    # somehow error
    df.insert(0, "cf", [cforest])
    

    Expected behavior: No error.

    Additional context

    ValueError                                Traceback (most recent call last)
    <ipython-input-32-ab0139d10254> in <module>
    ----> 1 df.insert(0, "cf", [cforest])
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/frame.py in insert(self, loc, column, value, allow_duplicates)
       3760             )
       3761         self._ensure_valid_index(value)
    -> 3762         value = self._sanitize_column(column, value, broadcast=False)
       3763         self._mgr.insert(loc, column, value, allow_duplicates=allow_duplicates)
       3764 
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/frame.py in _sanitize_column(self, key, value, broadcast)
       3900             if not isinstance(value, (np.ndarray, Index)):
       3901                 if isinstance(value, list) and len(value) > 0:
    -> 3902                     value = maybe_convert_platform(value)
       3903                 else:
       3904                     value = com.asarray_tuplesafe(value)
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/dtypes/cast.py in maybe_convert_platform(values)
        110     """ try to do platform conversion, allow ndarray or list here """
        111     if isinstance(values, (list, tuple, range)):
    --> 112         values = construct_1d_object_array_from_listlike(values)
        113     if getattr(values, "dtype", None) == np.object_:
        114         if hasattr(values, "_values"):
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/pandas/core/dtypes/cast.py in construct_1d_object_array_from_listlike(values)
       1636     # making a 1D array that contains list-likes is a bit tricky:
       1637     result = np.empty(len(values), dtype="object")
    -> 1638     result[:] = values
       1639     return result
       1640 
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/deepforest/cascade.py in __getitem__(self, index)
        518 
        519     def __getitem__(self, index):
    --> 520         return self._get_layer(index)
        521 
        522     def _get_n_output(self, y):
    
    /mnt/hdd2/lvhao/miniconda3/envs/pycaret/lib/python3.7/site-packages/deepforest/cascade.py in _get_layer(self, layer_idx)
        561             logger.debug("self.n_layers_ = "+ str(self.n_layers_))
        562             logger.debug("layer_idx = "+ str(layer_idx))
    --> 563             raise ValueError(msg.format(self.n_layers_ - 1, layer_idx))
        564 
        565         layer_key = "layer_{}".format(layer_idx)
    
    ValueError: The layer index should be in the range [0, 1], but got 2 instead.
    

    This bug can be simply fixed if we change `if not 0 <= layer_idx < self.n_layers_:` to `if not 0 <= layer_idx <= self.n_layers_:`, but I still don't know the cause of this error and whether this fix is correct.
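
    Until the root cause is settled, one hedged, untested workaround sketch is to hand pandas a pre-built object array, so it never iterates the model through `__getitem__` while building the column:

    import numpy as np
    import pandas as pd
    from deepforest import CascadeForestRegressor

    cforest = CascadeForestRegressor()

    # Wrap the model in a one-element object array; pandas can store the
    # array as-is instead of iterating over the model.
    wrapped = np.empty(1, dtype=object)
    wrapped[0] = cforest

    df = pd.DataFrame()
    df.insert(0, "cf", wrapped)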

    needtriage 
    opened by IncubatorShokuhou 10
  • [Question] use custom estimator to tackle imbalanced datasets

    [Question] use custom estimator to tackle imbalanced datasets

    Hi All,

    As I mentioned in another post, I want to express my gratitude for your amazing research. I am delighted that you found time to support custom estimators in your library. However, I am having difficulty with the following:

    Assume I develop the following implementation (using imblearn) and obtain an AUROC score of 0.62:

    model = BalancedRandomForestClassifier(random_state=global_seed_random_state,
                                           class_weight="balanced_subsample",
                                           n_jobs=-1,
                                           replacement=True,
                                           )
    
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    show_output(model, X_test, y_test, y_pred)
    
    
    Classification_report:
                  precision    recall  f1-score   support
    
               0       0.97      0.61      0.75       499
               1       0.08      0.63      0.14        27
    
        accuracy                           0.61       526
       macro avg       0.52      0.62      0.45       526
    weighted avg       0.92      0.61      0.72       526
    
    ROC AUC Score:
    0.6204260372597047
    

    According to the reviews I have been reading of your original paper, if we get good results with RF and other similar classifiers, it is worthwhile to try Deep Forest with the classifier that worked well as the base learner. However, I attempted to use custom estimators via the following implementation:

    model = CascadeForestClassifier(
        random_state=global_seed_random_state,
    )
    
    main_estimators = [BalancedRandomForestClassifier(
        class_weight="balanced_subsample",
        n_jobs=-1,
        replacement=True,
        random_state=global_seed_random_state,
    ) for _ in range(2)]
    
    
    diverse_estimators = [BalancedRandomForestClassifier(
        class_weight="balanced_subsample",
        n_jobs=-1,
        replacement=True,
        random_state=global_seed_random_state,
    ) for _ in range(2)]
    
    estimators = main_estimators + diverse_estimators
    
    # layer
    model.set_estimator(estimators)
    

    The results, however, are about 10% worse, with an AUROC of 0.555. Note: diverse_estimators appears above because I also tried adding ExtraTrees or XGBoost instead of a second set of BalancedRandomForestClassifiers. Could you please point me in the right direction? What did I do incorrectly? From your perspective, what type of diversified classifier should I use? Note 2: an AUROC above 0.6 is quite promising for my current application.

    Thank you very much for your help in advance. Have a great day,
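
    For reference, a minimal sketch (assuming the same X_train/X_test/y_train/y_test splits as the baseline and the cascade configured above) of how the model would be fitted and scored on AUROC from predicted probabilities:

    from sklearn.metrics import roc_auc_score

    model.fit(X_train, y_train)
    y_proba = model.predict_proba(X_test)[:, 1]   # probability of the positive class
    print("ROC AUC Score:", roc_auc_score(y_test, y_proba))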

    opened by simonprovost 8
  • Survival models

    Survival models

    Hi maintainer,

    I am wondering whether it is possible to cascade a random survival forest (maybe a sksurv model) instead of RF in your deep forest model. As in #48, it seems that the supported model types are classification and regression (or did I miss some part of the tutorial docs?).

    Thanks.

    feature request 
    opened by yunwezhang 8
  • [ENH] Support customized base estimator and predictor

    [ENH] Support customized base estimator and predictor

    resolves #29 #26

    Steps

    • [x] Implement K-Fold wrapper for base estimators
    • [x] Implement customized cascade layer
    • [x] Implement set_estimator and set_predictor for the model
    • [x] Add unit tests
    • [x] Add backward compatibility
    • [x] Add documentation and working examples

    Code Snippet

    from deepforest import CascadeForestClassifier
    
    model = CascadeForestClassifier()
    
    # New Steps
    estimator_1, estimator_2 = your_estimator(), your_estimator()
    model.set_estimator(estimator=[estimator_1, estimator_2],  # a list of your base estimators
                        n_splits=5,  # the number of folds
                        oob_approx=False,  # whether to use out-of-bag approximation
                        random_state=None)  # random state used for base estimators
    
    model.set_predictor(predictor=your_predictor)  # an instantiated object of your predictor
    
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    
    feature request 
    opened by xuyxu 8
  • Label encoder for the case where y is 1-D.

    Label encoder for the case where y is 1-D.

    Resolved issue #13

    This is a very naive label encoder implemented with sklearn.preprocessing.LabelEncoder; a minimal sketch of the mapping it wraps follows the checklist below.

    • [x] single output (1-D) partial mode
    • [x] single output (1-D) full mode
    • [x] unit test
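
    A minimal sketch of the underlying sklearn.preprocessing.LabelEncoder round trip (the labels and predictions here are purely illustrative):

    from sklearn.preprocessing import LabelEncoder

    y_train = ["cat", "dog", "dog", "cat"]          # illustrative string labels

    le = LabelEncoder()
    y_train_enc = le.fit_transform(y_train)         # -> array([0, 1, 1, 0])

    # ... fit the model on y_train_enc and predict to get y_pred ...
    y_pred = [1, 0]
    y_pred_labels = le.inverse_transform(y_pred)    # -> array(['dog', 'cat'])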
    opened by NiMaZi 8
  • can't install package use conda env

    can't install package use conda env

    ERROR: Could not find a version that satisfies the requirement deep-forest (from versions: none)
    ERROR: No matching distribution found for deep-forest

    System: macOS. Python version: 3.8.5. pip version: 20.2.4.

    opened by morestart 7
  • Buffer dtype mismatch

    Buffer dtype mismatch

    An error occurred when training on my dataset:

      File "deepforest/_cutils.pyx", line 59, in deepforest._cutils._map_to_bins
      File "deepforest/_cutils.pyx", line 76, in deepforest._cutils._map_to_bins
    ValueError: Buffer dtype mismatch, expected 'const X_DTYPE_C' but got 'long'
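
    A hedged guess at a workaround, not a confirmed fix: the binner appears to expect a floating-point feature matrix, so integer-typed input may trigger the dtype mismatch. Assuming `model`, `X_train`, and `y_train` as in the quickstart:

    import numpy as np

    # Assumption: casting the features to float64 before fitting satisfies
    # the expected X_DTYPE in the binner.
    X_train = np.ascontiguousarray(X_train, dtype=np.float64)
    model.fit(X_train, y_train)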

    bug 
    opened by Mr-memorandum 7
  • pip install deep-forest didn't work in wsl2

    pip install deep-forest didn't work in wsl2

    I was trying to install the package using WSL2, but the terminal raised an error:

    ERROR: Could not find a version that satisfies the requirement deep-forest (from versions: none)
    ERROR: No matching distribution found for deep-forest

    I can't find any related articles or even a Stack Overflow post that solves this. Please help me.

    opened by romfahrury 4
  • How to apply shap model to DF model to interpret features?

    How to apply shap model to DF model to interpret features?

    How can I apply a SHAP model to the DF model to interpret features? I tried to apply it directly, but it reported that the model is not supported by the SHAP package (https://github.com/slundberg/shap). I suggest that the author improve the interpretability of the DF model. Thanks.

    opened by Leopoldxxx 2
  • importing error

    importing error

    Got this error when importing:

    ImportError                               Traceback (most recent call last)
    Input In [59], in <cell line: 24>()
         22 import time
         23 import io
    ---> 24 from deepforest import CascadeForestRegressor
         25 import joblib
         26 from sklearn.utils.fixes import joblib

    File ~\anaconda3\lib\site-packages\deepforest\__init__.py:1, in
    ----> 1 from .cascade import CascadeForestClassifier, CascadeForestRegressor
          2 from .forest import RandomForestClassifier, RandomForestRegressor
          3 from .forest import ExtraTreesClassifier, ExtraTreesRegressor

    File ~\anaconda3\lib\site-packages\deepforest\cascade.py:17, in
         15 from . import _utils
         16 from . import _io
    ---> 17 from ._layer import (
         18     ClassificationCascadeLayer,
         19     RegressionCascadeLayer,
         20     CustomCascadeLayer,
         21 )
         22 from ._binner import Binner
         25 def _get_predictor_kwargs(predictor_kwargs, **kwargs) -> dict:

    File ~\anaconda3\lib\site-packages\deepforest\_layer.py:17, in
         14 from sklearn.base import BaseEstimator, ClassifierMixin, RegressorMixin
         16 from . import _utils
    ---> 17 from ._estimator import Estimator
         18 from .utils.kfoldwrapper import KFoldWrapper
         21 def _build_estimator(
         22     X,
         23     y,
        (...)
         32     sample_weight=None,
         33 ):

    File ~\anaconda3\lib\site-packages\deepforest\_estimator.py:7, in
          4 __all__ = ["Estimator"]
          6 import numpy as np
    ----> 7 from .forest import (
          8     RandomForestClassifier,
          9     ExtraTreesClassifier,
         10     RandomForestRegressor,
         11     ExtraTreesRegressor,
         12 )
         13 from sklearn.ensemble import (
         14     RandomForestClassifier as sklearn_RandomForestClassifier,
         15     ExtraTreesClassifier as sklearn_ExtraTreesClassifier,
         16     RandomForestRegressor as sklearn_RandomForestRegressor,
         17     ExtraTreesRegressor as sklearn_ExtraTreesRegressor,
         18 )
         21 def make_classifier_estimator(
         22     name,
         23     criterion,
        (...)
         30 ):
         31     # RandomForestClassifier

    File ~\anaconda3\lib\site-packages\deepforest\forest.py:34, in
         32 from sklearn.utils import check_random_state, compute_sample_weight
         33 from sklearn.exceptions import DataConversionWarning
    ---> 34 from sklearn.utils.fixes import _joblib_parallel_args
         35 from sklearn.utils.validation import check_is_fitted, _check_sample_weight
         36 from sklearn.utils.validation import _deprecate_positional_args

    ImportError: cannot import name '_joblib_parallel_args' from 'sklearn.utils.fixes' (C:\Users\Mohammad\anaconda3\lib\site-packages\sklearn\utils\fixes.py)

    scikit-learn was upgraded, joblib was upgraded, and I still got the error.

    opened by MohammadSoltani100 2
  • [BUG] cannot correctly clone `CascadeForestRegressor` with `sklearn.base.clone` when using customized estimators

    [BUG] cannot correctly clone `CascadeForestRegressor` with `sklearn.base.clone` when using customized estimators

    Describe the bug: cannot correctly clone a CascadeForestClassifier/CascadeForestRegressor object with sklearn.base.clone when using customized estimators.

    To Reproduce

    from sklearn.datasets import load_boston
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error
    from sklearn.base import clone
    from deepforest import CascadeForestRegressor
    import xgboost as xgb
    import lightgbm as lgb
    
    X, y = load_boston(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
    model = CascadeForestRegressor(random_state=1)
    
    # set estimator
    n_estimators = 4  # the number of base estimators per cascade layer
    estimators = [lgb.LGBMRegressor(random_state=i)  for i in range(n_estimators)]
    model.set_estimator(estimators)
    
    # set predictor 
    predictor = xgb.XGBRegressor()
    model.set_predictor(predictor)
    
    # clone model
    model_new = clone(model)
    
    # try to fit
    model.fit(X_train, y_train)
    

    Expected behavior: No error.

    Additional context

    ~/miniconda3/envs/pycaret/lib/python3.8/site-packages/deep_forest-0.1.5-py3.8-linux-x86_64.egg/deepforest/cascade.py in fit(self, X, y, sample_weight)
       1004                 if not hasattr(self, "predictor_"):
       1005                     msg = "Missing predictor after calling `set_predictor`"
    -> 1006                     raise RuntimeError(msg)
       1007 
       1008             binner_ = Binner(
    
    RuntimeError: Missing predictor after calling `set_predictor`
    

    This bug occurs because when the model is cloned, if the model has a customized predictor or estimators, predictor='custom' will be cloned, while self.predictor_ / self.dummy_estimators will not be correctly cloned, which introduces the bug described above.

    I think this bug can be easily fixed by putting the predictor and the list of estimators into the parameters of CascadeForestClassifier/CascadeForestRegressor, just like those meta-estimators (e.g. ngboost) do, but maybe the corresponding APIs will have to be changed.

    For example, the API parameters could be:

    model = CascadeForestRegressor(
        estimators=[lgb.LGBMRegressor(random_state=i) for i in range(n_estimators)],
        predictor=xgb.XGBRegressor(),
    )
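
    Until such an API change lands, a hedged workaround (untested sketch, reusing the names from the reproduction snippet above) is to re-attach the customized parts on the cloned model:

    # `clone` only copies constructor parameters, so re-attach the customized
    # estimators and predictor on the cloned model before fitting it.
    model_new = clone(model)
    model_new.set_estimator(
        [lgb.LGBMRegressor(random_state=i) for i in range(n_estimators)]
    )
    model_new.set_predictor(xgb.XGBRegressor())
    model_new.fit(X_train, y_train)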
    
    needtriage 
    opened by IncubatorShokuhou 1
  • take() got an unexpected keyword argument 'axis'

    take() got an unexpected keyword argument 'axis'

    Got an error with this code:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    from deepforest import CascadeForestClassifier

    model = CascadeForestClassifier(random_state=1)
    model.fit(X_train, y_train)


    TypeError                                 Traceback (most recent call last)
    in <module>
          6
          7 model = CascadeForestClassifier(random_state=1)
    ----> 8 model.fit(X_train, y_train.values.ravel())

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/cascade.py in fit(self, X, y, sample_weight)
       1395         y = self._encode_class_labels(y)
       1396
    -> 1397         super().fit(X, y, sample_weight)
       1398
       1399     def predict_proba(self, X):

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/cascade.py in fit(self, X, y, sample_weight)
        754
        755         # Bin the training data
    --> 756         X_train_ = self._bin_data(binner, X, is_training_data=True)
        757         X_train_ = self.buffer_.cache_data(0, X_train_, is_training_data=True)
        758

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/cascade.py in _bin_data(self, binner, X, is_training_data)
        665         tic = time.time()
        666         if is_training_data:
    --> 667             X_binned = binner.fit_transform(X)
        668         else:
        669             X_binned = binner.transform(X)

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/sklearn/base.py in fit_transform(self, X, y, **fit_params)
        697         if y is None:
        698             # fit method of arity 1 (unsupervised transformation)
    --> 699             return self.fit(X, **fit_params).transform(X)
        700         else:
        701             # fit method of arity 2 (supervised transformation)

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/_binner.py in fit(self, X)
        128         self.validate_params()
        129
    --> 130         self.bin_thresholds = _find_binning_thresholds(
        131             X,
        132             self.n_bins - 1,

    /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/deepforest/_binner.py in _find_binning_thresholds(X, n_bins, bin_subsample, bin_type, random_state)
         75     if n_samples > bin_subsample:
         76         subset = rng.choice(np.arange(n_samples), bin_subsample, replace=False)
    ---> 77         X = X.take(subset, axis=0)
         78
         79     binning_thresholds = []

    TypeError: take() got an unexpected keyword argument 'axis'

    The dataset is loaded with vaex; is this a problem particular to vaex?
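
    Not vaex-specific advice, but a hedged sketch of a possible workaround: DF21's binner calls X.take(..., axis=0), which plain NumPy arrays support, so materializing the data as NumPy before fitting may avoid the incompatible call. Whether np.asarray works directly on a vaex column set is an assumption; vaex's own export utilities may be needed first, and the data must fit in memory.

    import numpy as np

    # Assumption: X_train / y_train can be materialized as NumPy arrays.
    X_np = np.asarray(X_train, dtype=np.float64)
    y_np = np.asarray(y_train).ravel()

    model = CascadeForestClassifier(random_state=1)
    model.fit(X_np, y_np)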

    enhancement 
    opened by JiaLeXian 5
Releases(v0.1.7)
  • v0.1.7(Oct 1, 2022)

  • v0.1.6(Sep 17, 2022)

  • v0.1.5(Apr 16, 2021)

  • v0.1.4(Mar 11, 2021)

    Added

    • Add support on customized estimators (#48) @xuyxu
    • Add official support for ManyLinux-aarch64 (#47) @xuyxu

    Fixed

    • Fix the prediction workflow with only one cascade layer (#56) @xuyxu
    • Fix inconsistency on predictor name (#52) @xuyxu
    • Fix accepted types of target for CascadeForestRegressor (#44) @xuyxu

    Improved

    • Improve target checks for CascadeForestRegressor (#53) @chendingyan
  • v0.1.3(Feb 22, 2021)

    Added

    • Add multi-output support for CascadeForestRegressor (#40) @Alex-Medium
    • Add layer-wise feature importances (#39) @xuyxu
    • Add scikit-learn backend (#36) @xuyxu
    • Add official support for Mac-OS (#34) @T-Allen-sudo
    • Add support on configurable criterion (#28) @tczhao
  • v0.1.2(Feb 11, 2021)

  • v0.1.1(Feb 7, 2021)

    Added

    • Implement the get_forest() method for efficient indexing (#22) @xuyxu
    • Support class label encoding (#18) @NiMaZi
    • Support sample weight in fit() (#7) @tczhao
    • Add configurable predictor parameter (#9) @tczhao
    • Add base class BaseEstimator and ClassifierMixin (#8) @pjgao

    Fixed

    • Fix accepted data types on the binner (#23) @xuyxu
Owner
LAMDA Group, Nanjing University
LAMDA is affiliated with the National Key Laboratory for Novel Software Technology and the Department of Computer Science & Technology, Nanjing University.