Easily Process a Batch of Cox Models

Overview

ezcox: Easily Process a Batch of Cox Models

CRAN status Hits R-CMD-check Codecov test coverage Lifecycle: stable

The goal of ezcox is to run a batch of univariate or multivariate Cox models and return tidy results.

Installation

You can install the released version of ezcox from CRAN with:

install.packages("ezcox")

And the development version from GitHub with:

# install.packages("remotes")
remotes::install_github("ShixiangWang/ezcox")

It is also possible to install ezcox from the conda-forge channel:

conda install r-ezcox --channel conda-forge

The visualization features of ezcox require a recent version of forestmodel; please run the following command:

remotes::install_github("ShixiangWang/forestmodel")

🔰 Example

This is a basic example which shows you how to get results from a batch of Cox models.

library(ezcox)
#> Welcome to 'ezcox' package!
#> =======================================================================
#> You are using ezcox version 0.8.1
#> 
#> Github page  : https://github.com/ShixiangWang/ezcox
#> Documentation: https://shixiangwang.github.io/ezcox/articles/ezcox.html
#> 
#> Run citation("ezcox") to see how to cite 'ezcox'.
#> =======================================================================
#> 
library(survival)

# Build univariable models
ezcox(lung, covariates = c("age", "sex", "ph.ecog"))
#> => Processing variable age
#> ==> Building Surv object...
#> ==> Building Cox model...
#> ==> Done.
#> => Processing variable sex
#> ==> Building Surv object...
#> ==> Building Cox model...
#> ==> Done.
#> => Processing variable ph.ecog
#> ==> Building Surv object...
#> ==> Building Cox model...
#> ==> Done.
#> # A tibble: 3 × 12
#>   Variable is_control contrast_level ref_level n_contrast n_ref    beta    HR
#>   <chr>    <lgl>      <chr>          <chr>          <int> <int>   <dbl> <dbl>
#> 1 age      FALSE      age            age              228   228  0.0187 1.02 
#> 2 sex      FALSE      sex            sex              228   228 -0.531  0.588
#> 3 ph.ecog  FALSE      ph.ecog        ph.ecog          227   227  0.476  1.61 
#> # … with 4 more variables: lower_95 <dbl>, upper_95 <dbl>, p.value <dbl>,
#> #   global.pval <dbl>
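
Because ezcox returns a plain tibble, the results can be filtered or sorted like any data frame. A minimal sketch (base R only; the 0.05 cutoff is just an example, not an ezcox default):

res <- ezcox(lung, covariates = c("age", "sex", "ph.ecog"))
subset(res, p.value < 0.05)  # keep covariates significant at the 5% level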

# Build multivariable models
# Control for variable 'age'
ezcox(lung, covariates = c("sex", "ph.ecog"), controls = "age")
#> => Processing variable sex
#> ==> Building Surv object...
#> ==> Building Cox model...
#> ==> Done.
#> => Processing variable ph.ecog
#> ==> Building Surv object...
#> ==> Building Cox model...
#> ==> Done.
#> # A tibble: 4 × 12
#>   Variable is_control contrast_level ref_level n_contrast n_ref    beta    HR
#>   <chr>    <lgl>      <chr>          <chr>          <int> <int>   <dbl> <dbl>
#> 1 sex      FALSE      sex            sex              228   228 -0.513  0.599
#> 2 sex      TRUE       age            age              228   228  0.017  1.02 
#> 3 ph.ecog  FALSE      ph.ecog        ph.ecog          227   227  0.443  1.56 
#> 4 ph.ecog  TRUE       age            age              228   228  0.0113 1.01 
#> # … with 4 more variables: lower_95 <dbl>, upper_95 <dbl>, p.value <dbl>,
#> #   global.pval <dbl>
lung$ph.ecog = factor(lung$ph.ecog)
zz = ezcox(lung, covariates = c("sex", "ph.ecog"), controls = "age", return_models=TRUE)
#> => Processing variable sex
#> ==> Building Surv object...
#> ==> Building Cox model...
#> ==> Done.
#> => Processing variable ph.ecog
#> ==> Building Surv object...
#> ==> Building Cox model...
#> ==> Done.
mds = get_models(zz)
str(mds, max.level = 1)
#> List of 2
#>  $ Surv ~ sex + age    :List of 19
#>   ..- attr(*, "class")= chr "coxph"
#>   ..- attr(*, "Variable")= chr "sex"
#>  $ Surv ~ ph.ecog + age:List of 22
#>   ..- attr(*, "class")= chr "coxph"
#>   ..- attr(*, "Variable")= chr "ph.ecog"
#>  - attr(*, "class")= chr [1:2] "ezcox_models" "list"
#>  - attr(*, "has_control")= logi TRUE

show_models(mds)
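
show_models() draws a forest plot of the stored models (this is the visualization feature that needs the forestmodel package mentioned above). If you only want the plot, show_forest() builds the models and plots them in one call, as also used in the comments below; a minimal sketch:

show_forest(lung, covariates = c("sex", "ph.ecog"), controls = "age")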

🌟 Vignettes

📃 Citation

If you use ezcox in academic research, please cite the preprint arXiv:2110.14232 along with the URL of this repo.

Comments
  • Fast way to add interaction terms?

    Hi, just wondering how the interaction terms can be handled as "controls" here. Any way to add them rather than manually creating new 'interaction variables' in the data? Cheers.
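
    A manual workaround until interactions are supported directly (a minimal sketch; the sex_age column is purely illustrative and not part of ezcox):

    # Build the interaction term by hand and fit it as an ordinary covariate
    lung$sex_age <- lung$sex * lung$age
    ezcox(lung, covariates = "sex_age", controls = c("sex", "age"))  # assumes controls accepts several variables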

    opened by lijing-lin 12
  • similar tools or approach

    • https://github.com/kevinblighe/RegParallel https://bioconductor.org/packages/release/data/experiment/vignettes/RegParallel/inst/doc/RegParallel.html
    • https://pubmed.ncbi.nlm.nih.gov/25769333/
    opened by ShixiangWang 12
  • There is no show_models function

    install.packages("ezcox")#先安装包 packageVersion("ezcox")#0.4.0版本 library(survival) library(ezcox) library("devtools") install.packages("devtools") devtools::install_github("ShixiangWang/ezcox") lung$ph.ecog <- factor(lung$ph.ecog) zz <- ezcox(lung, covariates = c("sex", "ph.ecog"), controls = "age", return_models = TRUE) zz mds <- get_models(zz) str(mds, max.level = 1) install.packages("forestmodel") library("forestmodel") show_models(mds) 问题是没有show-models这个函数

    opened by demi0304 4
  • Parallel mode is not fast enough

    library(survival)

    ### write a function
    fastcox_single <- function(num) {
      data <- cbind(clin, expreset[, num])
      UniNames <- colnames(data)[-c(1:2)]
      do.call(rbind, lapply(UniNames, function(i) {
        surv <- as.formula(paste('Surv(times, status)~', i))
        cur_cox <- coxph(surv, data = data)
        x <- summary(cur_cox)
        HR <- x$coefficients[i, "exp(coef)"]
        HR.confint.lower <- signif(x$conf.int[i, "lower .95"], 3)
        HR.confint.upper <- signif(x$conf.int[i, "upper .95"], 3)
        CI <- paste0("(", HR.confint.lower, "-", HR.confint.upper, ")")
        p.value <- x$coef[i, "Pr(>|z|)"]
        data.frame(gene = i, HR = HR, CI = CI, p.value = p.value)
      }))
    }

    # Split the expression matrix into chunks of 100 columns and fit in parallel
    clin <- share.data[, 1:2]
    expreset <- share.data[, -c(1:2)]
    length <- ncol(expreset)
    groupdf <- data.frame(colnumber = seq(1, length),
                          group = rep(1:ceiling(length / 100), each = 100, length.out = length))
    index <- split(groupdf$colnumber, groupdf$group)
    library(future.apply)
    # options(future.globals.maxSize = 891289600)
    plan(multiprocess)  # note: recent versions of the future package prefer plan(multisession)
    share.data.os.result <- do.call(rbind, future_lapply(index, fastcox_single))

    #=== Use ezcox
    # devtools::install_github("ShixiangWang/ezcox")
    res <- ezcox::ezcox(share.data, covariates = colnames(share.data)[-(1:2)], parallel = TRUE, time = "times")

    share.data$VIM.INHBE
    tt <- ezcox::ezcox(share.data, covariates = "VIM.INHBE", return_models = TRUE, time = "times")

    For large-scale computation, the running time of the two approaches differs by about 4x.
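
    A quick way to time the two approaches side by side (a minimal sketch using base R's system.time(); share.data, index, and fastcox_single are the objects defined above):

    t_manual <- system.time(
      res_manual <- do.call(rbind, future_lapply(index, fastcox_single))
    )
    t_ezcox <- system.time(
      res_ezcox <- ezcox::ezcox(share.data,
                                covariates = colnames(share.data)[-(1:2)],
                                parallel = TRUE, time = "times")
    )
    rbind(manual = t_manual, ezcox = t_ezcox)  # compare the elapsed columns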

    enhancement 
    opened by ShixiangWang 3
  • Suggestions

    Shixiang:

    I am using this R package of yours and have two suggestions you could consider:

    1. Display covariates in the order given by the user; currently they are sorted alphabetically (a workaround sketch follows this list).
    2. For very large HR values, display them using scientific notation.
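
    A possible stopgap for point 1, applied outside ezcox to the returned tibble (a minimal sketch; scores.combined is the user's own data set, as in the code below):

    covs <- c("JSI", "Tindex", "Subclonal_Aca", "Subclonal_Nec", "ITH_Aca", "ITH_Nec")
    res <- ezcox(
      scores.combined,
      covariates = covs,
      controls = "Age",
      time = "Survival_months",
      status = "Death"
    )
    # Reorder rows to match the user-supplied covariate order
    res <- res[order(match(res$Variable, covs)), ]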

    Here is the code I used:

    zz = ezcox(
      scores.combined,
      covariates = c("JSI", "Tindex", "Subclonal_Aca", "Subclonal_Nec", "ITH_Aca", "ITH_Nec"),
      controls = "Age",
      time = "Survival_months",
      status = "Death",
      return_models = TRUE
    )
    
    mds = get_models(zz)
    
    show_models(mds, drop_controls = TRUE)
    
    

    Here is the current plot:

    [image: current forest plot]

    opened by qingjian1991 2
  • Change format setting including text size

    See

    library(survival)
    library(forestmodel)
    library(ezcox)
    show_forest(lung, covariates = c("sex", "ph.ecog"), controls = "age", format_options = forest_model_format_options(text_size = 3))
    

    [image: resulting forest plot]

    opened by ShixiangWang 0
  • Weekly Digest (22 September, 2019 - 29 September, 2019)

    opened by weekly-digest[bot] 0
  • Weekly Digest (15 September, 2019 - 22 September, 2019)

    weekly-digest 
    opened by weekly-digest[bot] 0
  • Weekly Digest (8 September, 2019 - 15 September, 2019)

    weekly-digest 
    opened by weekly-digest[bot] 0
  • Weekly Digest (1 September, 2019 - 8 September, 2019)

    weekly-digest 
    opened by weekly-digest[bot] 0
  • Weekly Digest (28 August, 2019 - 4 September, 2019)

    weekly-digest 
    opened by weekly-digest[bot] 0
Releases (v1.0.1)
Owner: Shixiang Wang