Transfer Learning for Survival Analysis via Efficient L2,1-norm Regularized Cox Regression

Cited by: 0
Authors
Li, Yan [1 ]
Wang, Lu [2 ]
Wang, Jie [1 ]
Ye, Jieping [1 ,3 ]
Reddy, Chandan K. [4 ]
Affiliations
[1] Univ Michigan, Dept Computat Med & Bioinformat, Ann Arbor, MI 48109 USA
[2] Wayne State Univ, Dept Comp Sci, Detroit, MI 48202 USA
[3] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
[4] Virginia Tech, Dept Comp Sci, Arlington, VA 22203 USA
Funding
US National Science Foundation
关键词
Transfer learning; survival analysis; regularization; regression; high-dimensional data; CANCER CELL-LINES; ALGORITHM; PATHS;
DOI
10.1109/ICDM.2016.129
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In survival analysis, the primary goal is to monitor several entities and model the occurrence of a particular event of interest. In such applications, the event of interest is often not observed during the study period, which gives rise to the problem of censoring and cannot be easily handled by standard regression approaches. In addition, obtaining sufficient labeled training instances for learning a robust prediction model is very time consuming and can be extremely difficult in practice. In this paper, we propose a transfer-learning-based Cox method, called Transfer-Cox, which uses auxiliary data to augment learning when there is an insufficient number of training examples. The proposed method aims to extract "useful" knowledge from the source domain and transfer it to the target domain, thus potentially improving prediction performance on such time-to-event data. The method uses the l(2,1)-norm penalty to encourage multiple predictors to share similar sparsity patterns, thereby learning a shared representation across the source and target domains and potentially improving model performance on the target task. To speed up the computation, we apply a screening approach and extend the strong rule to sparse survival analysis models on multiple high-dimensional censored datasets. We demonstrate the performance of the proposed transfer learning method on several synthetic and high-dimensional microarray gene expression benchmark datasets and compare it with related state-of-the-art methods. Our results show that the proposed screening approach significantly improves the computational efficiency of the algorithm without compromising prediction performance. We also demonstrate the scalability of the approach and show that the time taken to obtain the results is linear in both the number of instances and the number of features.
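The shared sparsity described above comes from the row-wise structure of the l(2,1)-norm: each row of the coefficient matrix (one row per feature, one column per domain) is penalized by its Euclidean norm, so a feature is either kept or discarded jointly across the source and target tasks. The following is a minimal illustrative sketch of that mechanism via the proximal operator of the l(2,1)-norm (row-wise group soft-thresholding); the function names and the toy matrix are hypothetical and not the authors' implementation.

```python
import numpy as np

def l21_norm(B):
    """l(2,1)-norm: sum of the Euclidean norms of the rows of B (features x tasks)."""
    return float(np.sum(np.linalg.norm(B, axis=1)))

def prox_l21(B, tau):
    """Proximal operator of tau * l(2,1)-norm: row-wise group soft-thresholding.

    Rows whose Euclidean norm falls below tau are zeroed jointly across all
    columns, which is what yields a sparsity pattern shared across domains."""
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return scale * B

# Toy coefficient matrix: 2 features, 2 domains (source, target).
B = np.array([[0.3, 0.2],   # weak feature: dropped in BOTH domains
              [2.0, 1.5]])  # strong feature: shrunk but kept in both
P = prox_l21(B, 0.5)
```

Here the first row (norm ≈ 0.36 < 0.5) is eliminated in both domains at once, while the second row (norm 2.5) is only uniformly shrunk; inside an iterative proximal-gradient solver for the penalized Cox partial likelihood, this step is what couples the source and target models.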
Pages: 231-240
Number of pages: 10