Transfer Learning for Survival Analysis via Efficient L2,1-norm Regularized Cox Regression

Cited by: 0
Authors
Li, Yan [1 ]
Wang, Lu [2 ]
Wang, Jie [1 ]
Ye, Jieping [1 ,3 ]
Reddy, Chandan K. [4 ]
Affiliations
[1] Univ Michigan, Dept Computat Med & Bioinformat, Ann Arbor, MI 48109 USA
[2] Wayne State Univ, Dept Comp Sci, Detroit, MI 48202 USA
[3] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
[4] Virginia Tech, Dept Comp Sci, Arlington, VA 22203 USA
Funding
US National Science Foundation;
Keywords
Transfer learning; survival analysis; regularization; regression; high-dimensional data; CANCER CELL-LINES; ALGORITHM; PATHS;
DOI
10.1109/ICDM.2016.129
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In survival analysis, the primary goal is to monitor several entities and model the occurrence of a particular event of interest. In such applications, the event of interest is often not observed during the study period, which gives rise to the problem of censoring and cannot be easily handled by standard regression approaches. In addition, obtaining sufficient labeled training instances for learning a robust prediction model is very time-consuming and can be extremely difficult in practice. In this paper, we propose a transfer-learning-based Cox method, called Transfer-Cox, which uses auxiliary data to augment learning when there is an insufficient amount of training examples. The proposed method aims to extract "useful" knowledge from the source domain and transfer it to the target domain, thus potentially improving prediction performance on such time-to-event data. It uses the L2,1-norm penalty to encourage multiple predictors to share similar sparsity patterns, thereby learning a shared representation across the source and target domains and potentially improving model performance on the target task. To speed up the computation, we apply a screening approach and extend the strong rule to sparse survival analysis models over multiple high-dimensional censored datasets. We demonstrate the performance of the proposed transfer learning method on several synthetic and high-dimensional microarray gene expression benchmark datasets and compare it with related state-of-the-art methods. Our results show that the proposed screening approach significantly improves the computational efficiency of the algorithm without compromising prediction performance. We also demonstrate the scalability of the approach and show that the time taken to obtain the results is linear with respect to both the number of instances and the number of features.
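The abstract only sketches the optimization at a high level. Below is a minimal illustrative sketch of a joint, L2,1-regularized Cox objective, min_B sum_k nll_k(B[:, k]) + lam * ||B||_{2,1}, solved by proximal gradient descent so that source and target coefficient columns share a row-wise sparsity pattern. This is not the authors' implementation: the screening/strong-rule acceleration described in the paper is omitted, the function names (`cox_nll_grad`, `prox_l21`, `transfer_cox_sketch`), the fixed step size, and the Breslow treatment without tie correction are assumptions made for illustration.

```python
import numpy as np

def cox_nll_grad(X, time, event, beta):
    """Negative Cox partial log-likelihood (Breslow, no tie correction) and its gradient."""
    order = np.argsort(-time)                 # descending time: risk set of i is the prefix up to i
    X, event = X[order], event[order].astype(bool)
    eta = X @ beta
    m = eta.max()
    r = np.exp(eta - m)                       # stabilised relative risks
    cum_r = np.cumsum(r)                      # sum of exp(eta_j) over {j: t_j >= t_i}, up to exp(m)
    cum_rx = np.cumsum(r[:, None] * X, axis=0)
    nll = -np.sum(eta[event] - (np.log(cum_r[event]) + m))
    grad = -np.sum(X[event] - cum_rx[event] / cum_r[event, None], axis=0)
    return nll, grad

def prox_l21(B, t):
    """Row-wise group soft-thresholding: proximal operator of t * ||B||_{2,1}."""
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    return np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12)) * B

def transfer_cox_sketch(tasks, lam=0.1, step=1e-3, iters=1000):
    """Proximal gradient descent on sum_k nll_k(B[:, k]) + lam * ||B||_{2,1}.
    `tasks` is a list of (X, time, event) tuples sharing one feature space;
    each column of B corresponds to one task (e.g. source or target domain)."""
    p = tasks[0][0].shape[1]
    B = np.zeros((p, len(tasks)))
    for _ in range(iters):
        G = np.zeros_like(B)
        for k, (X, t, e) in enumerate(tasks):
            _, G[:, k] = cox_nll_grad(X, t, e, B[:, k])
        B = prox_l21(B - step * G, step * lam)  # gradient step, then L2,1 prox
    return B
```

In practice the step size would be chosen by backtracking line search rather than fixed, and a strong-rule screening step would first discard features whose rows of B are predicted to be zero (with a KKT check afterwards), which is the source of the speedup reported in the abstract.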
Pages: 231-240
Page count: 10