Variational Inference for Gaussian Process Models for Survival Analysis

Cited by: 0
Authors: Kim, Minyoung [1]; Pavlovic, Vladimir [2]
Affiliations:
[1] Seoul Natl Univ Sci & Technol, Dept Elect Engn, Seoul, South Korea
[2] Rutgers State Univ, Dept Comp Sci, Piscataway, NJ 08854 USA
Funding: National Research Foundation of Singapore
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The Gaussian process survival analysis model (GP-SAM) was recently proposed to address key deficiencies of the Cox proportional hazards model, namely the need to account for uncertainty in the hazard function modeling while, at the same time, relaxing the factorized time-covariates assumption of the Cox model. However, the existing MCMC inference algorithms for GP-SAM have proven to be slow in practice. In this paper we propose novel and scalable variational inference algorithms for GP-SAM that reduce the time complexity of the sampling approaches and improve scalability to large datasets. We accomplish this by employing two effective strategies for scalable GPs: i) using pseudo inputs and ii) approximation via random feature expansions. In both setups, we derive the full and partial likelihood formulations typically considered in survival analysis settings. The proposed approaches are evaluated on two clinical datasets and a divorce-marriage benchmark dataset, where we demonstrate improvements in prediction accuracy over existing survival analysis methods, while reducing the complexity of inference compared to recent state-of-the-art MCMC-based algorithms.
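The abstract's second scalability strategy, approximation via random feature expansions, builds on the standard random Fourier feature construction for stationary kernels: a GP with an RBF kernel is replaced by a linear model over randomized cosine features, so kernel evaluations become cheap inner products. Below is a minimal NumPy sketch of that generic construction, not the authors' GP-SAM formulation; the function names and parameters are illustrative.

```python
import numpy as np

def random_fourier_features(X, num_features=2000, lengthscale=1.0, seed=0):
    """Map inputs to a random feature space whose inner products
    approximate an RBF (squared-exponential) kernel."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density (Gaussian for RBF)
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    b = rng.uniform(0.0, 2 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

def rbf_kernel(X, lengthscale=1.0):
    """Exact RBF kernel matrix, for comparison with the approximation."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2 * lengthscale ** 2))

# Phi @ Phi.T approximates the exact kernel matrix
X = np.random.default_rng(1).normal(size=(50, 3))
Phi = random_fourier_features(X)
K_approx = Phi @ Phi.T
K_exact = rbf_kernel(X)
max_err = np.abs(K_approx - K_exact).max()
```

Because inference then operates on the finite feature matrix `Phi` rather than an n-by-n kernel matrix, training cost scales linearly in the number of data points, which is the kind of reduction the paper targets relative to MCMC.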
Pages: 435-445 (11 pages)