Approximate Bayesian inference for mixture cure models

Cited by: 5
Authors
Lazaro, E. [1]
Armero, C. [1]
Gomez-Rubio, V. [2]
Affiliations
[1] Univ Valencia, Dept Stat & Operat Res, Valencia, Spain
[2] Univ Castilla La Mancha, Dept Math, Albacete, Spain
Keywords
Accelerated failure time mixture cure models; Complete and marginal likelihood function; Gibbs sampling; Proportional hazards mixture cure models; Survival analysis; Survival data; Cancer
DOI
10.1007/s11749-019-00679-x
CLC Classification Numbers
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Cure models in survival analysis deal with populations in which some individuals will never experience the event of interest. Mixture cure models regard the target population as a mixture of susceptible and non-susceptible individuals. The statistical analysis of these models focuses on estimating the probability of cure (incidence model) and making inference on the time to event in the susceptible subpopulation (latency model). Bayesian inference for mixture cure models has typically relied on Markov chain Monte Carlo (MCMC) methods. The integrated nested Laplace approximation (INLA) is a recent and attractive approach to Bayesian inference, but in its standard formulation it cannot fit mixture models. This paper focuses on the implementation of a feasible INLA extension for fitting standard mixture cure models. Our proposal is based on an iterative algorithm that combines INLA, for estimating the model of interest in each of the subpopulations, with Gibbs sampling, for computing the posterior distribution of the latent cure indicator variable that classifies individuals into the susceptible or non-susceptible subpopulation. We illustrate our approach through the analysis of two paradigmatic clinical trial datasets. The results provide estimates close to those obtained with MCMC, together with a substantial reduction in computational time.
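For context, a minimal sketch of the standard mixture cure decomposition and of the full conditional commonly used in a Gibbs classification step of the kind the abstract refers to; the notation below (pi for the cure probability, S_u for the latency survival function) is one common convention and is not necessarily the parameterisation adopted in the paper:

    S_{\text{pop}}(t \mid \mathbf{x}, \mathbf{z}) = \pi(\mathbf{z}) + \{1 - \pi(\mathbf{z})\}\, S_u(t \mid \mathbf{x}), \qquad \text{e.g. } \operatorname{logit}\,\pi(\mathbf{z}) = \mathbf{z}^\top \boldsymbol{\gamma},

    P(\text{individual } i \text{ susceptible} \mid \text{right-censored at } t_i) = \frac{\{1 - \pi(\mathbf{z}_i)\}\, S_u(t_i \mid \mathbf{x}_i)}{\pi(\mathbf{z}_i) + \{1 - \pi(\mathbf{z}_i)\}\, S_u(t_i \mid \mathbf{x}_i)},

where the incidence model pi(z) gives the probability of being non-susceptible (cured), S_u is the latency survival function for susceptible individuals (e.g. proportional hazards or accelerated failure time), and individuals observed to experience the event are known to be susceptible.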
Pages: 750-767
Number of pages: 18
Related Papers (50 in total)
  • [21] Bayesian Estimation of Beta Mixture Models with Variational Inference
    Ma, Zhanyu
    Leijon, Arne
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2011, 33 (11) : 2160 - 2173
  • [22] Fast Bayesian Inference in Dirichlet Process Mixture Models
    Wang, Lianming
    Dunson, David B.
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2011, 20 (01) : 196 - 216
  • [23] AABC: Approximate approximate Bayesian computation for inference in population-genetic models
    Buzbas, Erkan O.
    Rosenberg, Noah A.
    THEORETICAL POPULATION BIOLOGY, 2015, 99 : 31 - 42
  • [24] Approximate Bayesian Inference in Spatial Generalized Linear Mixed Models
    Eidsvik, Jo
    Martino, Sara
    Rue, Havard
    SCANDINAVIAN JOURNAL OF STATISTICS, 2009, 36 (01) : 1 - 22
  • [25] Parameter Inference for Computational Cognitive Models with Approximate Bayesian Computation
    Kangasraasio, Antti
    Jokinen, Jussi P. P.
    Oulasvirta, Antti
    Howes, Andrew
    Kaski, Samuel
    COGNITIVE SCIENCE, 2019, 43 (06)
  • [26] BAYESIAN NETWORK META-ANALYSIS FOR MIXTURE CURE MODELS
    Garcia, A.
    Ouwens, M. J.
    Postma, M.
    Pham, H. A.
    Heeg, B.
    VALUE IN HEALTH, 2019, 22 : S515 - S515
  • [27] On predictive inference for intractable models via approximate Bayesian computation
    Jarvenpaa, Marko
    Corander, Jukka
    STATISTICS AND COMPUTING, 2023, 33 (02)
  • [29] Calibrated Approximate Bayesian Inference
    Xing, Hanwen
    Nicholls, Geoff K.
    Lee, Jeong Eun
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [30] Approximate Decentralized Bayesian Inference
    Campbell, Trevor
    How, Jonathan P.
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2014, : 102 - 111