Efficient sampling for Bayesian inference of conjunctive Bayesian networks

Cited: 17
Authors
Sakoparnig, Thomas [1 ,2 ]
Beerenwinkel, Niko [1 ,2 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Biosyst Sci & Engn, CH-4058 Basel, Switzerland
[2] Supra Univ, SIB Swiss Inst Bioinformat, Basel, Switzerland
Keywords
TREE MODELS; ONCOGENETIC TREE; PROGRESSION; MIXTURES
DOI
10.1093/bioinformatics/bts433
Chinese Library Classification (CLC)
Q5 [Biochemistry]
Subject Classification Codes
071010; 081704
Abstract
Motivation: Cancer development is driven by the accumulation of advantageous mutations and the subsequent clonal expansion of cells harbouring these mutations, but the order in which mutations occur remains poorly understood. Advances in genome sequencing and the flood of cancer genome data soon to be produced by large cancer sequencing consortia hold the promise of elucidating cancer progression. However, new computational methods are needed to analyse these large datasets.

Results: We present a Bayesian inference scheme for Conjunctive Bayesian Networks, a probabilistic graphical model in which mutations accumulate according to partial order constraints and cancer genotypes are observed subject to measurement noise. We develop an efficient MCMC sampling scheme specifically designed to overcome the local optima induced by these dependency structures. We demonstrate the performance advantage of our sampler over traditional approaches on simulated data, and, by reanalysing cancer datasets and comparing our results with previous maximum-likelihood-based analyses, we show the advantages of adopting a Bayesian perspective.
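To make the model described in the abstract concrete, the following is a minimal, self-contained Python sketch, not the authors' sampler: a genotype is compatible with a partial order only if every present mutation also has all of its predecessors present, observations are noisy copies of a compatible genotype, and a Metropolis-Hastings step toggles one directed relation. The symmetric per-locus error rate eps, the uniform marginalisation over compatible genotypes, and the single edge-toggle move are all simplifying assumptions; the published method uses per-event parameters and moves designed to escape the local optima mentioned above.

    import itertools
    import random

    def compatible_genotypes(poset, n):
        # A genotype is compatible with the partial order if every mutation j
        # that is present also has all of its predecessors i (edges i -> j) present.
        for g in itertools.product((0, 1), repeat=n):
            if all(g[i] >= g[j] for (i, j) in poset):
                yield g

    def acyclic(edges, n):
        # Brute-force reachability check: the relation must contain no directed cycle.
        reach = {k: set() for k in range(n)}
        changed = True
        while changed:
            changed = False
            for (a, b) in edges:
                new = {b} | reach[b]
                if not new <= reach[a]:
                    reach[a] |= new
                    changed = True
        return all(k not in reach[k] for k in range(n))

    def likelihood(data, poset, n, eps):
        # P(data | poset, eps): each observed genotype is a noisy copy of some
        # compatible "true" genotype; here we marginalise uniformly over them,
        # which is a simplification of the full CBN model.
        compat = list(compatible_genotypes(poset, n))
        total = 1.0
        for obs in data:
            p_obs = 0.0
            for true in compat:
                flips = sum(o != t for o, t in zip(obs, true))
                p_obs += (eps ** flips) * ((1.0 - eps) ** (n - flips))
            total *= p_obs / len(compat)
        return total

    def mh_step(data, poset, n, eps):
        # Propose toggling one directed relation; reject proposals that would
        # introduce a cycle, otherwise accept with the Metropolis ratio.
        i, j = random.sample(range(n), 2)
        proposal = set(poset)
        proposal.symmetric_difference_update({(i, j)})
        if not acyclic(proposal, n):
            return poset
        ratio = likelihood(data, proposal, n, eps) / likelihood(data, poset, n, eps)
        return frozenset(proposal) if random.random() < ratio else poset

    # Toy usage: three mutations, observed 0/1 genotypes, flat prior over posets.
    data = [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 0), (1, 1, 0)]
    poset = frozenset()  # start from the empty order (no constraints)
    for _ in range(500):
        poset = mh_step(data, poset, 3, eps=0.05)
    print(sorted(poset))

With this toy data the chain tends to retain relations such as (0, 1) and (1, 2), i.e. mutation 0 preceding 1 preceding 2; in the full Bayesian treatment one would keep the whole chain of visited posets as posterior samples rather than only the final state.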
Pages: 2318-2324
Page count: 7
Related Papers
50 records in total (items 31-40 shown)
  • [31] Structure Learning in Bayesian Networks of a Moderate Size by Efficient Sampling
    He, Ru
    Tian, Jin
    Wu, Huaiqing
    JOURNAL OF MACHINE LEARNING RESEARCH, 2016, 17
  • [32] Gibbs sampling in Bayesian networks
    Hrycej, T.
    ARTIFICIAL INTELLIGENCE, 1990, 46 (03) : 351 - 363
  • [33] Cutset sampling for Bayesian networks
    Bidyuk, Bozhena
    Dechter, Rina
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2007, 28 : 1 - 48
  • [34] Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks
    Sun, W
    Chang, KC
    SIGNAL PROCESSING, SENSOR FUSION, AND TARGET RECOGNITION XIV, 2005, 5809 : 322 - 329
  • [35] Sampling-Free Variational Inference of Bayesian Neural Networks by Variance Backpropagation
    Haussmann, Manuel
    Hamprecht, Fred A.
    Kandemir, Melih
    35TH UNCERTAINTY IN ARTIFICIAL INTELLIGENCE CONFERENCE (UAI 2019), 2020, 115 : 563 - 573
  • [36] Ordered designs and Bayesian inference in survey sampling
    Meeden, Glen
    Noorbaloochi, Siamak
    SANKHYA A, 2010, 72 (01) : 119 - 135
  • [37] Approximate Bayesian inference under informative sampling
    Wang, Z.
    Kim, J. K.
    Yang, S.
    BIOMETRIKA, 2018, 105 (01) : 91 - 102
  • [38] Adaptation Accelerating Sampling-based Bayesian Inference in Attractor Neural Networks
    Dong, Xingsi
    Ji, Zilong
    Chu, Tianhao
    Huang, Tiejun
    Zhang, Wen-Hao
    Wu, Si
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [39] Approximate Slice Sampling for Bayesian Posterior Inference
    DuBois, Christopher
    Korattikara, Anoop
    Welling, Max
    Smyth, Padhraic
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 33, 2014, 33 : 185 - 193
  • [40] Policy Gradient Importance Sampling for Bayesian Inference
    El-Laham, Yousef
    Bugallo, Monica F.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 4245 - 4256