NAUTILUS: boosting Bayesian importance nested sampling with deep learning

Cited by: 9
Authors
Lange, Johannes U. [1 ,2 ,3 ,4 ,5 ]
Affiliations
[1] Stanford Univ, Kavli Inst Particle Astrophys & Cosmol, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Phys, Stanford, CA 94305 USA
[3] Univ Calif Santa Cruz, Dept Astron & Astrophys, Santa Cruz, CA 95064 USA
[4] Univ Michigan, Dept Phys, Ann Arbor, MI 48109 USA
[5] Univ Michigan, Leinweber Ctr Theoret Phys, Ann Arbor, MI 48109 USA
Funding
US National Aeronautics and Space Administration (NASA); US National Science Foundation (NSF)
Keywords
methods: data analysis; methods: statistical; software: data analysis; EFFICIENT; COSMOLOGY; GALAXIES; ACCURATE;
DOI
10.1093/mnras/stad2441
Chinese Library Classification
P1 [Astronomy]
Subject classification code
0704
Abstract
We introduce a novel approach to boost the efficiency of the importance nested sampling (INS) technique for Bayesian posterior and evidence estimation using deep learning. Unlike rejection-based sampling methods such as vanilla nested sampling (NS) or Markov chain Monte Carlo (MCMC) algorithms, importance sampling techniques can use all likelihood evaluations for posterior and evidence estimation. However, for efficient importance sampling, one needs proposal distributions that closely mimic the posterior distributions. We show how to combine INS with deep learning via neural network regression to accomplish this task. We also introduce NAUTILUS, a reference open-source PYTHON implementation of this technique for Bayesian posterior and evidence estimation. We compare NAUTILUS against popular NS and MCMC packages, including EMCEE, DYNESTY, ULTRANEST, and POCOMC, on a variety of challenging synthetic problems and real-world applications in exoplanet detection, galaxy SED fitting and cosmology. In all applications, the sampling efficiency of NAUTILUS is substantially higher than that of all other samplers, often by more than an order of magnitude. Simultaneously, NAUTILUS delivers highly accurate results and needs fewer likelihood evaluations than all other samplers tested. We also show that NAUTILUS has good scaling with the dimensionality of the likelihood and is easily parallelizable to many CPUs.
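The abstract's central point is that importance sampling can reuse every likelihood evaluation for both posterior and evidence estimation, but its efficiency hinges on how closely the proposal distribution mimics the posterior. The following is a minimal NumPy sketch of that principle, not the NAUTILUS implementation: it estimates the evidence Z = ∫ L(θ) π(θ) dθ for a toy 1-D Gaussian likelihood (an assumed example problem) under a well-matched and a poorly matched Gaussian proposal, and reports the sampling efficiency as the effective-sample-size fraction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (assumption for illustration): Gaussian likelihood centred at
# theta = 1 with width 0.1, and a uniform prior on [-5, 5].
def log_likelihood(theta):
    return -0.5 * ((theta - 1.0) / 0.1) ** 2 - np.log(0.1 * np.sqrt(2 * np.pi))

LOG_PRIOR = -np.log(10.0)  # log of the uniform prior density 1/10

def evidence_and_efficiency(q_mean, q_std, n=100_000):
    """Importance-sampling estimate of log Z using a Gaussian proposal q."""
    theta = rng.normal(q_mean, q_std, size=n)
    log_q = -0.5 * ((theta - q_mean) / q_std) ** 2 - np.log(q_std * np.sqrt(2 * np.pi))
    # Importance weight: w = L(theta) * pi(theta) / q(theta), in log space.
    log_w = log_likelihood(theta) + LOG_PRIOR - log_q
    log_w[(theta < -5.0) | (theta > 5.0)] = -np.inf  # zero weight outside the prior
    w = np.exp(log_w - log_w.max())  # rescale for numerical stability
    log_z = np.log(w.mean()) + log_w.max()
    # Kish effective-sample-size fraction: 1 when q equals the posterior.
    efficiency = w.sum() ** 2 / (n * (w ** 2).sum())
    return log_z, efficiency

# Proposal matching the posterior: every draw carries equal weight.
log_z_good, eff_good = evidence_and_efficiency(q_mean=1.0, q_std=0.1)
# Broad, offset proposal: still unbiased, but far fewer effective samples.
log_z_poor, eff_poor = evidence_and_efficiency(q_mean=0.0, q_std=2.0)
```

Both runs recover log Z ≈ −log 10 (the likelihood integrates to 1 inside the prior box), but the matched proposal attains an efficiency near 1 while the mismatched one drops to a few per cent, which is the gap NAUTILUS closes by learning proposals with neural network regression.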
Pages: 3181-3194
Page count: 14