NAUTILUS: boosting Bayesian importance nested sampling with deep learning

Cited by: 9
Authors
Lange, Johannes U. [1 ,2 ,3 ,4 ,5 ]
Affiliations
[1] Stanford Univ, Kavli Inst Particle Astrophys & Cosmol, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Phys, Stanford, CA 94305 USA
[3] Univ Calif Santa Cruz, Dept Astron & Astrophys, Santa Cruz, CA 95064 USA
[4] Univ Michigan, Dept Phys, Ann Arbor, MI 48109 USA
[5] Univ Michigan, Leinweber Ctr Theoret Phys, Ann Arbor, MI 48109 USA
Funding
National Science Foundation (USA); National Aeronautics and Space Administration (NASA);
Keywords
methods: data analysis; methods: statistical; software: data analysis; EFFICIENT; COSMOLOGY; GALAXIES; ACCURATE;
DOI
10.1093/mnras/stad2441
Chinese Library Classification
P1 [Astronomy];
Discipline Code
0704;
Abstract
We introduce a novel approach to boost the efficiency of the importance nested sampling (INS) technique for Bayesian posterior and evidence estimation using deep learning. Unlike rejection-based sampling methods such as vanilla nested sampling (NS) or Markov chain Monte Carlo (MCMC) algorithms, importance sampling techniques can use all likelihood evaluations for posterior and evidence estimation. However, for efficient importance sampling, one needs proposal distributions that closely mimic the posterior distributions. We show how to combine INS with deep learning via neural network regression to accomplish this task. We also introduce NAUTILUS, a reference open-source PYTHON implementation of this technique for Bayesian posterior and evidence estimation. We compare NAUTILUS against popular NS and MCMC packages, including EMCEE, DYNESTY, ULTRANEST, and POCOMC, on a variety of challenging synthetic problems and real-world applications in exoplanet detection, galaxy SED fitting, and cosmology. In all applications, the sampling efficiency of NAUTILUS is substantially higher than that of all other samplers, often by more than an order of magnitude. Simultaneously, NAUTILUS delivers highly accurate results and needs fewer likelihood evaluations than all other samplers tested. We also show that NAUTILUS has good scaling with the dimensionality of the likelihood and is easily parallelizable to many CPUs.
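The abstract's key point is that importance sampling lets every likelihood evaluation contribute to the posterior and evidence estimates, provided the proposal distribution resembles the posterior. The following is a minimal NumPy/SciPy sketch of that idea for a one-dimensional toy problem; it illustrates plain importance sampling only, not NAUTILUS's neural-network-trained proposals, and the specific prior, likelihood, and proposal below are illustrative choices, not from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Toy problem: uniform prior on [-5, 5] and a standard-normal likelihood,
# so the evidence is Z = integral of L(theta) * pi(theta) ~ 1/10.
def log_likelihood(theta):
    return stats.norm.logpdf(theta, loc=0.0, scale=1.0)

log_prior = -np.log(10.0)  # constant density of the uniform prior

# Proposal chosen to roughly mimic the posterior (slightly wider normal).
proposal = stats.norm(loc=0.0, scale=1.5)
theta = proposal.rvs(size=100_000, random_state=rng)

# Importance weights w_i = L(theta_i) * pi(theta_i) / q(theta_i).
# Every draw carries a weight, so no likelihood evaluation is discarded.
log_w = log_likelihood(theta) + log_prior - proposal.logpdf(theta)
w = np.exp(log_w)

z_hat = w.mean()                                # evidence estimate
eff = w.sum() ** 2 / (len(w) * (w ** 2).sum())  # effective sample fraction

print(f"Z ~ {z_hat:.4f} (analytic: 0.1000), efficiency ~ {eff:.2f}")
```

The efficiency statistic shows why the proposal matters: it approaches 1 only when the proposal tracks the posterior closely, which is the gap the paper fills with neural network regression.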
Pages: 3181-3194
Number of pages: 14