astroABC: An Approximate Bayesian Computation Sequential Monte Carlo sampler for cosmological parameter estimation

Cited by: 49
Authors
Jennings, E. [1 ,2 ]
Madigan, M. [3 ]
Affiliations
[1] Fermilab Natl Accelerator Lab, Ctr Particle Astrophys, MS209,POB 500,Kirk Rd & Pine St, Batavia, IL 60510 USA
[2] Univ Chicago, Enrico Fermi Inst, Kavli Inst Cosmol Phys, Chicago, IL 60637 USA
[3] Univ Dublin, Trinity Coll, Dept Theoret Phys, Dublin, Ireland
Funding
U.S. Department of Energy; U.S. National Science Foundation; UK Science and Technology Facilities Council
Keywords
Cosmology: theory; Cosmology: cosmological parameters; Galaxies: statistics; Methods: statistical; Stars: supernovae; MODEL SELECTION; INFERENCE; PROBES
DOI
10.1016/j.ascom.2017.01.001
Chinese Library Classification
P1 [Astronomy]
Subject Classification Code
0704
Abstract
Given the complexity of modern cosmological parameter inference, where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated datasets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the likelihood is intractable or unknown. The ABC method is called "likelihood free" as it avoids explicit evaluation of the likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of processes across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features of this new sampler include: a Sequential Monte Carlo sampler; a method for iteratively adapting tolerance levels; local covariance estimates using scikit-learn's KDTree; modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel and a weighted covariance metric; restart files written frequently so an interrupted sampling run can be resumed at any iteration; output and restart files backed up at every iteration; a user-defined distance metric and simulation method; a module for specifying heterogeneous parameter priors, including non-standard prior PDFs; a module for specifying a constant, linear, log or exponential tolerance schedule; and well-documented examples and sample scripts. The code is hosted online at https://github.com/EliseJ/astroABC. (C) 2017 Elsevier B.V. All rights reserved.
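To make the "likelihood free" idea in the abstract concrete, the sketch below implements a toy ABC Sequential Monte Carlo loop in plain Python/NumPy. It is not the astroABC interface (see the GitHub repository for the actual sampler class); the simulate, distance and prior functions, the tolerance schedule and the Gaussian perturbation kernel are illustrative assumptions chosen only to mirror the ingredients the abstract lists: a forward model simulation, a user-defined distance metric, a weighted-covariance perturbation kernel, and decreasing tolerance levels.

# Toy ABC-SMC sketch (NOT the astroABC API): recover the mean and standard
# deviation of Gaussian data without ever evaluating a likelihood.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(3.0, 1.5, size=500)               # "observed" dataset
obs_summary = np.array([data.mean(), data.std()])   # summary statistics

def simulate(theta):
    # Forward-model simulation: synthetic data for parameters theta = (mu, sigma).
    mu, sigma = theta
    sim = rng.normal(mu, sigma, size=500)
    return np.array([sim.mean(), sim.std()])

def distance(sim_summary):
    # User-defined distance metric between simulated and observed summaries.
    return np.linalg.norm(sim_summary - obs_summary)

def prior_sample():
    return np.array([rng.uniform(-5.0, 10.0), rng.uniform(0.1, 5.0)])

def prior_pdf(theta):
    # Flat priors; returns 0 outside the support (normalisation cancels in the weights).
    mu, sigma = theta
    return float(-5.0 <= mu <= 10.0 and 0.1 <= sigma <= 5.0)

npart, niter = 200, 5
tolerances = np.linspace(2.0, 0.2, niter)           # linearly decreasing tolerance levels

# Iteration 0: plain rejection ABC from the prior.
particles = []
while len(particles) < npart:
    theta = prior_sample()
    if distance(simulate(theta)) < tolerances[0]:
        particles.append(theta)
particles = np.array(particles)
weights = np.full(npart, 1.0 / npart)

for t in range(1, niter):
    # Perturbation kernel: Gaussian with twice the weighted covariance of the
    # previous particle population (a common ABC-SMC choice).
    cov = 2.0 * np.cov(particles.T, aweights=weights)
    cov_inv = np.linalg.inv(cov)
    new_particles, new_weights = [], []
    while len(new_particles) < npart:
        idx = rng.choice(npart, p=weights)
        theta = rng.multivariate_normal(particles[idx], cov)   # perturb a past particle
        if prior_pdf(theta) == 0.0:
            continue
        if distance(simulate(theta)) < tolerances[t]:
            # Importance weight: prior density over the mixture of perturbation kernels.
            diffs = particles - theta
            kern = np.exp(-0.5 * np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs))
            new_particles.append(theta)
            new_weights.append(prior_pdf(theta) / np.sum(weights * kern))
    particles = np.array(new_particles)
    weights = np.array(new_weights)
    weights /= weights.sum()

print("weighted posterior mean:", np.average(particles, axis=0, weights=weights))

In astroABC itself the user supplies the forward-model simulation and distance metric to the sampler and selects the tolerance schedule, prior and kernel options; the loop above is only the generic ABC-SMC recipe that those options control, and the package additionally parallelizes the simulation calls via MPI or Python multiprocessing as described in the abstract.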
Pages: 16-22
Page count: 7