Hamiltonian Adaptive Importance Sampling

Cited by: 10
Authors
Mousavi, Ali [1 ]
Monsefi, Reza [1 ]
Elvira, Victor [2 ]
Affiliations
[1] Ferdowsi Univ Mashhad FUM, Dept Comp, Engn Fac, Mashhad 91897, Razavi Khorasan, Iran
[2] Univ Edinburgh, Sch Math, Edinburgh EH8 9YL, Midlothian, Scotland
Keywords
Proposals; Monte Carlo methods; Artificial intelligence; Signal processing algorithms; Markov processes; Heuristic algorithms; Convergence; Adaptive importance sampling; Hamiltonian Monte Carlo; Monte Carlo; Adaptation
DOI
10.1109/LSP.2021.3068616
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronic and communication technology]
Discipline codes
0808; 0809
Abstract
Importance sampling (IS) is a powerful Monte Carlo (MC) methodology for approximating integrals, for instance in the context of Bayesian inference. In IS, the samples are simulated from the so-called proposal distribution, and the choice of this proposal is key for achieving high performance. In adaptive IS (AIS) methods, a set of proposals is iteratively improved. AIS is a relevant and timely methodology, although many limitations remain to be overcome, e.g., the curse of dimensionality in high-dimensional and multi-modal problems. Moreover, the Hamiltonian Monte Carlo (HMC) algorithm has become increasingly popular in machine learning and statistics. HMC has several appealing features, such as its exploratory behavior in high-dimensional targets, where other methods suffer. In this letter, we introduce the novel Hamiltonian adaptive importance sampling (HAIS) method. HAIS implements a two-step adaptive process with parallel HMC chains that cooperate at each iteration. The proposed HAIS efficiently adapts a population of proposals, extracting the advantages of HMC. HAIS can be understood as a particular instance of the generic layered AIS family with an additional resampling step. HAIS achieves a significant performance improvement in high-dimensional problems w.r.t. state-of-the-art algorithms. We discuss the statistical properties of HAIS and show its high performance in two challenging examples.
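To make the AIS idea in the abstract concrete, the following is a minimal, hedged sketch of self-normalized importance sampling with a simple moment-matching proposal adaptation on an illustrative 1-D bimodal target. It is not the paper's HAIS algorithm (no HMC chains, no resampling step); the target, proposal family, and adaptation rule are all assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log-density of an illustrative bimodal target:
    # an equal mixture of N(2, 1) and N(-2, 1) (normalizing constants dropped).
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

# Gaussian proposal N(mu, sigma^2). At each iteration we adapt mu to the
# weighted sample mean -- a basic moment-matching AIS step, not the
# HMC-driven adaptation of the paper.
mu, sigma = 0.0, 3.0
for _ in range(20):
    x = rng.normal(mu, sigma, size=2000)          # simulate from the proposal
    # Log-weights: log target minus log proposal (constants cancel after
    # self-normalization, so 0.5*log(2*pi) is omitted).
    log_w = log_target(x) - (-0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma))
    w = np.exp(log_w - log_w.max())               # stabilize before exponentiating
    w /= w.sum()                                  # self-normalized IS weights
    mu = np.sum(w * x)                            # adapt the proposal location

# Self-normalized IS estimate of E[x^2] under the target (true value is 5
# for this mixture: within-mode variance 1 plus squared mode offset 4).
est = np.sum(w * x ** 2)
```

The self-normalized weights make the estimator usable when the target is known only up to a normalizing constant, which is the typical Bayesian setting the abstract refers to.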
Pages: 713-717
Page count: 5