Heretical Multiple Importance Sampling

Cited by: 27
Authors
Elvira, Victor [1 ]
Martino, Luca [2 ]
Luengo, David [3 ]
Bugallo, Monica F. [4 ]
Affiliations
[1] Univ Carlos III Madrid, Dept Signal Theory & Commun, Leganes 28911, Spain
[2] Univ Valencia, Image Proc Lab, Valencia 46010, Spain
[3] Univ Politecn Madrid, Dept Signal Theory & Commun, E-28040 Madrid, Spain
[4] SUNY Stony Brook, Dept Elect & Comp Engn, Stony Brook, NY 11794 USA
Keywords
Biased estimation; deterministic mixture (DM); Monte Carlo methods; multiple importance sampling (MIS); POPULATION MONTE-CARLO;
DOI
10.1109/LSP.2016.2600678
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Multiple importance sampling (MIS) methods approximate moments of complicated distributions by drawing samples from a set of proposal distributions. Several ways to compute the importance weights assigned to each sample have been recently proposed, with the so-called deterministic mixture (DM) weights providing the best performance in terms of variance, at the expense of an increase in the computational cost. A recent work has shown that it is possible to achieve a tradeoff between variance reduction and computational effort by performing an a priori random clustering of the proposals (partial DM algorithm). In this paper, we propose a novel "heretical" MIS framework, where the clustering is performed a posteriori with the goal of reducing the variance of the importance sampling weights. This approach yields biased estimators with a potentially large reduction in variance. Numerical examples show that heretical MIS estimators can outperform, in terms of mean squared error, both the standard and the partial MIS estimators, achieving a performance close to that of DM with less computational cost.
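The abstract contrasts the standard MIS weights (each sample weighted by its own proposal) with the deterministic mixture (DM) weights (each sample weighted by the full mixture of all proposals). The following is a minimal, hypothetical sketch of the two weighting schemes on a toy Gaussian problem of our own choosing; it does not reproduce the paper's experiments or the "heretical" a posteriori clustering itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def target_pdf(x):
    # Toy target: standard normal density (illustrative choice).
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Two Gaussian proposals (means/scale are illustrative, not from the paper).
mus = np.array([-1.0, 2.0])
sigma = 2.0
M = 5000  # samples drawn from each proposal

samples, w_std, w_dm = [], [], []
for mu in mus:
    x = rng.normal(mu, sigma, M)
    pi = target_pdf(x)
    # Standard MIS weight: target over the proposal the sample came from.
    w_std.append(pi / normal_pdf(x, mu, sigma))
    # DM weight: target over the equal-weight mixture of ALL proposals
    # (costlier: every proposal is evaluated at every sample).
    mix = np.mean([normal_pdf(x, m, sigma) for m in mus], axis=0)
    w_dm.append(pi / mix)
    samples.append(x)

x = np.concatenate(samples)
w_std = np.concatenate(w_std)
w_dm = np.concatenate(w_dm)

# Self-normalized estimates of E[X] under the target (true value is 0).
est_std = np.sum(w_std * x) / np.sum(w_std)
est_dm = np.sum(w_dm * x) / np.sum(w_dm)
print("standard MIS:", est_std, "DM:", est_dm)
print("weight variance, standard vs DM:", np.var(w_std), np.var(w_dm))
```

In this sketch the DM weights are typically much less variable than the standard ones, which is the variance/cost tradeoff the partial-DM and heretical-MIS schemes interpolate between.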
Pages: 1474-1478
Page count: 5
Related Papers
50 in total
  • [1] Optimal Multiple Importance Sampling
    Kondapaneni, Ivo
    Vevoda, Petr
    Grittmann, Pascal
    Skrivan, Tomas
    Slusallek, Philipp
    Krivanek, Jaroslav
    [J]. ACM TRANSACTIONS ON GRAPHICS, 2019, 38 (04):
  • [2] Adaptive Multiple Importance Sampling
    Cornuet, Jean-Marie
    Marin, Jean-Michel
    Mira, Antonietta
    Robert, Christian P.
    [J]. SCANDINAVIAN JOURNAL OF STATISTICS, 2012, 39 (04) : 798 - 812
  • [3] Generalized Multiple Importance Sampling
    Elvira, Victor
    Martino, Luca
    Luengo, David
    Bugallo, Monica F.
    [J]. STATISTICAL SCIENCE, 2019, 34 (01) : 129 - 155
  • [4] Marginal Multiple Importance Sampling
    West, Rex
    Georgiev, Iliyan
    Hachisuka, Toshiya
    [J]. PROCEEDINGS SIGGRAPH ASIA 2022, 2022,
  • [5] Continuous Multiple Importance Sampling
    West, Rex
    Georgiev, Iliyan
    Gruson, Adrien
    Hachisuka, Toshiya
    [J]. ACM TRANSACTIONS ON GRAPHICS, 2020, 39 (04):
  • [6] Multiple Importance Sampling for PET
    Szirmay-Kalos, Laszlo
    Magdics, Milan
    Toth, Balazs
    [J]. IEEE TRANSACTIONS ON MEDICAL IMAGING, 2014, 33 (04) : 970 - 978
  • [7] Efficient Multiple Importance Sampling Estimators
    Elvira, Victor
    Martino, Luca
    Luengo, David
    Bugallo, Monica F.
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2015, 22 (10) : 1757 - 1761
  • [8] Efficient Adaptive Multiple Importance Sampling
    El-Laham, Yousef
    Martino, Luca
    Elvira, Victor
    Bugallo, Monica F.
    [J]. 2019 27TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2019,
  • [9] A linear heuristic for multiple importance sampling
    Sbert, Mateu
    Szirmay-Kalos, Laszlo
    [J]. EURASIP JOURNAL ON ADVANCES IN SIGNAL PROCESSING, 2023, 2023 (01)