CONCENTRATION OF DISCREPANCY-BASED APPROXIMATE BAYESIAN COMPUTATION VIA RADEMACHER COMPLEXITY

Cited by: 0
Authors
Legramanti, Sirio [1 ]
Durante, Daniele [2 ,3 ]
Alquier, Pierre [4 ]
Affiliations
[1] Univ Bergamo, Dept Econ, Bergamo, Italy
[2] Bocconi Univ, Dept Decis Sci, Milan, Italy
[3] Bocconi Univ, Inst Data Sci & Analyt, Milan, Italy
[4] ESSEC Business Sch, Dept Informat Syst Decis Sci & Stat, Cergy, France
Source
ANNALS OF STATISTICS | 2025, Vol. 53, No. 1
Keywords
ABC; integral probability semimetrics; MMD; Rademacher complexity; Wasserstein distance; empirical measures; convergence; statistics
DOI
10.1214/24-AOS2453
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline classification codes
020208; 070103; 0714
Abstract
There has been increasing interest in summary-free solutions for approximate Bayesian computation (ABC) that replace distances among summaries with discrepancies between the empirical distributions of the observed data and of the synthetic samples generated under the proposed parameter values. The success of these strategies has motivated theoretical studies of the limiting properties of the induced posteriors. However, there is still no theoretical framework for summary-free ABC that (i) is unified, rather than discrepancy-specific, (ii) does not require constraining the analysis to data-generating processes and statistical models meeting specific regularity conditions, but instead facilitates the derivation of limiting properties that hold uniformly, and (iii) relies on verifiable assumptions that provide explicit concentration bounds clarifying which factors govern the limiting behavior of the ABC posterior. We address this gap via a novel theoretical framework that introduces the concept of Rademacher complexity into the analysis of the limiting properties of discrepancy-based ABC posteriors, including in non-i.i.d. and misspecified settings. This yields a unified theory that relies on constructive arguments and provides more informative asymptotic results and uniform concentration bounds, even in settings not covered by current studies. These advances are obtained by relating the asymptotic properties of summary-free ABC posteriors to the behavior of the Rademacher complexity associated with the chosen discrepancy within the family of integral probability semimetrics (IPS). The IPS class extends summary-based distances and also includes, among others, the widely implemented Wasserstein distance and maximum mean discrepancy (MMD). As clarified in specialized theoretical analyses of popular IPS discrepancies and via illustrative simulations, this new perspective improves the understanding of summary-free ABC.
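For readers who want to see the mechanism the abstract describes, below is a minimal, self-contained sketch of summary-free rejection ABC. It assumes a toy Gaussian location model and uses a biased estimate of the squared maximum mean discrepancy as the discrepancy; the MMD is a member of the IPS family D_F(P, Q) = sup_{f in F} |E_P[f] - E_Q[f]|, obtained when F is the unit ball of an RKHS, while the 1-Lipschitz functions give the Wasserstein-1 distance. The model, kernel bandwidth, tolerance epsilon, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    """Gaussian kernel matrix k(x_i, y_j) = exp(-(x_i - y_j)^2 / (2 * bandwidth^2))."""
    return np.exp(-np.subtract.outer(x, y) ** 2 / (2 * bandwidth ** 2))

def mmd_squared(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of the squared MMD between the
    empirical distributions of two univariate samples x and y."""
    return (gaussian_kernel(x, x, bandwidth).mean()
            + gaussian_kernel(y, y, bandwidth).mean()
            - 2.0 * gaussian_kernel(x, y, bandwidth).mean())

def abc_rejection(y_obs, prior_sampler, simulator, discrepancy, n_draws, epsilon):
    """Summary-free rejection ABC: keep parameter draws whose synthetic sample
    lies within `epsilon` of the observed sample under `discrepancy`."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()                  # draw from the prior
        z = simulator(theta, len(y_obs))         # synthetic data under theta
        if discrepancy(y_obs, z) <= epsilon:     # compare empirical distributions
            accepted.append(theta)
    return np.array(accepted)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y_obs = rng.normal(loc=1.5, scale=1.0, size=200)            # "observed" data
    draws = abc_rejection(
        y_obs,
        prior_sampler=lambda: rng.normal(0.0, 5.0),             # N(0, 25) prior on the mean
        simulator=lambda theta, n: rng.normal(theta, 1.0, n),   # Gaussian location model
        discrepancy=mmd_squared,
        n_draws=5000,
        epsilon=0.05,
    )
    print(f"accepted {len(draws)} draws, ABC posterior mean ~ {draws.mean():.2f}")
```

The accepted draws approximate the discrepancy-based ABC posterior; the paper relates the limiting behavior of such posteriors to the Rademacher complexity associated with the chosen IPS discrepancy.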
Pages: 37-60
Number of pages: 24