Optimization of Annealed Importance Sampling Hyperparameters

Cited by: 0
Authors
Goshtasbpour, Shirin [1 ,2 ]
Perez-Cruz, Fernando [1 ,2 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Comp Sci, Ramistr 101, CH-8092 Zurich, Switzerland
[2] Swiss Data Sci Ctr, Turnerstr 1, CH-8092 Zurich, Switzerland
Keywords
Annealed importance sampling; Partition function estimation; Generative models; Inference
DOI
10.1007/978-3-031-26419-1_11
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models. Although AIS is guaranteed to provide an unbiased estimate for any set of hyperparameters, common implementations rely on simple heuristics, such as geometric-average bridging distributions between the initial and target distributions, which hurt estimation performance when the computation budget is limited. To reduce the number of sampling iterations, we present a parametric AIS process with flexible intermediate distributions, defined by a residual density with respect to the geometric-mean path. Our method allows parameter sharing between annealing distributions, the use of a fixed linear schedule for discretization, and amortization of hyperparameter selection in latent variable models. We assess the performance of Optimized-Path AIS for marginal likelihood estimation of deep generative models and compare it to more computationally intensive AIS.
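As background for the abstract, the sketch below implements standard AIS with the geometric-mean path f_beta(x) ∝ f_0(x)^(1-beta) * f_T(x)^beta and a fixed linear beta-schedule on a toy 1-D problem. It illustrates the heuristic baseline the paper improves on, not the authors' Optimized-Path method; the toy densities, step size, and chain counts are assumptions chosen for the example.

```python
# Minimal AIS sketch with a geometric-mean bridging path (NumPy only).
# Toy setup (assumed for illustration): initial N(0, 1), unnormalized
# target = 7 * N(3, 0.5^2); AIS estimates the normalizing ratio Z_T / Z_0 = 7.
import numpy as np

rng = np.random.default_rng(0)

def log_f0(x):
    # Tractable, normalized initial density: N(0, 1)
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_fT(x):
    # Unnormalized target: 7 times the N(3, 0.5^2) density
    return np.log(7.0) - 2.0 * (x - 3.0) ** 2 - np.log(0.5 * np.sqrt(2 * np.pi))

def log_f(x, beta):
    # Geometric-mean bridge between f0 and fT
    return (1.0 - beta) * log_f0(x) + beta * log_fT(x)

def ais(num_chains=5000, num_steps=100, step_size=0.5):
    betas = np.linspace(0.0, 1.0, num_steps + 1)  # fixed linear schedule
    x = rng.standard_normal(num_chains)           # exact samples from f0
    log_w = np.zeros(num_chains)
    for t in range(1, num_steps + 1):
        # Importance-weight increment for moving from f_{t-1} to f_t
        log_w += log_f(x, betas[t]) - log_f(x, betas[t - 1])
        # One random-walk Metropolis step leaving f_t invariant
        prop = x + step_size * rng.standard_normal(num_chains)
        accept = np.log(rng.random(num_chains)) < log_f(prop, betas[t]) - log_f(x, betas[t])
        x = np.where(accept, prop, x)
    # E[exp(log_w)] = Z_T / Z_0; Z_0 = 1 here, so this estimates Z_T
    return np.exp(log_w).mean()

print(ais())  # should land near 7.0
```

With these settings the estimate should land near the true ratio Z_T / Z_0 = 7; shrinking num_steps degrades it noticeably, which is the limited-budget regime where the paper's optimized intermediate distributions are meant to help.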
Pages: 174-190
Number of pages: 17