Function Space Bayesian Pseudocoreset for Bayesian Neural Networks

Cited by: 0
Authors
Kim, Balhae [1 ]
Lee, Hyungi [1 ]
Lee, Juho [1 ,2 ]
Affiliations
[1] KAIST AI, Daejeon, South Korea
[2] AITRICS, Seoul, South Korea
Funding
National Research Foundation of Singapore
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
A Bayesian pseudocoreset is a compact synthetic dataset that summarizes the essential information of a large-scale dataset and can therefore serve as a proxy dataset for scalable Bayesian inference. Typically, a Bayesian pseudocoreset is constructed by minimizing a divergence measure between the posterior conditioned on the pseudocoreset and the posterior conditioned on the full dataset. However, evaluating this divergence can be challenging, particularly for models such as deep neural networks with high-dimensional parameters. In this paper, we propose a novel Bayesian pseudocoreset construction method that operates in a function space. Unlike previous methods, which construct and match the coreset and full-data posteriors in the space of model parameters (weights), our method constructs variational approximations to the coreset posterior in a function space and matches them to the full-data posterior in that space. By working directly in the function space, our method bypasses several challenges that arise in the weight space, including limited scalability and multi-modality issues. Through various experiments, we demonstrate that Bayesian pseudocoresets constructed with our method enjoy enhanced uncertainty quantification and better robustness across various model architectures.
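The abstract's core idea (optimize a small synthetic dataset so that the posterior it induces matches the full-data posterior, with the match measured on function outputs rather than weights) can be illustrated with a deliberately toy sketch. This is not the paper's algorithm: it uses a linear model, crude pseudo-posterior samples around a least-squares fit, and a simple mean-output squared distance in place of the divergence the paper minimizes. All function names and hyperparameters below are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": a linear model, so the function-space output is f(x) = x @ w.
def f(X, w):
    return X @ w

# Hypothetical stand-in for posterior sampling: in practice these samples would
# come from SGMCMC or variational inference; here we draw a fixed Gaussian
# cloud around the least-squares solution, purely for illustration.
def posterior_samples(X, y, n_samples=32, scale=0.1):
    w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    noise = np.random.default_rng(1).standard_normal((n_samples, X.shape[1]))
    return w_hat + scale * noise  # shape: (n_samples, dim)

# Full dataset and a small pseudocoreset whose targets we will optimize.
X_full = rng.standard_normal((500, 5))
w_true = rng.standard_normal(5)
y_full = f(X_full, w_true) + 0.05 * rng.standard_normal(500)

X_core = rng.standard_normal((10, 5))
y_core = f(X_core, w_true)

# Probe inputs at which the two posteriors are compared in function space.
X_eval = rng.standard_normal((50, 5))
full_preds = f(X_eval, posterior_samples(X_full, y_full).T).mean(axis=1)

# Crude divergence proxy: squared distance between mean function outputs
# under the coreset posterior and under the full-data posterior.
def loss(y_c):
    core_preds = f(X_eval, posterior_samples(X_core, y_c).T).mean(axis=1)
    return float(np.mean((core_preds - full_preds) ** 2))

# Simple random-search refinement of the coreset targets: accept a nudge
# only when it shrinks the function-space mismatch.
before = loss(y_core)
for _ in range(200):
    i = rng.integers(len(y_core))
    trial = y_core.copy()
    trial[i] += 0.05 * rng.standard_normal()
    if loss(trial) < loss(y_core):
        y_core = trial
after = loss(y_core)
```

The sketch only perturbs coreset targets with random search; the paper instead learns the pseudocoreset by gradient-based minimization of a proper divergence between variational function-space posteriors. The point it conveys is the structural one: the objective compares model *outputs* at probe inputs, never the high-dimensional weight vectors themselves.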
Pages: 18