A Deep Learning Method for Comparing Bayesian Hierarchical Models

Times Cited: 1
Authors
Elsemueller, Lasse [1 ]
Schnuerch, Martin [2 ]
Buerkner, Paul-Christian [3 ]
Radev, Stefan T. [4 ]
Affiliations
[1] Heidelberg Univ, Inst Psychol, Hauptstr 47, D-69117 Heidelberg, Germany
[2] Univ Mannheim, Dept Psychol, Mannheim, Germany
[3] TU Dortmund Univ, Dept Stat, Dortmund, Germany
[4] Heidelberg Univ, Cluster Excellence STRUCTURES, Heidelberg, Germany
Keywords
Bayesian statistics; model comparison; hierarchical modeling; deep learning; cognitive modeling; PROCESSING TREE MODELS; MONTE-CARLO; NORMALIZING CONSTANTS; CHOICE
DOI
10.1037/met0000645
CLC Number
B84 [Psychology]
Discipline Classification Codes
04; 0402
Abstract
Bayesian model comparison (BMC) offers a principled approach to assessing the relative merits of competing computational models and propagating uncertainty into model selection decisions. However, BMC is often intractable for the popular class of hierarchical models due to their high-dimensional nested parameter structure. To address this intractability, we propose a deep learning method for performing BMC on any set of hierarchical models that can be instantiated as probabilistic programs. Since our method enables amortized inference, it allows efficient re-estimation of posterior model probabilities and fast performance validation prior to any real-data application. In a series of extensive validation studies, we benchmark the performance of our method against the state-of-the-art bridge sampling method and demonstrate excellent amortized inference across all BMC settings. We then showcase our method by comparing four hierarchical evidence accumulation models that have previously been deemed intractable for BMC due to partly implicit likelihoods. Additionally, we demonstrate how transfer learning can be leveraged to enhance training efficiency. We provide reproducible code for all analyses and an open-source implementation of our method.
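To make the abstract's core idea concrete, below is a minimal, self-contained sketch of amortized Bayesian model comparison in the general sense described: simulate datasets from each candidate hierarchical model, train a neural classifier to recover the generating model, and read its softmax output as an approximation of posterior model probabilities under a uniform model prior. This is not the authors' implementation or architecture; the two toy hierarchical Gaussian models, the network sizes, and the training settings are hypothetical choices made only for illustration.

```python
# Illustrative sketch of amortized Bayesian model comparison (NOT the paper's code).
# Two toy hierarchical Gaussian models differ only in the scale of the hyperprior
# on group-level means; a permutation-invariant classifier learns to tell them apart.

import torch
import torch.nn as nn

G, N = 8, 20  # groups per dataset, observations per group (assumed toy sizes)

def simulate(model_idx, n_sets):
    """Draw n_sets datasets of shape (n_sets, G, N) from toy hierarchical model 0 or 1."""
    tau = 0.5 if model_idx == 0 else 2.0          # hyperprior std of group means (hypothetical)
    mu = tau * torch.randn(n_sets, G, 1)          # group-level means
    return mu + torch.randn(n_sets, G, N)         # observation-level noise

class AmortizedComparator(nn.Module):
    """Mean-pooling over observations and groups gives a permutation-invariant
    summary; the classifier's softmax approximates posterior model probabilities
    under a uniform prior over models."""
    def __init__(self, n_models=2):
        super().__init__()
        self.obs_net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 32))
        self.group_net = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, 32))
        self.classifier = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, n_models))

    def forward(self, x):                               # x: (batch, G, N)
        h = self.obs_net(x.unsqueeze(-1)).mean(dim=2)   # pool over observations
        h = self.group_net(h).mean(dim=1)               # pool over groups
        return self.classifier(h)                       # logits over candidate models

net = AmortizedComparator()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training: fresh simulations from both models each step; the network amortizes
# model comparison over the space of datasets the simulators can produce.
for step in range(500):
    x = torch.cat([simulate(0, 64), simulate(1, 64)])
    y = torch.cat([torch.zeros(64, dtype=torch.long), torch.ones(64, dtype=torch.long)])
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()

# Amortized inference: posterior model probabilities for any new dataset are a
# single forward pass, which is what makes re-estimation and validation cheap.
with torch.no_grad():
    probs = torch.softmax(net(simulate(1, 1)), dim=-1)
print(probs)  # e.g., higher probability on model 1 if the classifier is well trained
```

The mean-pooling summaries mirror the exchangeability of observations within groups and of groups within a dataset, which is why a fixed-size network can handle hierarchical data of this form; in practice, calibration of the resulting probabilities should be checked on held-out simulations before any real-data application.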
Pages: 30
Related Papers (50 total)
  • [1] Hierarchical Bayesian Models of Subtask Learning
    Anglim, Jeromy
    Wynton, Sarah K. A.
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-LEARNING MEMORY AND COGNITION, 2015, 41 (04) : 957 - 974
  • [2] Learning overhypotheses with hierarchical Bayesian models
    Kemp, Charles
    Perfors, Amy
    Tenenbaum, Joshua B.
    DEVELOPMENTAL SCIENCE, 2007, 10 (03) : 307 - 321
  • [3] Hierarchical Bayesian models for regularization in sequential learning
    de Freitas, JFG
    Niranjan, M
    Gee, AH
    NEURAL COMPUTATION, 2000, 12 (04) : 933 - 953
  • [4] Measures of Bayesian learning and identifiability in hierarchical models
    Xie, Yang
    Carlin, Bradley P.
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2006, 136 (10) : 3458 - 3477
  • [5] Comparing MCMC and INLA for disease mapping with Bayesian hierarchical models
    De Smedt, Tom
    Simons, Koen
    Van Nieuwenhuyse, An
    Molenberghs, Geert
    ARCHIVES OF PUBLIC HEALTH, 73 (Suppl 1)
  • [6] Comparing two multinomial samples using hierarchical Bayesian models
    Masegosa, A. R.
    Torres, A.
    Morales, M.
    Salmerón, A.
    PROGRESS IN ARTIFICIAL INTELLIGENCE, 2020, 9 (02) : 145 - 154
  • [7] Bayesian Distillation of Deep Learning Models
    Grabovoy, A. V.
    Strijov, V. V.
    AUTOMATION AND REMOTE CONTROL, 2021, 82 (11) : 1846 - 1856
  • [8] Learning with Hierarchical-Deep Models
    Salakhutdinov, Ruslan
    Tenenbaum, Joshua B.
    Torralba, Antonio
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2013, 35 (08) : 1958 - 1971