A Deep Learning Method for Comparing Bayesian Hierarchical Models

Cited by: 1
Authors
Elsemueller, Lasse [1 ]
Schnuerch, Martin [2 ]
Buerkner, Paul-Christian [3 ]
Radev, Stefan T. [4 ]
Affiliations
[1] Heidelberg Univ, Inst Psychol, Hauptstr 47, D-69117 Heidelberg, Germany
[2] Univ Mannheim, Dept Psychol, Mannheim, Germany
[3] TU Dortmund Univ, Dept Stat, Dortmund, Germany
[4] Heidelberg Univ, Cluster Excellence STRUCTURES, Heidelberg, Germany
Keywords
Bayesian statistics; model comparison; hierarchical modeling; deep learning; cognitive modeling; processing tree models; Monte Carlo; normalizing constants; choice
DOI
10.1037/met0000645
Chinese Library Classification (CLC)
B84 [Psychology]
Subject Classification Code
04; 0402
Abstract
Bayesian model comparison (BMC) offers a principled approach to assessing the relative merits of competing computational models and propagating uncertainty into model selection decisions. However, BMC is often intractable for the popular class of hierarchical models due to their high-dimensional nested parameter structure. To address this intractability, we propose a deep learning method for performing BMC on any set of hierarchical models which can be instantiated as probabilistic programs. Since our method enables amortized inference, it allows efficient re-estimation of posterior model probabilities and fast performance validation prior to any real-data application. In a series of extensive validation studies, we benchmark the performance of our method against the state-of-the-art bridge sampling method and demonstrate excellent amortized inference across all BMC settings. We then showcase our method by comparing four hierarchical evidence accumulation models that have previously been deemed intractable for BMC due to partly implicit likelihoods. Additionally, we demonstrate how transfer learning can be leveraged to enhance training efficiency. We provide reproducible code for all analyses and an open-source implementation of our method.
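The abstract describes amortized Bayesian model comparison: a neural network is trained on data sets simulated from each candidate model, so that its softmax output approximates the posterior model probabilities p(M_k | y) ∝ p(y | M_k) p(M_k) for any new data set at negligible additional cost. Below is a minimal illustrative sketch of that idea, not the authors' open-source implementation; the two toy simulators, the hand-crafted summary statistics, and the small classifier are placeholder assumptions chosen only to make the example self-contained.

```python
# Minimal sketch of amortized Bayesian model comparison (illustration only,
# not the authors' implementation): a classifier trained on simulated data
# approximates posterior model probabilities for new data sets.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(2024)

def simulate_model_0(n_obs=50):
    # Placeholder model 0: standard normal observations
    return rng.normal(0.0, 1.0, size=n_obs)

def simulate_model_1(n_obs=50):
    # Placeholder model 1: heavy-tailed Student-t observations
    return rng.standard_t(df=3, size=n_obs)

def make_batch(batch_size=128, n_obs=50):
    # Uniform prior over models, then simulate one data set per draw
    labels = rng.integers(0, 2, size=batch_size)
    data = np.stack([simulate_model_0(n_obs) if m == 0 else simulate_model_1(n_obs)
                     for m in labels])
    # Fixed summary statistics stand in for a learned summary network
    summaries = np.stack([data.mean(1), data.std(1), np.abs(data).max(1)], axis=1)
    return (torch.tensor(summaries, dtype=torch.float32),
            torch.tensor(labels, dtype=torch.long))

classifier = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):                      # simulation-based training loop
    x, y = make_batch()
    optimizer.zero_grad()
    loss_fn(classifier(x), y).backward()
    optimizer.step()

# Amortized inference: one forward pass yields approximate posterior
# model probabilities for a new (here: simulated) data set.
x_new, _ = make_batch(batch_size=1)
with torch.no_grad():
    probs = torch.softmax(classifier(x_new), dim=1)
print("Approximate posterior model probabilities:", probs.numpy().round(3))
```

Because training touches only simulated data, the same loop can be rerun for calibration checks before any real-data application, which is what the abstract refers to as fast performance validation prior to fitting real data.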
Pages: 30