Annealed Importance Sampling for Neural Mass Models

Cited by: 7
Authors
Penny, Will [1]
Sengupta, Biswa [1]
Institutions
[1] UCL, Wellcome Trust Ctr Neuroimaging, London, England
Funding
Engineering and Physical Sciences Research Council (UK); Wellcome Trust (UK)
Keywords
COMPUTING BAYES FACTORS; MARGINAL LIKELIHOOD; METROPOLIS; RESPONSES;
DOI
10.1371/journal.pcbi.1004797
Chinese Library Classification
Q5 [Biochemistry]
Subject classification codes
071010; 081704
Abstract
Neural Mass Models provide a compact description of the dynamical activity of cell populations in neocortical regions. Moreover, models of regional activity can be connected together into networks, and inferences made about the strength of connections, using M/EEG data and Bayesian inference. To date, however, Bayesian methods have been largely restricted to the Variational Laplace (VL) algorithm which assumes that the posterior distribution is Gaussian and finds model parameters that are only locally optimal. This paper explores the use of Annealed Importance Sampling (AIS) to address these restrictions. We implement AIS using proposals derived from Langevin Monte Carlo (LMC) which uses local gradient and curvature information for efficient exploration of parameter space. In terms of the estimation of Bayes factors, VL and AIS agree about which model is best but report different degrees of belief. Additionally, AIS finds better model parameters and we find evidence of non-Gaussianity in their posterior distribution.
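The abstract describes estimating the marginal likelihood (model evidence) with Annealed Importance Sampling: particles are drawn from the prior, moved through a sequence of tempered distributions interpolating between prior and posterior, and importance weights are accumulated across temperatures. The following is a minimal illustrative sketch of that scheme on a toy 1D Gaussian model whose evidence is known in closed form; it is not the paper's implementation, and it uses a plain random-walk Metropolis transition rather than the Langevin Monte Carlo proposals the paper employs.

```python
import numpy as np

def log_prior(x):
    # Standard normal prior N(0, 1)
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_like(x):
    # Gaussian likelihood for a single observation y = 2 with unit noise
    return -0.5 * (2.0 - x)**2 - 0.5 * np.log(2 * np.pi)

def ais_log_evidence(n_particles=2000, n_temps=50, step=0.5, seed=0):
    """AIS estimate of log Z with a linear temperature schedule and
    one random-walk Metropolis move per temperature."""
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = rng.standard_normal(n_particles)         # exact draws from the prior (beta = 0)
    logw = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Importance-weight update: f_b(x) / f_{b_prev}(x) = like(x)^(b - b_prev)
        logw += (b - b_prev) * log_like(x)
        # Metropolis transition leaving prior(x) * like(x)^b invariant
        prop = x + step * rng.standard_normal(n_particles)
        log_acc = (log_prior(prop) + b * log_like(prop)
                   - log_prior(x) - b * log_like(x))
        accept = np.log(rng.random(n_particles)) < log_acc
        x = np.where(accept, prop, x)
    # log of the mean importance weight, computed stably (log-sum-exp)
    m = logw.max()
    return m + np.log(np.mean(np.exp(logw - m)))
```

For this conjugate toy problem the exact log evidence is log N(2; 0, 2) = -1 - 0.5 log(4*pi), so the AIS estimate can be checked directly; in the paper's setting the same weight average yields the Bayes factors compared against Variational Laplace.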
Pages: 25