Neural Sampling in Hierarchical Exponential-family Energy-based Models

Cited: 0
Authors
Dong, Xingsi [1 ,2 ,3 ]
Wu, Si [1 ,2 ,3 ]
Affiliations
[1] Acad Adv Interdisciplinary Studies, PKU Tsinghua Ctr Life Sci, Beijing, Peoples R China
[2] Peking Univ, Sch Psychol & Cognit Sci, Beijing Key Lab Behav & Mental Hlth, Beijing, Peoples R China
[3] Peking Univ, Ctr Quantitat Biol, IDG McGovern Inst Brain Res, Beijing, Peoples R China
Keywords
RECEPTIVE-FIELDS; BAYESIAN-INFERENCE; CUE INTEGRATION; VISUAL-CORTEX; MACAQUE V1; BRAIN; REPRESENTATIONS; ALGORITHM; GAP; V2;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Bayesian brain theory suggests that the brain employs generative models to understand the external world. The sampling-based perspective posits that the brain infers the posterior distribution through samples of stochastic neuronal responses. Additionally, the brain continually updates its generative model to approach the true distribution of the external world. In this study, we introduce the Hierarchical Exponential-family Energy-based (HEE) model, which captures the dynamics of inference and learning. In the HEE model, we decompose the partition function into individual layers and leverage a group of neurons with shorter time constants to sample the gradient of the decomposed normalization term. This allows our model to estimate the partition function and perform inference simultaneously, circumventing the negative phase encountered in conventional energy-based models (EBMs). As a result, the learning process is localized in both time and space, and the model converges easily. To match the brain's rapid computation, we demonstrate that neural adaptation can serve as a momentum term, significantly accelerating the inference process. On natural image datasets, our model exhibits representations akin to those observed in the biological visual system. Furthermore, for the machine learning community, our model can generate observations through joint or marginal generation. We show that marginal generation outperforms joint generation and achieves performance on par with other EBMs.
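The abstract's claim that neural adaptation can act as a momentum term accelerating sampling-based inference can be illustrated with momentum-augmented Langevin dynamics on a toy energy function. The sketch below is not the authors' model: the energy, the step size `eta`, the friction `alpha`, and the function names are all illustrative assumptions. A velocity variable accumulates past gradients (playing the role the abstract assigns to adaptation), and setting `alpha = 1` recovers plain overdamped Langevin sampling.

```python
import numpy as np

def grad_energy(x):
    """Gradient of a toy quadratic energy E(x) = x**2 / 2, whose
    Boltzmann distribution exp(-E(x)) is the standard normal."""
    return x

def sample_with_momentum(n_steps=5000, eta=0.1, alpha=0.3, seed=0):
    """Langevin-style sampler with a momentum (adaptation-like) term.

    The velocity v integrates past gradients with friction alpha,
    which speeds up mixing; alpha = 1 removes the momentum and
    reduces the update to plain overdamped Langevin dynamics.
    """
    rng = np.random.default_rng(seed)
    x, v = 0.0, 0.0
    xs = np.empty(n_steps)
    for t in range(n_steps):
        # Noise scale 2*alpha*eta keeps exp(-E) as the (approximate)
        # stationary distribution for small step sizes.
        noise = rng.normal(0.0, np.sqrt(2.0 * alpha * eta))
        v = (1.0 - alpha) * v - eta * grad_energy(x) + noise
        x = x + v
        xs[t] = x
    return xs

xs = sample_with_momentum()
print(f"mean={xs.mean():.3f}, std={xs.std():.3f}")
```

For this quadratic energy the sample mean and standard deviation should land near 0 and 1, the moments of the target standard normal; the momentum shortens the chain's autocorrelation time relative to the overdamped (`alpha = 1`) version, which is the acceleration effect the abstract attributes to adaptation.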
Pages: 14