Neural Sampling in Hierarchical Exponential-family Energy-based Models

Cited by: 0
Authors
Dong, Xingsi [1 ,2 ,3 ]
Wu, Si [1 ,2 ,3 ]
Affiliations
[1] Acad Adv Interdisciplinary Studies, PKU Tsinghua Ctr Life Sci, Beijing, Peoples R China
[2] Peking Univ, Sch Psychol & Cognit Sci, Beijing Key Lab Behav & Mental Hlth, Beijing, Peoples R China
[3] Peking Univ, Ctr Quantitat Biol, IDG McGovern Inst Brain Res, Beijing, Peoples R China
Keywords
RECEPTIVE-FIELDS; BAYESIAN-INFERENCE; CUE INTEGRATION; VISUAL-CORTEX; MACAQUE V1; BRAIN; REPRESENTATIONS; ALGORITHM; GAP; V2;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Bayesian brain theory suggests that the brain employs generative models to understand the external world. The sampling-based perspective posits that the brain infers the posterior distribution through samples of stochastic neuronal responses. Additionally, the brain continually updates its generative model to approach the true distribution of the external world. In this study, we introduce the Hierarchical Exponential-family Energy-based (HEE) model, which captures the dynamics of inference and learning. In the HEE model, we decompose the partition function into individual layers and leverage a group of neurons with shorter time constants to sample the gradient of the decomposed normalization term. This allows our model to estimate the partition function and perform inference simultaneously, circumventing the negative phase encountered in conventional energy-based models (EBMs). As a result, the learning process is localized in both time and space, and the model converges easily. To match the brain's rapid computation, we demonstrate that neural adaptation can serve as a momentum term, significantly accelerating the inference process. On natural image datasets, our model exhibits representations akin to those observed in the biological visual system. Furthermore, for the machine learning community, our model can generate observations through joint or marginal generation. We show that marginal generation outperforms joint generation and achieves performance on par with other EBMs.
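The abstract's claim that neural adaptation can act as a momentum term accelerating sampling-based inference can be illustrated with a generic example. The sketch below is not the authors' HEE model: it is minimal underdamped Langevin sampling from a toy one-dimensional energy (a standard Gaussian, `E(z) = z**2 / 2`), where the velocity variable `m` plays the momentum role the paper attributes to adaptation. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_energy(z):
    # Gradient of the toy energy E(z) = z**2 / 2, i.e. a standard
    # Gaussian target. In a layered EBM this gradient would come from
    # the layer's energy terms instead.
    return z

def langevin_sample(n_steps=20000, dt=0.05, friction=1.0):
    """Underdamped (momentum-augmented) Langevin sampling.

    The velocity m accumulates past gradients with damping, which is
    the generic momentum mechanism; overdamped Langevin is recovered
    in the high-friction limit.
    """
    z, m = 0.0, 0.0
    out = np.empty(n_steps)
    for t in range(n_steps):
        noise = rng.normal(0.0, np.sqrt(2.0 * friction * dt))
        m = (1.0 - friction * dt) * m - dt * grad_energy(z) + noise
        z = z + dt * m
        out[t] = z
    return out

samples = langevin_sample()
burned = samples[2000:]  # discard burn-in before computing statistics
print(f"mean={burned.mean():.2f} std={burned.std():.2f}")  # roughly N(0, 1)
```

After burn-in, the empirical mean and standard deviation approximate the target Gaussian, confirming that the momentum variable does not change the stationary distribution, only the speed of mixing.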
Pages: 14