Neural Sampling in Hierarchical Exponential-family Energy-based Models

Cited by: 0
|
Authors
Dong, Xingsi [1 ,2 ,3 ]
Wu, Si [1 ,2 ,3 ]
Affiliations
[1] Acad Adv Interdisciplinary Studies, PKU Tsinghua Ctr Life Sci, Beijing, Peoples R China
[2] Peking Univ, Sch Psychol & Cognit Sci, Beijing Key Lab Behav & Mental Hlth, Beijing, Peoples R China
[3] Peking Univ, Ctr Quantitat Biol, IDG McGovern Inst Brain Res, Beijing, Peoples R China
Keywords
RECEPTIVE-FIELDS; BAYESIAN-INFERENCE; CUE INTEGRATION; VISUAL-CORTEX; MACAQUE V1; BRAIN; REPRESENTATIONS; ALGORITHM; GAP; V2;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Bayesian brain theory suggests that the brain employs generative models to understand the external world. The sampling-based perspective posits that the brain infers the posterior distribution through samples of stochastic neuronal responses. Additionally, the brain continually updates its generative model so that it approaches the true distribution of the external world. In this study, we introduce the Hierarchical Exponential-family Energy-based (HEE) model, which captures the dynamics of both inference and learning. In the HEE model, we decompose the partition function into individual layers and leverage a group of neurons with shorter time constants to sample the gradient of the decomposed normalization term. This allows our model to estimate the partition function and perform inference simultaneously, circumventing the negative phase encountered in conventional energy-based models (EBMs). As a result, the learning process is localized in both time and space, and the model converges easily. To match the brain's rapid computation, we demonstrate that neural adaptation can serve as a momentum term, significantly accelerating the inference process. On natural image datasets, our model exhibits representations akin to those observed in the biological visual system. Furthermore, for the machine learning community, our model can generate observations through joint or marginal generation. We show that marginal generation outperforms joint generation and achieves performance on par with other EBMs.
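The momentum-accelerated sampling idea in the abstract can be illustrated on a toy problem. The sketch below is not the paper's implementation; it shows the generic mechanism on a quadratic (Gaussian) energy: a Langevin-style sampler whose gradient is low-pass filtered by a slow "adaptation" variable `m`, playing the role the abstract assigns to neural adaptation as a momentum term. All names (`energy_grad`, `momentum_langevin`, the step size `lr` and filter constant `beta`) are illustrative assumptions.

```python
import numpy as np

def energy_grad(z, mu, prec):
    """Gradient of the quadratic energy E(z) = 0.5 * (z - mu)^T P (z - mu)."""
    return prec @ (z - mu)

def momentum_langevin(mu, prec, steps=500, lr=0.05, beta=0.9, noise=0.05, seed=0):
    """Langevin-style sampler with a slow adaptation (momentum) variable.

    The adaptation state m is an exponential moving average of the energy
    gradient; the sample z descends along m while injected Gaussian noise
    keeps the dynamics stochastic, as in sampling-based inference.
    """
    rng = np.random.default_rng(seed)
    z = np.zeros_like(mu)   # fast "response" variable being sampled
    m = np.zeros_like(mu)   # slow adaptation state acting as momentum
    for _ in range(steps):
        g = energy_grad(z, mu, prec)
        m = beta * m + (1 - beta) * g               # low-pass filter the gradient
        z = z - lr * m + noise * rng.standard_normal(z.shape)
    return z

# After enough steps the sample hovers near the posterior mode mu.
mu = np.array([1.0, -2.0])
prec = np.array([[2.0, 0.3], [0.3, 1.0]])
sample = momentum_langevin(mu, prec)
```

With `beta = 0` the update reduces to plain Langevin dynamics; a nonzero `beta` smooths the gradient over recent history, which is the sense in which an adaptation variable can accelerate convergence of the sampler.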
Pages: 14