Entropy, mutual information, and systematic measures of structured spiking neural networks

Cited by: 2
Authors
Li, Wenjie [1 ]
Li, Yao [2 ]
Affiliations
[1] Washington Univ, Dept Math & Stat, St Louis, MO 63130 USA
[2] Univ Massachusetts, Dept Math & Stat, Amherst, MA 01002 USA
Keywords
Neural field models; Entropy; Mutual information; Degeneracy; Complexity; DEGENERACY; COMPLEXITY; CONNECTIONS; MODEL
DOI
10.1016/j.jtbi.2020.110310
Chinese Library Classification
Q [Biological Sciences];
Discipline Classification Codes
07; 0710; 09;
Abstract
The aim of this paper is to investigate various information-theoretic measures, including entropy, mutual information, and some systematic measures based on mutual information, for a class of structured spiking neuronal networks. To analyze and compute these measures for large networks, we coarse-grain the data by ignoring the order of spikes that fall into the same small time bin. The resulting coarse-grained entropy mainly captures the information contained in the rhythm produced by a local population of the network. We first show that these information-theoretic measures are well defined and computable by proving stochastic stability and a law of large numbers. We then use three neuronal network examples, from simple to complex, to investigate these measures. Several analytical and computational results on the properties of these information-theoretic measures are given. (C) 2020 Elsevier Ltd. All rights reserved.
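The coarse-graining step described in the abstract, binning spike times and keeping only the spike count per small time bin while discarding within-bin order, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the bin width, the Poisson example input, and the plug-in entropy estimator are all assumptions for the sketch.

```python
import numpy as np
from collections import Counter

def coarse_grain(spike_times, t_max, bin_width):
    """Bin spike times into counts per bin, discarding within-bin spike order."""
    n_bins = int(np.ceil(t_max / bin_width))
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, t_max))
    return counts

def plugin_entropy(symbols):
    """Plug-in (empirical) Shannon entropy, in bits, of a symbol sequence."""
    freq = Counter(symbols)
    n = len(symbols)
    p = np.array([c / n for c in freq.values()])
    return float(-np.sum(p * np.log2(p)))

# Hypothetical example: a homogeneous Poisson spike train,
# coarse-grained into 10 ms bins over a 100 s recording.
rng = np.random.default_rng(0)
t_max, rate = 100.0, 2.0                     # seconds, spikes per second
n_spikes = rng.poisson(rate * t_max)
spikes = np.sort(rng.uniform(0.0, t_max, n_spikes))
counts = coarse_grain(spikes, t_max, bin_width=0.01)
print(f"entropy of binned spike counts: {plugin_entropy(counts.tolist()):.3f} bits")
```

The plug-in estimator is the simplest choice and is biased for small samples; the paper's law-of-large-numbers result is what justifies estimating such quantities from long trajectories in the first place.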
Pages: 15