Linking neural responses to behavior with information-preserving population vectors

Cited by: 7
Authors
Sharpee, Tatyana O. [1 ]
Berkowitz, John A. [1 ]
Affiliations
[1] Salk Inst Biol Studies, Computat Neurobiol Lab, 10010 North Torrey Pines Rd, La Jolla, CA 92037 USA
Funding
U.S. National Science Foundation
Keywords
TRANSMISSION; VARIABILITY; ORIENTATION;
DOI
10.1016/j.cobeha.2019.03.004
Chinese Library Classification (CLC)
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology];
Subject classification codes
03; 0303; 030303; 04; 0402;
Abstract
All signal-processing systems, whether artificial or biological, must obey fundamental statistical laws governing how information can be processed. We discuss recent results from information theory that provide a blueprint for building circuits whose signals can be read out without information loss. Many of the properties required to build information-preserving circuits are in fact observed in real neurons, at least approximately. One such property is the use of a logistic nonlinearity to relate inputs to neural response probability; nonlinearities of this kind are common in both neural and intracellular networks. With this type of nonlinearity, there exists a linear combination of neural responses that is guaranteed to preserve the Shannon information contained in the response of a neural population, no matter how many neurons it contains. This read-out measure is related to a classic quantity known as the population vector, which has been quite successful in relating neural responses to animal behavior in a wide variety of cases. Nevertheless, the population vector has not withstood the scrutiny of detailed information-theoretic analyses, which showed that it discards substantial amounts of the information contained in the responses of a neural population. We discuss recent theoretical results showing how to modify the population vector expression to make it 'information-preserving', and what is required of neural circuit organization to allow lossless information transfer. Implementing these strategies in artificial systems is likely to increase their efficiency, especially for brain-machine interfaces.
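The abstract's central claim, that a logistic input-output nonlinearity admits a linear, lossless read-out, can be illustrated with a short simulation. The Python sketch below is not the paper's derivation; it assumes a hypothetical population of conditionally independent binary neurons whose spike probability is a logistic function of a weighted stimulus, P(r_i = 1 | s) = sigmoid(k_i . s - b_i). Under that assumption, the log-likelihood of the stimulus depends on the responses only through the sum of r_i k_i, so that particular linear combination retains all of the Shannon information the population carries about s, whereas the classic population vector (responses weighted by unit preferred directions) generally does not. All names and parameters (K, b, simulate_responses, lossless_linear_readout) are illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: N conditionally independent binary neurons with
# logistic tuning, P(r_i = 1 | s) = sigmoid(k_i . s - b_i).
N, D = 50, 2                   # number of neurons, stimulus dimensions
K = rng.normal(size=(N, D))    # per-neuron weight vectors k_i (illustrative)
b = rng.normal(size=N)         # per-neuron thresholds (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_responses(s):
    """Draw one binary response vector r given stimulus s."""
    p = sigmoid(K @ s - b)
    return (rng.random(N) < p).astype(float)

def classic_population_vector(r):
    # Classic read-out: responses weighted by unit-norm preferred directions;
    # this generally discards some information about the stimulus.
    units = K / np.linalg.norm(K, axis=1, keepdims=True)
    return r @ units

def lossless_linear_readout(r):
    # With logistic tuning and conditional independence, the log-likelihood
    # log P(r | s) = sum_i [ r_i (k_i . s - b_i) - log(1 + exp(k_i . s - b_i)) ]
    # depends on r (as a function of s) only through sum_i r_i k_i, so this
    # linear combination keeps all Shannon information about s.
    return r @ K

s = np.array([0.8, -0.3])      # example stimulus
r = simulate_responses(s)
print("classic population vector:", classic_population_vector(r))
print("lossless linear read-out :", lossless_linear_readout(r))

Whether this particular weighting matches the correction proposed in the paper should be checked against the paper itself; the sketch only demonstrates that, given logistic nonlinearities, a linear read-out can in principle be lossless.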
Pages: 37-44
Number of pages: 8