Jointly efficient encoding and decoding in neural populations

Times Cited: 0
Authors
Malerba, Simone Blanco [1 ,2 ]
Micheli, Aurora [1 ]
Woodford, Michael [3 ]
da Silveira, Rava Azeredo [1 ,4 ,5 ]
Affiliations
[1] Univ Paris, Univ PSL, Sorbonne Univ, Lab Phys Ecole Normale Super, ENS, CNRS, Paris, France
[2] Univ Med Ctr Hamburg Eppendorf, Inst Neural Informat Proc, Ctr Mol Neurobiol, Hamburg, Germany
[3] Columbia Univ, Dept Econ, New York, NY USA
[4] Inst Mol & Clin Ophthalmol Basel, Basel, Switzerland
[5] Univ Basel, Fac Sci, Basel, Switzerland
Keywords
Bayesian inference; Fisher information; mutual information; mixtures; model; code
DOI
10.1371/journal.pcbi.1012240
Chinese Library Classification (CLC)
Q5 [Biochemistry]
Subject Classification Codes
071010; 081704
Abstract
The efficient coding approach proposes that neural systems represent as much sensory information as biological constraints allow; it aims to formalize encoding as a process of constrained optimization. A different approach, which aims to formalize decoding, proposes that neural systems instantiate a generative model of the sensory world. Here, we put forth a normative framework that characterizes neural systems as jointly optimizing encoding and decoding. It takes the form of a variational autoencoder: sensory stimuli are encoded in the noisy activity of neurons, to be interpreted by a flexible decoder; encoding must allow for an accurate stimulus reconstruction from neural activity. Jointly, neural activity is required to represent the statistics of latent features, which the decoder maps into distributions over sensory stimuli; decoding correspondingly optimizes the accuracy of the generative model. This framework yields a family of encoding-decoding models, all resulting in equally accurate generative models, indexed by a measure of the stimulus-induced deviation of neural activity from the marginal distribution over neural activity. Each member of this family predicts how properties of the sensory neurons, such as the arrangement of the tuning-curve means (preferred stimuli) and widths (degrees of selectivity) in the population, depend on the statistics of the sensory world. Our approach thus generalizes the efficient coding approach. Notably, here, the form of the constraint on the optimization derives from the requirement of an accurate generative model, whereas it is arbitrary in efficient coding models. Moreover, solutions do not require knowledge of the stimulus distribution but are learned from data samples; the constraint further acts as a regularizer, allowing the model to generalize beyond the training data. Finally, we characterize the family of models we obtain through alternative measures of performance, such as the error in stimulus reconstruction. We find that a range of models achieves comparable performance; in particular, a population of sensory neurons with broad tuning curves, as observed experimentally, yields both a low stimulus reconstruction error and an accurate generative model that generalizes robustly to unseen data.

Our brain represents the sensory world in the activity of populations of neurons. Two theories have addressed the nature of these representations. The first, efficient coding, posits that neurons encode as much information as possible about sensory stimuli, subject to resource constraints such as limits on energy consumption. The second, generative modeling, focuses on decoding and is organized around the idea that neural activity plays the role of a latent variable from which sensory stimuli can be simulated. Our work subsumes the two approaches in a unifying framework based on the mathematics of variational autoencoders. Unlike in efficient coding, which assumes full knowledge of the stimulus statistics, here representations are learned from examples, in a joint optimization of encoding and decoding. This new framework yields a range of optimal representations, corresponding to different models of neural selectivity and reconstruction performance, depending on the resource constraint. The form of the constraint is not arbitrary but derives from the optimization framework, and its strength tunes the ability of the model to generalize beyond the training examples. Central to the approach, and to the nature of the representations it implies, is the interplay of encoding and decoding, itself central to brain processing.
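To make the structure of the objective concrete, below is a minimal NumPy sketch of the two quantities the abstract says are traded off: the error in reconstructing a stimulus from noisy population activity, and a penalty measuring the stimulus-induced deviation of neural activity from its marginal distribution, combined with a weight beta in the style of a beta-VAE. Everything in the sketch (Gaussian tuning curves, isotropic Gaussian response noise, a linear least-squares decoder, and all parameter values) is an illustrative assumption, not the paper's actual model or notation.

import numpy as np

rng = np.random.default_rng(0)
n_neurons, width, sigma, beta = 12, 0.15, 0.1, 0.5
centers = np.linspace(0.0, 1.0, n_neurons)  # hypothetical preferred stimuli (tuning-curve means)

def encode(s):
    """Mean population response to stimulus s (Gaussian tuning curves) and a noisy sample of it."""
    mean = np.exp(-0.5 * ((s - centers) / width) ** 2)
    return mean, mean + sigma * rng.standard_normal(n_neurons)

# Sample stimuli and the corresponding mean and noisy population responses.
stimuli = rng.uniform(0.0, 1.0, size=500)
pairs = [encode(s) for s in stimuli]
means = np.array([m for m, _ in pairs])
responses = np.array([r for _, r in pairs])

# Stand-in for the "flexible decoder": a linear readout fit by least squares.
design = np.hstack([responses, np.ones((len(stimuli), 1))])
coef, *_ = np.linalg.lstsq(design, stimuli, rcond=None)
reconstruction = design @ coef
recon_error = np.mean((reconstruction - stimuli) ** 2)

# Penalty term: with isotropic Gaussian noise, and approximating the marginal response
# distribution by a Gaussian with the same covariance, KL(q(r|s) || q(r)) reduces to a
# squared distance between the conditional and marginal mean responses.
marginal_mean = means.mean(axis=0)
kl_penalty = np.mean(np.sum((means - marginal_mean) ** 2, axis=1)) / (2.0 * sigma ** 2)

loss = recon_error + beta * kl_penalty  # beta-VAE-style trade-off (evaluated here, not optimized)
print(f"reconstruction error: {recon_error:.4f}   KL penalty: {kl_penalty:.2f}   loss: {loss:.2f}")

Optimizing the tuning-curve centers and widths under such a loss, for different values of beta, would trace out a family of encoding-decoding models of the kind the abstract describes; in this sketch the two terms are only evaluated on a fixed encoder.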
Pages: 32