Theories of Error Back-Propagation in the Brain

Cited: 227
Authors
Whittington, James C. R. [1 ,2 ]
Bogacz, Rafal [1 ]
Affiliations
[1] Univ Oxford, Nuffield Dept Clin Neurosci, MRC Brain Network Dynam Unit, Oxford OX3 9DU, England
[2] Univ Oxford, Wellcome Ctr Integrat Neuroimaging, Ctr Funct Magnet Resonance Imaging Brain, Oxford OX3 9DU, England
Funding
UK Medical Research Council; UK Engineering and Physical Sciences Research Council
Keywords
SYNAPTIC PLASTICITY; LEARNING ALGORITHM; NEURAL-NETWORKS; SITES; REINFORCEMENT; REPRESENTATIONS; APPROXIMATION; FEEDFORWARD; INHIBITION; ACTIVATION;
DOI
10.1016/j.tics.2018.12.005
Chinese Library Classification
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology]
Discipline codes
03; 0303; 030303; 04; 0402
Abstract
This review article summarises recently proposed theories of how neural circuits in the brain could approximate the error back-propagation algorithm used to train artificial neural networks. Computational models implementing these theories learn as efficiently as artificial neural networks, yet they rely on simple synaptic plasticity rules based on the activity of presynaptic and postsynaptic neurons. The models share key features, such as including both feedforward and feedback connections that allow information about error to propagate throughout the network, and they incorporate experimental evidence on neural connectivity, responses, and plasticity. These models provide insights into how brain networks might be organised so that modifying synaptic weights at multiple levels of the cortical hierarchy leads to improved task performance.
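The observation underlying the theories the abstract summarises is that each back-propagation weight update already has a locally expressible form: a product of presynaptic activity and a postsynaptic error signal, with the hard part being how the brain could feed errors backward. A minimal sketch of this factorisation (not code from the paper; network sizes, the logistic nonlinearity, and the learning rate are illustrative assumptions):

```python
import numpy as np

# Hedged sketch: a two-layer network trained by back-propagation, written so
# that every weight change is (presynaptic activity) x (postsynaptic error).
# Feeding e2 back through W2.T is the step that biological models must
# approximate with feedback connections.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

W1 = rng.normal(0.0, 0.5, (4, 3))  # hidden x input weights
W2 = rng.normal(0.0, 0.5, (2, 4))  # output x hidden weights

x = rng.random(3)                  # presynaptic input activity
t = np.array([0.0, 1.0])           # target output pattern

lr = 0.1
for _ in range(500):
    h = sigmoid(W1 @ x)            # hidden-layer activity
    y = sigmoid(W2 @ h)            # output-layer activity

    # Postsynaptic error terms: the top layer's error comes from the target;
    # the hidden layer's error is the top error propagated back through W2.
    e2 = (t - y) * y * (1.0 - y)
    e1 = (W2.T @ e2) * h * (1.0 - h)

    # Each update is presynaptic activity times postsynaptic error.
    W2 += lr * np.outer(e2, h)
    W1 += lr * np.outer(e1, x)

print(np.round(sigmoid(W2 @ sigmoid(W1 @ x)), 2))  # output approaches the target
```

The point of writing the updates as outer products is that each synapse needs only quantities available at its two terminals, which is what makes the rule "simple" in the sense the abstract uses; the non-local ingredient is the backward pass computing `e1`.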
Pages: 235-250
Page count: 16