A fast training method for memristor crossbar based multi-layer neural networks

Cited by: 0
Authors
Raqibul Hasan
Tarek M. Taha
Chris Yakopcic
Affiliations
[1] University of Dayton, Department of Electrical and Computer Engineering
Keywords
Neural networks; Memristor crossbars; Training algorithm; On-chip training
DOI: not available
Abstract
Memristor crossbar arrays perform multiply–add operations, the dominant computation in neural network applications, in parallel in the analog domain. On-chip training of memristor neural network systems has the significant advantage of being able to work around device variability and faults. This paper presents a novel technique for on-chip training of multi-layer neural networks implemented using a single crossbar per layer and two memristors per synapse. Using two memristors per synapse doubles the synaptic weight precision compared to a design that uses only one memristor per synapse. The proposed system utilizes a novel variant of the back-propagation (BP) algorithm to reduce both circuit area and training time. During training, all the memristors in a crossbar are updated in parallel in four steps. We evaluated the training of the proposed system on several nonlinearly separable datasets through detailed SPICE simulations that take crossbar wire resistance and sneak paths into consideration. The proposed training algorithm learned the nonlinearly separable functions with only a slight loss in accuracy compared to training with the traditional BP algorithm.
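The two-memristor synapse idea from the abstract can be sketched in software: a signed weight is stored as the difference of two conductances, g_pos − g_neg, which is what doubles the usable weight range compared to a single device. The sketch below is illustrative only, not the paper's actual circuit or training algorithm; the conductance bounds and scale factor are assumed values.

```python
import numpy as np

# Assumed memristor conductance range (siemens); not values from the paper.
G_MIN, G_MAX = 1e-6, 1e-4
# Maps a full-range conductance difference onto a unit weight magnitude.
SCALE = 1.0 / (G_MAX - G_MIN)

def decode(g_pos, g_neg):
    """Signed synaptic weight represented by a differential conductance pair."""
    return SCALE * (g_pos - g_neg)

def encode(w):
    """Split a signed weight in [-1, 1] into a (g_pos, g_neg) pair,
    clipped so both conductances stay inside the device range."""
    mid = (G_MAX + G_MIN) / 2.0
    half = (G_MAX - G_MIN) / 2.0
    delta = np.clip(w / SCALE / 2.0, -half, half)
    return mid + delta, mid - delta

def crossbar_forward(x, g_pos, g_neg):
    """One crossbar layer: analog-style multiply-add of the inputs with
    the decoded weights, followed by a sigmoid neuron activation."""
    return 1.0 / (1.0 + np.exp(-(x @ decode(g_pos, g_neg))))

# Example: a 2-input, 1-output layer with weights +0.5 and -0.25.
g_pos, g_neg = encode(np.array([0.5, -0.25]))
y = crossbar_forward(np.array([1.0, 1.0]), g_pos, g_neg)
```

In a hardware implementation the weight update would be applied as programming pulses to the memristor pairs (the paper's four-step parallel update), rather than by rewriting floating-point values as this sketch does.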
Pages: 443–454
Number of pages: 11