A fast training method for memristor crossbar based multi-layer neural networks

Cited by: 0
Authors
Raqibul Hasan
Tarek M. Taha
Chris Yakopcic
Affiliations
[1] University of Dayton, Department of Electrical and Computer Engineering
Keywords
Neural networks; Memristor crossbars; Training algorithm; On-chip training;
DOI
Not available
Abstract
Memristor crossbar arrays perform multiply-add operations, the dominant operation in neural network applications, in parallel in the analog domain. On-chip training of memristor neural network systems has the significant advantage of being able to work around device variability and faults. This paper presents a novel technique for on-chip training of multi-layer neural networks implemented using a single crossbar per layer and two memristors per synapse. Using two memristors per synapse doubles the synaptic weight precision compared to a design that uses only one memristor per synapse. The proposed system uses a novel variant of the back-propagation (BP) algorithm to reduce both circuit area and training time. During training, all the memristors in a crossbar are updated in parallel in four steps. We evaluated the training of the proposed system on several nonlinearly separable datasets through detailed SPICE simulations that account for crossbar wire resistance and sneak paths. The proposed training algorithm learned the nonlinearly separable functions with only a slight loss in accuracy compared to training with the traditional BP algorithm.
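The differential two-memristors-per-synapse scheme described in the abstract can be illustrated with a small numerical model. The sketch below is a hypothetical, idealized software illustration, not the paper's circuit or its exact four-step update procedure: each weight is the difference of two conductances, the forward pass models the crossbar's analog multiply-add as an ideal matrix-vector product, and a sign-split update (positive deltas raise one conductance array, negative deltas raise the other) stands in for the parallel crossbar programming steps. All class and parameter names (`DifferentialCrossbarLayer`, `g_min`, `g_max`, the update rule) are assumptions introduced for this illustration.

```python
import numpy as np

class DifferentialCrossbarLayer:
    """Idealized model of one crossbar layer with two memristors per synapse.

    This is an illustrative assumption-laden sketch: real devices have
    nonlinear, bounded conductance dynamics that a SPICE simulation (as used
    in the paper) would capture and this simple model does not.
    """

    def __init__(self, n_in, n_out, g_min=0.0, g_max=1.0, rng=None):
        rng = rng or np.random.default_rng(0)
        self.g_min, self.g_max = g_min, g_max
        # Two conductance matrices, one per memristor of each synapse pair.
        self.g_plus = rng.uniform(g_min, g_max, (n_in, n_out))
        self.g_minus = rng.uniform(g_min, g_max, (n_in, n_out))

    def weights(self):
        # Effective synaptic weight is the conductance difference, so a
        # single pair can represent both positive and negative weights.
        return self.g_plus - self.g_minus

    def forward(self, x):
        # The analog crossbar computes this multiply-add in parallel;
        # here it is modeled as an ideal matrix-vector product.
        return x @ self.weights()

    def apply_update(self, delta_w, lr=0.1):
        # Sign-split update: positive deltas increase g_plus, negative
        # deltas increase g_minus (which lowers the effective weight).
        # Conductances are clipped to the device's operating range.
        step = lr * delta_w
        self.g_plus = np.clip(self.g_plus + np.maximum(step, 0.0),
                              self.g_min, self.g_max)
        self.g_minus = np.clip(self.g_minus + np.maximum(-step, 0.0),
                               self.g_min, self.g_max)
```

A uniformly positive weight-update matrix, for example, should never decrease any effective weight under this rule, since only `g_plus` entries move and they only move upward until they saturate at `g_max`.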
Pages: 443-454 (11 pages)
Related Papers (50 total)
  • [1] A fast training method for memristor crossbar based multi-layer neural networks
    Hasan, Raqibul
    Taha, Tarek M.
    Yakopcic, Chris
    [J]. ANALOG INTEGRATED CIRCUITS AND SIGNAL PROCESSING, 2017, 93 (03) : 443 - 454
  • [2] On-chip training of memristor crossbar based multi-layer neural networks
    Hasan, Raqibul
    Taha, Tarek M.
    Yakopcic, Chris
    [J]. MICROELECTRONICS JOURNAL, 2017, 66 : 31 - 40
  • [3] Learning Method for Ex-situ Training of Memristor Crossbar based Multi-Layer Neural Network
    Bala, Anu
    Adeyemo, Adedotun
    Yang, Xiaohan
    Jabir, Abusaleh
    [J]. 2017 9TH INTERNATIONAL CONGRESS ON ULTRA MODERN TELECOMMUNICATIONS AND CONTROL SYSTEMS AND WORKSHOPS (ICUMT), 2017, : 305 - 310
  • [4] Distributed Training for Multi-Layer Neural Networks by Consensus
    Liu, Bo
    Ding, Zhengtao
    Lv, Chen
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (05) : 1771 - 1778
  • [5] Stability analysis for memristor-based stochastic multi-layer neural networks with coupling disturbance
    Xiang, Jianglian
    Ren, Junwu
    Tan, Manchun
    [J]. CHAOS SOLITONS & FRACTALS, 2022, 165
  • [6] Memristor Based Neuromorphic Circuit for Ex-Situ Training of Multi-Layer Neural Network Algorithms
    Yakopcic, Chris
    Hasan, Raqibul
    Taha, Tarek M.
    [J]. 2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [7] Defect-Tolerant Crossbar Training of Memristor Ternary Neural Networks
    Pham, Khoa Van
    Nguyen, Tien Van
    Min, Kyeong-Sik
    [J]. 2019 26TH IEEE INTERNATIONAL CONFERENCE ON ELECTRONICS, CIRCUITS AND SYSTEMS (ICECS), 2019, : 486 - 489
  • [8] Fault tolerant multi-layer neural networks with GA training
    Sugawara, E
    Fukushi, M
    Horiguchi, S
    [J]. 18TH IEEE INTERNATIONAL SYMPOSIUM ON DEFECT AND FAULT TOLERANCE IN VLSI SYSTEMS, PROCEEDINGS, 2003, : 328 - 335
  • [9] Chaos and multi-layer attractors in asymmetric neural networks coupled with discrete fractional memristor
    He, Shaobo
    Vignesh, D.
    Rondoni, Lamberto
    Banerjee, Santo
    [J]. NEURAL NETWORKS, 2023, 167 : 572 - 587
  • [10] An Extreme Learning Machine Based Pretraining Method for Multi-Layer Neural Networks
    Noinongyao, Pavit
    Watchareeruetai, Ukrit
    [J]. 2018 JOINT 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS (SCIS) AND 19TH INTERNATIONAL SYMPOSIUM ON ADVANCED INTELLIGENT SYSTEMS (ISIS), 2018, : 608 - 613