Activity-difference training of deep neural networks using memristor crossbars

Cited: 0
Authors
Su-in Yi
Jack D. Kendall
R. Stanley Williams
Suhas Kumar
Affiliations
[1] Texas A&M University
[2] Rain Neuromorphics
[3] Sandia National Laboratories
Source
Nature Electronics | 2023 / Volume 6
Abstract
Artificial neural networks have rapidly progressed in recent years, but are limited by the high energy costs required to train them on digital hardware. Emerging analogue hardware, such as memristor arrays, could offer improved energy efficiencies. However, the widely used backpropagation training algorithms are generally incompatible with such hardware because of mismatches between the analytically calculated training information and the imprecision of actual analogue devices. Here we report activity-difference-based training on co-designed tantalum oxide analogue memristor crossbars. Our approach, which we term memristor activity-difference energy minimization, treats the network parameters as a constrained optimization problem, and numerically calculates local gradients via Hopfield-like energy minimization using behavioural differences in the hardware targeted by the training. We use the technique to train one-layer and multilayer neural networks that can classify Braille words with high accuracy. With modelling, we show that our approach can offer over four orders of magnitude energy advantage compared with digital approaches for scaled-up problem sizes.
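The core idea in the abstract — obtaining local gradients from the difference between two equilibrium states of a Hopfield-like energy, rather than from analytic backpropagation — can be illustrated numerically. The sketch below is not the authors' memristor implementation or their exact MADE algorithm; it assumes a toy linear network whose energy minima have closed form, so the "free" and "nudged" (target-clamped) equilibria can be computed exactly, and the weight update is driven purely by the activity difference between them. The function names (`settle_free`, `settle_nudged`, `read_crossbar`), the nudging strength `beta`, and the task are all illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def settle_free(W, x):
    # Free phase: minimise the Hopfield-like energy
    # E(s) = 0.5*||s||^2 - s.(W @ x); setting grad E = 0 gives s* = W @ x.
    return W @ x

def settle_nudged(W, x, y, beta):
    # Nudged phase: minimise E(s) + beta * 0.5*||s - y||^2, which pulls
    # the equilibrium slightly toward the target y; minimiser in closed form.
    return (W @ x + beta * y) / (1.0 + beta)

def read_crossbar(W, x, noise=0.01):
    # Analogue read: a crossbar computes W @ x via Ohm's and Kirchhoff's
    # laws; multiplicative noise stands in for real device imprecision.
    return (W * (1.0 + noise * rng.standard_normal(W.shape))) @ x

# Toy task (hypothetical): learn a 2x3 linear map from input/target pairs.
W_true = np.array([[1.0, -2.0, 0.5], [0.3, 0.8, -1.0]])
W = np.zeros((2, 3))
beta, lr = 0.1, 0.2

for step in range(2000):
    x = rng.standard_normal(3)
    y = W_true @ x
    s_free = settle_free(W, x)          # swap in read_crossbar(W, x) to
    s_nudge = settle_nudged(W, x, y, beta)  # emulate noisy hardware reads
    # Activity-difference update: the gap between the two equilibria,
    # scaled by 1/beta, estimates the local gradient (no backprop needed).
    W += (lr / beta) * np.outer(s_nudge - s_free, x)

print(np.max(np.abs(W - W_true)))  # residual error after training
```

In this linear case the update reduces algebraically to the delta rule, `lr/(1+beta) * (y - W @ x) x^T`, which is why it converges; the point of the construction is that the same two-equilibria recipe needs only measured network activity, so it remains applicable when the "settling" happens in imprecise analogue hardware.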
Pages: 45 - 51
Page count: 6
Related papers
50 records
  • [1] Activity-difference training of deep neural networks using memristor crossbars
    Yi, Su-in
    Kendall, Jack D.
    Williams, R. Stanley
    Kumar, Suhas
    [J]. NATURE ELECTRONICS, 2023, 6 (01) : 45 - 51
  • [2] Efficient Identification of Critical Faults in Memristor Crossbars for Deep Neural Networks
    Chen, Ching-Yuan
    Chakrabarty, Krishnendu
    [J]. PROCEEDINGS OF THE 2021 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE 2021), 2021, : 1074 - 1077
  • [3] Spectral Ranking in Complex Networks Using Memristor Crossbars
    Korkmaz, Anil
    Zoppo, Gianluca
    Marrone, Francesco
    Yi, Su-In
    Williams, R. Stanley
    Corinto, Fernando
    Palermo, Samuel
    [J]. IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS, 2023, 13 (01) : 357 - 370
  • [4] On-chip Training of Memristor Based Deep Neural Networks
    Hasan, Raqibul
    Taha, Tarek M.
    Yakopcic, Chris
    [J]. 2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 3527 - 3534
  • [5] Comparative Study on Quantization-Aware Training of Memristor Crossbars for Reducing Inference Power of Neural Networks at The Edge
    Nguyen, Tien Van
    An, Jiyong
    Min, Kyeong-Sik
    [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [6] Ex-situ training of large memristor crossbars for neural network applications
    Hasan, Raqibul
    Yakopcic, Chris
    Taha, Tarek M.
    [J]. ANALOG INTEGRATED CIRCUITS AND SIGNAL PROCESSING, 2019, 99 (01) : 1 - 10
  • [7] Synapse-Neuron-Aware Training Scheme of Defect-Tolerant Neural Networks with Defective Memristor Crossbars
    An, Jiyong
    Oh, Seokjin
    Nguyen, Tien Van
    Min, Kyeong-Sik
    [J]. MICROMACHINES, 2022, 13 (02)
  • [9] Ziksa: On-Chip Learning Accelerator with Memristor Crossbars for Multilevel Neural Networks
    Zyarah, Abdullah M.
    Soures, Nicholas
    Hays, Lydia
    Jacobs-Gedrim, Robin B.
    Agarwal, Sapan
    Marinella, Matthew
    Kudithipudi, Dhireesha
    [J]. 2017 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2017,
  • [10] Area-Efficient Mapping of Convolutional Neural Networks to Memristor Crossbars Using Sub-Image Partitioning
    Oh, Seokjin
    An, Jiyong
    Min, Kyeong-Sik
    [J]. MICROMACHINES, 2023, 14 (02)