Neuromorphic Computing Using Random Synaptic Feedback Weights for Error Backpropagation in NAND Flash Memory-Based Synaptic Devices

Cited by: 10
Authors
Lee, Sung-Tae [1 ]
Lee, Jong-Ho [2 ,3 ]
Affiliations
[1] Hongik Univ, Sch Elect & Elect Engn, Seoul 04066, South Korea
[2] Seoul Natl Univ, Dept Elect & Comp Engn, Seoul 08826, South Korea
[3] Seoul Natl Univ, Interuniv Semicond Res Ctr ISRC, Seoul 08826, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Backpropagation; Flash memories; Computer architecture; System-on-chip; Neural networks; Standards; Transistors; Hardware neural networks; in-memory computing; NAND flash memory; neuromorphic; on-chip learning; synaptic device;
DOI
10.1109/TED.2023.3237670
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline codes
0808; 0809;
Abstract
This work proposes utilizing a separate synaptic string array for error backpropagation in a NAND flash memory-based synaptic architecture with random synaptic feedback weights. To enable error backpropagation, forward and backward propagation are performed by separate synaptic devices in the forward and backward synaptic arrays, respectively. In addition, the synaptic weights in the forward synaptic array are updated at each iteration, while those in the backward synaptic array are kept fixed to reduce the burden on peripheral circuits and the power consumption. The optimal conductance response is investigated considering the linearity of the conductance response and the ratio of the maximum to minimum current. Reliability is verified by retention, endurance, and pass-bias disturbance measurements. Hardware-based neural networks with random synaptic feedback weights achieve an inference accuracy of 95.41%, comparable to the 95.58% obtained with transposed weights. Hardware-based neural network simulations demonstrate that the inference accuracy of the proposed on-chip learning scheme hardly decreases compared with that of off-chip learning, even as device variation increases.
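The learning rule the abstract describes is feedback alignment: the backward pass reuses a fixed random matrix in place of the transpose of the forward weights, so the backward array never needs to be reprogrammed. The following NumPy sketch illustrates the idea in software only; the layer sizes, sigmoid activation, learning rate, and random training data are illustrative assumptions and are not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Assumed layer sizes for illustration: 784-input, 128-hidden, 10-output.
n_in, n_hid, n_out = 784, 128, 10

# Forward weights (analogous to the forward synaptic array): updated every iteration.
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))

# Random feedback weights (analogous to the separate backward synaptic array): fixed.
B = rng.normal(0.0, 0.1, (n_hid, n_out))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.1
for step in range(1000):
    # Toy batch of random inputs and one-hot targets (placeholder data only).
    x = rng.normal(0.0, 1.0, (n_in, 32))
    t = np.eye(n_out)[:, rng.integers(0, n_out, 32)]

    # Forward propagation through the forward weights.
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)

    # Output-layer error (gradient of a mean-squared-error loss).
    dy = (y - t) * y * (1.0 - y)

    # Backward propagation: the hidden-layer error uses the fixed random
    # matrix B in place of the transposed forward weights W2.T.
    dh = (B @ dy) * h * (1.0 - h)

    # Updates are applied only to the forward weights; B never changes.
    W2 -= lr * (dy @ h.T) / x.shape[1]
    W1 -= lr * (dh @ x.T) / x.shape[1]

In a NAND flash implementation of this scheme, W1 and W2 would correspond to conductances stored in the forward string array and updated each iteration, while B would correspond to fixed conductances in the backward string array; keeping B fixed is what removes the weight-transpose and reprogramming burden from the peripheral circuits, as the abstract notes.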
Pages: 1019-1024
Number of pages: 6