An Enhanced Floating Gate Memory for the Online Training of Analog Neural Networks

Cited by: 1
Authors
Gan, Lurong [1 ]
Wang, Chen [1 ]
Chen, Lin [1 ]
Zhu, Hao [1 ]
Sun, Qingqing [1 ]
Zhang, David Wei [1 ]
Affiliations
[1] Fudan Univ, Sch Microelect, State Key Lab ASIC & Syst, Shanghai 200433, Peoples R China
Keywords
Neural network; FG memory; U-shaped channel; erasing speed; online training; SYNAPSES;
DOI
10.1109/JEDS.2020.2964820
Chinese Library Classification (CLC): TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Discipline codes: 0808; 0809
Abstract
Floating gate (FG) memory suffers from a long erasing time, which limits its application as an electronic synapse in online training. This paper proposes a novel enhanced floating gate memory (EFM) studied by TCAD simulation; three other structures are simulated for comparison. The simulation results show that the erasing time of the EFM is about 34 ns, while the other three structures require more than 1.8 ms, which makes the operation speed of long-term potentiation (LTP) more symmetrical with that of long-term depression (LTD). In addition, both LTP and LTD are approximately linear in the simulation results. The speed, linearity, and symmetry of the weight update are key to the online training of analog neural networks. These results indicate a potential application of the EFM in analog neuro-inspired computing.
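The abstract's emphasis on linear and symmetric weight updates can be illustrated with a minimal, hypothetical pulse-by-pulse conductance-update model. This sketch is not from the paper: the function names, the `delta` step size, and the exponential `nonlinearity` factor are all illustrative assumptions, chosen only to show why a linear, symmetric device (nonlinearity = 0, equal LTP/LTD steps) lets a training loop return a weight exactly to where it started, while a nonlinear or asymmetric device accumulates error.

```python
import math

def ltp_step(g, g_max=1.0, delta=0.02, nonlinearity=0.0):
    """One potentiation pulse: raise conductance g toward g_max.
    nonlinearity = 0 gives the ideal linear update; larger values
    make the step shrink as g approaches g_max (a common device model,
    assumed here for illustration, not taken from the paper)."""
    step = delta * math.exp(-nonlinearity * g / g_max)
    return min(g + step, g_max)

def ltd_step(g, g_max=1.0, delta=0.02, nonlinearity=0.0):
    """One depression pulse: lower conductance g toward 0,
    with the mirror-image nonlinearity near g = 0."""
    step = delta * math.exp(-nonlinearity * (g_max - g) / g_max)
    return max(g - step, 0.0)

# Linear, symmetric device: 25 LTP pulses followed by 25 LTD pulses
# bring the weight back to its starting value.
g = 0.5
for _ in range(25):
    g = ltp_step(g)
for _ in range(25):
    g = ltd_step(g)
print(round(g, 6))  # close to the initial 0.5 in the linear case
```

With a nonzero `nonlinearity`, the same pulse sequence does not return to the starting weight, which is the update-error mechanism that linear, symmetric synapses such as the proposed EFM are meant to avoid.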
Pages: 84-91 (8 pages)