In-Memory Computing for Machine Learning and Deep Learning

Cited by: 5
Authors
Lepri, N. [1 ,2 ]
Glukhov, A. [1 ,2 ]
Cattaneo, L. [1 ,2 ]
Farronato, M. [1 ,2 ]
Mannocci, P. [1 ,2 ]
Ielmini, D. [1 ,2 ]
Affiliations
[1] Politecn Milan, Dipartimento Elettron Informaz & Bioingn, I-20133 Milan, Italy
[2] IU NET, I-20133 Milan, Italy
Keywords
Random access memory; Nonvolatile memory; Magnetic tunneling; Transistors; FeFETs; Deep learning; Phase change materials; In-memory computing; deep learning; deep neural network; emerging memory technologies; matrix-vector multiplication; PHASE-CHANGE MEMORY; CROSSBAR ARRAY; NEURAL-NETWORKS; SRAM MACRO; LINE RESISTANCE; PART I; ACCELERATOR; TECHNOLOGY; DEVICES; STATES;
DOI
10.1109/JEDS.2023.3265875
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
In-memory computing (IMC) aims at executing numerical operations via physical processes, such as current summation and charge collection, thus accelerating common computing tasks including the matrix-vector multiplication. While extremely promising for memory-intensive processing such as machine learning and deep learning, the IMC design and realization must face significant challenges due to device and circuit nonidealities. This work provides an overview of the research trends and options for IMC-based implementations of deep learning accelerators with emerging memory technologies. The device technologies, the computing primitives, and the digital/analog/mixed design approaches are presented. Finally, the major device issues and metrics for IMC are discussed and benchmarked.
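As the abstract notes, IMC maps the matrix-vector multiplication onto physical current summation: each memory cell's conductance encodes a weight, Ohm's law produces a per-cell current, and Kirchhoff's law sums currents along each column. The sketch below (not from the paper; the noise model and all names are illustrative assumptions) shows this computation and a simple multiplicative conductance-variation nonideality of the kind the survey benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

def crossbar_mvm(G, v, sigma=0.0):
    """Sketch of an analog crossbar MVM.

    Ideal column current: I_j = sum_i G[i, j] * v[i]
    (Ohm's law per cell, Kirchhoff summation per column).
    `sigma` is an assumed multiplicative conductance-variation
    term standing in for device nonidealities.
    """
    G_actual = G * (1.0 + sigma * rng.standard_normal(G.shape))
    return v @ G_actual  # column currents, one per output

# Conductances in siemens (rows = inputs, columns = outputs)
G = np.array([[1e-6, 2e-6],
              [3e-6, 4e-6]])
v = np.array([0.1, 0.2])  # read voltages in volts

ideal = v @ G                               # digital reference result
analog = crossbar_mvm(G, v, sigma=0.05)    # noisy analog estimate
```

With `sigma=0`, the analog result matches the digital reference exactly; increasing `sigma` illustrates why accuracy-aware design, which the paper surveys, is needed.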
Pages: 587-601 (15 pages)
Related Papers
50 records
  • [41] Tree-based machine learning performed in-memory with memristive analog CAM
    Pedretti, Giacomo
    Graves, Catherine E.
    Serebryakov, Sergey
    Mao, Ruibin
    Sheng, Xia
    Foltin, Martin
    Li, Can
    Strachan, John Paul
    [J]. NATURE COMMUNICATIONS, 2021, 12 (01)
  • [43] Machine learning and deep learning
    Janiesch, Christian
    Zschech, Patrick
    Heinrich, Kai
    [J]. ELECTRONIC MARKETS, 2021, 31 (03) : 685 - 695
  • [45] DB4ML-An In-Memory Database Kernel with Machine Learning Support
    Jasny, Matthias
    Ziegler, Tobias
    Kraska, Tim
    Roehm, Uwe
    Binnig, Carsten
    [J]. SIGMOD'20: PROCEEDINGS OF THE 2020 ACM SIGMOD INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA, 2020, : 159 - 173
  • [46] Energy-efficient and reliable in-memory classifier for machine-learning applications
    Clay, James
    Elango, Naveena
    Priya, Sheena Ratnam
    Jiang, Shixiong
    Sridhar, Ramalingam
    [J]. IET COMPUTERS AND DIGITAL TECHNIQUES, 2019, 13 (06): : 443 - 452
  • [47] Raven: Belady-Guided, Predictive (Deep) Learning for In-Memory and Content Caching
    Hu, Xinyue
    Ramadan, Eman
    Ye, Wei
    Tian, Feng
    Zhang, Zhi-Li
    [J]. PROCEEDINGS OF THE 18TH INTERNATIONAL CONFERENCE ON EMERGING NETWORKING EXPERIMENTS AND TECHNOLOGIES, CONEXT 2022, 2022, : 72 - 90
  • [48] Modeling and simulating in-memory memristive deep learning systems: An overview of current efforts
    Lammie, Corey
    Xiang, Wei
    Azghadi, Mostafa Rahimi
    [J]. ARRAY, 2022, 13
  • [49] Implementation of Learning Analytics Framework for MOOCs using State-of-the-art In-Memory Computing
    Laveti, Ramesh Naidu
    Kuppili, Swetha
    Ch, Janaki
    Pal, Supriya N.
    Babu, N. Sarat Chandra
    [J]. 2017 5TH NATIONAL CONFERENCE ON E-LEARNING & E-LEARNING TECHNOLOGIES (ELELTECH), 2017,
  • [50] Cloud Computing Security: Machine and Deep Learning Models Analysis
    Mishra, Janmaya Kumar
    Janarthanan, Midhunchakkaravarthy
    [J]. MACROMOLECULAR SYMPOSIA, 2023, 407 (01)