Hierarchical Approximate Memory for Deep Neural Network Applications

Cited by: 2
Authors
Ha, Minho [1 ]
Hwang, Seokha [2 ]
Kim, Jeonghun [1 ]
Lee, Youngjoo [1 ]
Lee, Sunggu [1 ]
Affiliations
[1] Pohang Univ Sci & Technol, Dept Elect Engn, Pohang 37673, South Korea
[2] Samsung Elect, Memory Business, Hwasung 18448, South Korea
Keywords
approximate computing; deep neural network; low power memory systems;
DOI
10.1109/IEEECONF51394.2020.9443540
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Power consumed by a computer memory system can be significantly reduced if a certain level of error is permitted in the data stored in memory. Such an approximate memory approach is viable for applications developed using deep neural networks (DNNs) because such applications are typically error-resilient. In this paper, the use of hierarchical approximate memory for DNNs is studied and modeled. Whereas previous research has focused on approximate memory for specific memory technologies, this work considers approximate memory across the entire memory hierarchy of a computer system, based on the error budget of a given target application. This paper proposes a system model that combines the error budget (the amount by which the memory error rate can be allowed to rise) of a target application with the power usage characteristics of the constituent memory technologies of a memory hierarchy. Using DNN case studies involving SRAM, DRAM, and NAND, this paper shows that overall memory power consumption can be reduced by up to 43.38% by using the proposed model to optimally divide the available error budget.
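The core idea of the abstract, dividing a system-level error budget among the levels of a memory hierarchy so that total power is minimized, can be sketched with a small grid search. The power model below (baseline powers, per-decade saving coefficients, and the assumption that per-level error rates add up to the system error rate) is purely illustrative and is not the model proposed in the paper:

```python
import math
from itertools import product

# Hypothetical per-technology power model: power drops as the permitted
# bit-error rate (BER) rises. Baseline powers (mW) and sensitivity
# coefficients are illustrative placeholders, not measured values.
LEVELS = {
    "SRAM": {"base_mw": 100.0, "saving_per_decade": 0.10},
    "DRAM": {"base_mw": 300.0, "saving_per_decade": 0.15},
    "NAND": {"base_mw": 50.0,  "saving_per_decade": 0.05},
}

def level_power(base_mw, saving_per_decade, ber, ber_min=1e-12):
    """Power of one level after relaxing it to error rate `ber` (>= ber_min)."""
    decades = max(0.0, math.log10(ber / ber_min))
    saving = min(0.9, saving_per_decade * decades)  # cap savings at 90%
    return base_mw * (1.0 - saving)

def best_split(total_ber_budget, steps=20):
    """Grid-search the BER-budget split that minimizes total memory power.

    Assumes (for illustration) that the per-level error rates simply
    add up to the system-level error rate.
    """
    best_power, best_alloc = float("inf"), None
    fracs = [i / steps for i in range(steps + 1)]
    for f_sram, f_dram in product(fracs, fracs):
        f_nand = 1.0 - f_sram - f_dram
        if f_nand < -1e-9:          # infeasible split
            continue
        f_nand = max(f_nand, 0.0)
        alloc = {"SRAM": f_sram, "DRAM": f_dram, "NAND": f_nand}
        power = sum(
            level_power(p["base_mw"], p["saving_per_decade"],
                        max(alloc[name] * total_ber_budget, 1e-12))
            for name, p in LEVELS.items()
        )
        if power < best_power:
            best_power, best_alloc = power, alloc
    return best_power, best_alloc
```

Because power savings grow only logarithmically with the permitted error rate in this toy model, spreading the budget across several levels beats concentrating it in one, which mirrors the paper's motivation for optimizing the whole hierarchy rather than a single technology.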
Pages: 261-266
Page count: 6