Saving Memory Space in Deep Neural Networks by Recomputing: A Survey

Cited by: 0
Authors
Ulidowski, Irek [1 ,2 ]
Affiliations
[1] Univ Leicester, Sch Comp & Math Sci, Leicester, Leics, England
[2] AGH Univ Sci & Technol, Dept Appl Informat, Krakow, Poland
Source
Keywords
Deep Neural Networks; recomputing activations;
DOI
10.1007/978-3-031-38100-3_7
Chinese Library Classification
TP31 [Computer Software]
Subject Classification
081202 ; 0835 ;
Abstract
Training a multilayered neural network involves running the network on the training data, calculating the error between the predicted and actual output, and then performing backpropagation to update the network's weights so as to minimise the overall error. This process is repeated many times, with the network updating its weights until it produces the desired output with a satisfactory level of accuracy. Each training run requires storing activation and gradient data in memory for every layer of the network. This paper surveys the main approaches to recomputing the needed activation and gradient data instead of storing it in memory, and discusses how these approaches relate to reversible computation techniques.
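The core idea the abstract describes, storing only some activations and regenerating the rest during backpropagation, can be sketched in plain Python. The following is an illustrative toy with hand-written scalar layers and derivatives (an assumption for clarity; it is not an implementation from the surveyed paper): only every k-th activation is kept as a checkpoint, and the others are recomputed segment by segment during the backward pass.

```python
# Toy sketch of checkpointed recomputation for a chain y = f_n(...f_1(x)).
# Instead of storing every intermediate activation, we keep one input in
# every `k` layers and recompute the rest when backpropagating.

def forward_checkpointed(x, layers, k):
    """Run the chain, retaining only every k-th layer input (a checkpoint)."""
    checkpoints = []                     # (layer index, stored input) pairs
    for i, (f, _) in enumerate(layers):
        if i % k == 0:
            checkpoints.append((i, x))
        x = f(x)
    return x, checkpoints

def backward_checkpointed(grad_out, layers, checkpoints, k):
    """Backpropagate, recomputing each segment's activations on demand."""
    grad = grad_out
    for start, x0 in reversed(checkpoints):
        # Recompute the activations inside this segment from its checkpoint.
        seg = layers[start:start + k]
        acts = [x0]
        for f, _ in seg[:-1]:
            acts.append(f(acts[-1]))
        # Chain rule backwards through the segment.
        for (f, df), a in zip(reversed(seg), reversed(acts)):
            grad *= df(a)
    return grad

# Toy chain: alternate squaring and doubling, with their derivatives.
layers = [(lambda x: x * x, lambda x: 2 * x),
          (lambda x: 2 * x, lambda x: 2.0)] * 2

y, cps = forward_checkpointed(3.0, layers, k=2)        # y = 8*x^4 = 648
dy_dx = backward_checkpointed(1.0, layers, cps, k=2)   # dy/dx = 32*x^3 = 864
```

With k checkpoints over n layers, peak activation storage drops from O(n) to roughly O(n/k + k) at the cost of one extra forward pass per segment, which is the time/memory trade-off the surveyed approaches explore.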
Pages: 89-105 (17 pages)