Channel-Wise Activation Map Pruning using MaxPool for Reducing Memory Accesses

Cited by: 2
Authors
Cho, Han [1 ]
Park, Jongsun [1 ]
Affiliation
[1] Korea Univ, Sch Elect Engn, Seoul, South Korea
Funding
National Research Foundation, Singapore;
Keywords
Convolutional neural network; max-pool; activation compression; activation pruning; L2-norm;
DOI
10.1109/ISOCC56007.2022.10031452
Chinese Library Classification (CLC)
TP3 [Computing technology; computer technology];
Subject classification code
0812;
Abstract
While neural network pruning can reduce the amount of data transferred, fine-grained pruning techniques are difficult to implement efficiently because of their high indexing overhead. To obtain a hardware-friendly pruning technique, structured pruning, which removes groups of data together and thereby minimizes the indexing overhead, is highly desirable. In this paper, to reduce the overall number of main memory accesses when implementing convolutional neural network (CNN) accelerators, we propose a hardware-friendly, L2-norm based, structured channel-wise activation pruning that uses max-pooling. Max-pooling is typically deployed in CNNs to reduce the height and width of activation maps. Simulation results show that the activation maps of ResNet20 and ResNet56 can be reduced to 48% and 51% of their original size, respectively, with less than 1% accuracy degradation on CIFAR-10.
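The abstract does not spell out the pruning criterion in detail, so the following NumPy sketch is only an illustrative reading of it: it assumes channel importance is scored by the L2-norm of each channel's 2x2 max-pooled activation map and that a fixed fraction of the lowest-scoring channels is zeroed out before the activations would be written back to main memory. The function name channelwise_l2_prune and the keep_ratio parameter are hypothetical and not taken from the paper.

```python
import numpy as np

def channelwise_l2_prune(activations: np.ndarray, keep_ratio: float = 0.5):
    """Illustrative channel-wise activation pruning.

    activations: (C, H, W) feature map of one layer (H, W assumed even).
    Scores each channel by the L2-norm of its 2x2 max-pooled map and
    zeroes out the channels with the smallest scores.
    Returns the pruned map and the boolean keep-mask.
    """
    C, H, W = activations.shape
    # 2x2 max-pooling with stride 2.
    pooled = activations.reshape(C, H // 2, 2, W // 2, 2).max(axis=(2, 4))
    # Per-channel L2 norm of the pooled map as the importance score.
    scores = np.sqrt((pooled ** 2).sum(axis=(1, 2)))
    # Keep the top keep_ratio fraction of channels.
    n_keep = max(1, int(round(keep_ratio * C)))
    keep = np.zeros(C, dtype=bool)
    keep[np.argsort(scores)[-n_keep:]] = True
    pruned = activations * keep[:, None, None]
    return pruned, keep

if __name__ == "__main__":
    # Example: prune half of the 64 channels of a random 32x32 activation map.
    x = np.random.rand(64, 32, 32).astype(np.float32)
    y, mask = channelwise_l2_prune(x, keep_ratio=0.5)
    print("channels kept:", mask.sum(), "of", mask.size)
```

Because whole channels are removed together, only a compact per-channel bitmask is needed as indexing metadata, which is the hardware-friendliness argument the abstract makes for structured over fine-grained pruning.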
Pages: 71-72
Number of pages: 2