AlphaMEX: A smarter global pooling method for convolutional neural networks

Cited by: 30
Authors
Zhang, Boxue [1 ]
Zhao, Qi [1 ]
Feng, Wenquan [1 ]
Lyu, Shuchang [1 ]
Affiliations
[1] Beihang Univ, Sch Elect & Informat Engn, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
CNN; Global Pooling; Feature-map sparsity; AlphaMEX; Network compression; OBJECT RECOGNITION; FIRE DETECTION; SURVEILLANCE;
DOI
10.1016/j.neucom.2018.07.079
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Deep convolutional neural networks have achieved great success in image classification, and feature extractors learned from CNNs are used in many computer vision tasks. The global pooling layer plays a very important role in deep convolutional neural networks. With the increasing use of Batch Normalization and ReLU layer combinations, the input feature maps of global pooling become sparse, which makes the original global pooling inefficient. In this paper, we propose AlphaMEX Global Pool, a novel end-to-end trainable global pooling operator for convolutional neural networks. A nonlinear smooth log-mean-exp function, called AlphaMEX, is designed to extract features effectively and make networks smarter. Compared to the original global pooling layer, the proposed method improves classification accuracy without adding layers or many redundant parameters. Experimental results on CIFAR-10, CIFAR-100, SVHN and ImageNet demonstrate the effectiveness of the proposed method. AlphaMEX-ResNet outperforms the original ResNet-110 by 8.3% on CIFAR10+, and the top-1 error rate of AlphaMEX-DenseNet (k = 12) reaches 5.03%, outperforming the original DenseNet (k = 12) by 4.0%. (c) 2018 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license.
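The record does not give AlphaMEX's exact parameterization, but the abstract describes it as a smooth log-mean-exp function interpolating between average and max pooling. A minimal numpy sketch of that generic family is below; the function name, the per-channel pooling layout, and the use of a single sharpness parameter `alpha` (small `alpha` approaches mean pooling, large `alpha` approaches max pooling) are illustrative assumptions, not the paper's exact trainable formulation.

```python
import numpy as np

def log_mean_exp_pool(x, alpha=1.0):
    """Smooth log-mean-exp global pooling over spatial dimensions.

    x: feature maps of shape (C, H, W).
    alpha: sharpness (illustrative; small -> mean pooling, large -> max pooling).
    Returns a vector of C pooled values.
    """
    c = x.shape[0]
    flat = x.reshape(c, -1)
    # Subtract the per-channel max before exponentiating for numerical stability:
    # (1/a) * log(mean(exp(a*x))) == m + (1/a) * log(mean(exp(a*(x - m)))).
    m = flat.max(axis=1, keepdims=True)
    pooled = m + np.log(np.mean(np.exp(alpha * (flat - m)), axis=1, keepdims=True)) / alpha
    return pooled.ravel()
```

For every `alpha > 0` the output lies between the channel mean and the channel max, which is the property that lets a trainable sharpness parameter pick an operating point between the two classical poolings.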
Pages: 36-48
Page count: 13
Related Papers
50 records
  • [1] A improved pooling method for convolutional neural networks
    Zhao, Lei
    Zhang, Zhonglin
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [2] A Hybrid Pooling Method for Convolutional Neural Networks
    Tong, Zhiqiang
    Aihara, Kazuyuki
    Tanaka, Gouhei
    NEURAL INFORMATION PROCESSING, ICONIP 2016, PT II, 2016, 9948 : 454 - 461
  • [4] Global Entropy Pooling layer for Convolutional Neural Networks
    Filus, Katarzyna
    Domanska, Joanna
    NEUROCOMPUTING, 2023, 555
  • [5] Universal pooling-A new pooling method for convolutional neural networks
    Hyun, Junhyuk
    Seong, Hongje
    Kim, Euntai
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 180
  • [6] A Pooling Method Developed for Use in Convolutional Neural Networks
    Akgul, Ismail
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2024, 141 (01): : 751 - 770
  • [7] Multiactivation Pooling Method in Convolutional Neural Networks for Image Recognition
    Zhao, Qi
    Lyu, Shuchang
    Zhang, Boxue
    Feng, Wenquan
    WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2018,
  • [8] Kernel Pooling for Convolutional Neural Networks
    Cui, Yin
    Zhou, Feng
    Wang, Jiang
    Liu, Xiao
    Lin, Yuanqing
    Belongie, Serge
    30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017, : 3049 - 3058
  • [9] Cascaded pooling for Convolutional Neural Networks
    Devi, Nilakshi
    Borah, Bhogeswar
    2018 FOURTEENTH INTERNATIONAL CONFERENCE ON INFORMATION PROCESSING (ICINPRO) - 2018, 2018, : 155 - 159
  • [10] Pooling in Graph Convolutional Neural Networks
    Cheung, Mark
    Shi, John
    Jiang, Lavender
    Wright, Oren
    Moura, Jose M. F.
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 462 - 466