A Pooling Method Developed for Use in Convolutional Neural Networks

Cited: 0
Author
Akgul, Ismail [1]
Affiliation
[1] Erzincan Binali Yildirim Univ, Fac Engn & Architecture, Dept Comp Engn, TR-24002 Erzincan, Turkiye
Source
CMES-Computer Modeling in Engineering & Sciences
Keywords
Pooling; convolutional neural networks; deep learning;
DOI
10.32604/cmes.2024.052549
Chinese Library Classification
T [Industrial Technology]
Discipline Code
08
Abstract
In convolutional neural networks, pooling methods are used to reduce both the size of the data and the number of parameters after the convolution layers of a model. By reducing the computational load, these methods make the network more efficient. Maximum pooling, average pooling, and minimum pooling are the methods most commonly used, but they are not suitable for every dataset encountered in neural network applications. In this study, a new pooling approach is proposed to increase the efficiency and success rates of convolutional neural networks. The method, called MAM (Maximum Average Minimum) pooling, is more adaptive than the traditional maximum, average, and minimum pooling methods and reduces data loss by computing a more appropriate pixel value. The proposed MAM pooling method improves network performance by calculating the optimal value during training. To evaluate its accuracy and compare it with the traditional pooling methods, training was carried out on the LeNet-5 model using the CIFAR-10, CIFAR-100, and MNIST datasets. According to the results, the proposed MAM pooling method outperformed maximum, average, and minimum pooling at all pool sizes on all three datasets.
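The abstract describes MAM pooling only at a high level, so the sketch below is illustrative rather than the paper's exact algorithm: it implements the three traditional pooling operations over non-overlapping windows and, purely as an assumption, combines their per-window maximum, average, and minimum into a single value. The function names and the combination rule are hypothetical and are not taken from the paper.

```python
# Minimal sketch of max/average/min pooling and an assumed MAM-style combination.
# The mam_pool rule (mean of the three window statistics) is an illustrative
# assumption; the paper's actual MAM computation may differ.
import numpy as np


def _pool2d(x, pool_size, reduce_fn):
    """Apply a reduction over non-overlapping pool_size x pool_size windows.

    x is a 2D feature map (H, W); H and W are assumed divisible by pool_size.
    """
    h, w = x.shape
    windows = x.reshape(h // pool_size, pool_size, w // pool_size, pool_size)
    return reduce_fn(windows, axis=(1, 3))


def max_pool(x, pool_size=2):
    return _pool2d(x, pool_size, np.max)


def avg_pool(x, pool_size=2):
    return _pool2d(x, pool_size, np.mean)


def min_pool(x, pool_size=2):
    return _pool2d(x, pool_size, np.min)


def mam_pool(x, pool_size=2):
    """Hypothetical MAM combination: average of the per-window max, mean, and min."""
    return (max_pool(x, pool_size)
            + avg_pool(x, pool_size)
            + min_pool(x, pool_size)) / 3.0


if __name__ == "__main__":
    feature_map = np.arange(16, dtype=np.float32).reshape(4, 4)
    print("max :\n", max_pool(feature_map))
    print("avg :\n", avg_pool(feature_map))
    print("min :\n", min_pool(feature_map))
    print("MAM :\n", mam_pool(feature_map))
```

Running the script on a 4x4 feature map with a 2x2 pool prints the four pooled outputs side by side, which makes it easy to see how the assumed MAM value falls between the maximum and minimum statistics of each window.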
Pages: 751-770
Page count: 20
Related Papers
50 records in total
• [42] Passalis, Nikolaos; Tefas, Anastasios. Learning Bag-of-Features Pooling for Deep Convolutional Neural Networks. 2017 IEEE International Conference on Computer Vision (ICCV), 2017: 5766-5774.
• [43] Zeng, Shuifei; Ma, Yan; Zhang, Xiaoyan; Du, Xiaofeng. Term-Based Pooling in Convolutional Neural Networks for Text Classification. China Communications, 2020, 17(04): 109-124.
• [44] Zhu, Rui; Mao, Xiao-Jiao; Zhu, Qi-Hai; Li, Ning; Yang, Yu-Bin. Text Detection Based on Convolutional Neural Networks with Spatial Pyramid Pooling. 2016 IEEE International Conference on Image Processing (ICIP), 2016: 1032-1036.
• [45] Tong, Zhiqiang; Tanaka, Gouhei. Hybrid pooling for enhancement of generalization ability in deep convolutional neural networks. Neurocomputing, 2019, 333: 76-85.
• [46] Lyra, Lucas O.; Fabris, Antonio E.; Florindo, Joao B. A multilevel pooling scheme in convolutional neural networks for texture image recognition. Applied Soft Computing, 2024, 152.
• [47] Smith, James S.; Wilamowski, Bogdan M. Discrete Cosine Transform Spectral Pooling Layers for Convolutional Neural Networks. Artificial Intelligence and Soft Computing (ICAISC 2018), Part I, 2018, 10841: 235-246.
• [48] Lee, Chen-Yu; Gallagher, Patrick W.; Tu, Zhuowen. Generalizing Pooling Functions in Convolutional Neural Networks: Mixed, Gated, and Tree. Artificial Intelligence and Statistics, 2016, 51: 464-472.
• [49] Masci, Jonathan; Meier, Ueli; Ciresan, Dan; Schmidhuber, Juergen; Fricout, Gabriel. Steel Defect Classification with Max-Pooling Convolutional Neural Networks. 2012 International Joint Conference on Neural Networks (IJCNN), 2012.
• [50] Sun, Manli; Song, Zhanjie; Jiang, Xiaoheng; Pan, Jing; Pang, Yanwei. Learning Pooling for Convolutional Neural Network. Neurocomputing, 2017, 224: 96-104.