A Pooling Method Developed for Use in Convolutional Neural Networks

Cited by: 0
|
Author
Akgul, Ismail [1 ]
Institution
[1] Erzincan Binali Yildirim Univ, Fac Engn & Architecture, Dept Comp Engn, TR-24002 Erzincan, Turkiye
Keywords
Pooling; convolutional neural networks; deep learning
DOI
10.32604/cmes.2024.052549
CLC Classification Number
T [Industrial Technology];
Subject Classification Code
08;
Abstract
In convolutional neural networks, pooling methods are applied after convolution to reduce both the size of the data and the number of parameters. These methods lower the computational cost of convolutional neural networks, making them more efficient. Maximum pooling, average pooling, and minimum pooling are the methods most commonly used; however, they are not suitable for every dataset. In this study, a new pooling method is proposed to increase the efficiency and accuracy of convolutional neural networks. This method, called MAM (Maximum Average Minimum) pooling, is more adaptive than the traditional maximum pooling, average pooling, and minimum pooling methods and reduces data loss by computing a more appropriate pixel value. The proposed MAM pooling method improves network performance by computing an optimal value during the training of convolutional neural networks. To measure the accuracy of the proposed MAM pooling method and compare it with the traditional pooling methods, the LeNet-5 model was trained on the CIFAR-10, CIFAR-100, and MNIST datasets. According to the results obtained, the proposed MAM pooling method outperformed the maximum pooling, average pooling, and minimum pooling methods at all pool sizes on all three datasets.
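The abstract does not give the exact MAM formula, so the sketch below only illustrates the three traditional pooling operations it compares against, plus one hypothetical way of blending them (averaging each window's maximum, mean, and minimum). The `pool2d` helper and the `"mam"` blending rule are assumptions for illustration, not the paper's method.

```python
import numpy as np

def pool2d(x, size, mode="max"):
    """Pool a 2D array with a square window and stride equal to the window size.

    mode: "max", "avg", "min", or "mam" (hypothetical blend of all three).
    """
    h, w = x.shape
    out = np.empty((h // size, w // size))
    for i in range(0, h - size + 1, size):
        for j in range(0, w - size + 1, size):
            win = x[i:i + size, j:j + size]  # current pooling window
            if mode == "max":
                out[i // size, j // size] = win.max()
            elif mode == "avg":
                out[i // size, j // size] = win.mean()
            elif mode == "min":
                out[i // size, j // size] = win.min()
            elif mode == "mam":
                # Illustrative only: average the max, mean, and min of the window.
                out[i // size, j // size] = (win.max() + win.mean() + win.min()) / 3
            else:
                raise ValueError(f"unknown mode: {mode}")
    return out
```

For example, on a 4x4 input with a 2x2 window, `mode="max"` keeps the largest value per window while the hypothetical `"mam"` blend falls between the max and min of each window, which is the kind of compromise between detail retention and smoothing the abstract attributes to MAM pooling.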
Pages: 751-770
Page count: 20
Related Papers
50 records in total
  • [31] Deep Convolutional Neural Networks for Pedestrian Detection with Skip Pooling
    Liu, Jie
    Gao, Xingkun
    Bao, Nianyuan
    Tang, Jie
    Wu, Gangshan
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 2056 - 2063
  • [32] Rank-based pooling for deep convolutional neural networks
    Shi, Zenglin
    Ye, Yangdong
    Wu, Yunpeng
    NEURAL NETWORKS, 2016, 83 : 21 - 31
  • [33] Implications of Pooling Strategies in Convolutional Neural Networks: A Deep Insight
    Sharma, Shallu
    Mehra, Rajesh
    FOUNDATIONS OF COMPUTING AND DECISION SCIENCES, 2019, 44 (03) : 303 - 330
  • [34] Mixed fuzzy pooling in convolutional neural networks for image classification
    Sharma, Teena
    Verma, Nishchal K.
    Masood, Shahrukh
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (06) : 8405 - 8421
  • [35] Weighted pooling for image recognition of deep convolutional neural networks
    Zhu, Xiaoning
    Meng, Qingyue
    Ding, Bojian
    Gu, Lize
    Yang, Yixian
    CLUSTER COMPUTING, 2019, 22 : 9371 - 9383
  • [36] Max-Pooling Dropout for Regularization of Convolutional Neural Networks
    Wu, Haibing
    Gu, Xiaodong
    NEURAL INFORMATION PROCESSING, PT I, 2015, 9489 : 46 - 54
  • [37] ADAPTIVE SALIENCE PRESERVING POOLING FOR DEEP CONVOLUTIONAL NEURAL NETWORKS
    Yu, Zhenyu
    Dai, Shiyu
    Xing, Yuxiang
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO WORKSHOPS (ICMEW), 2019, : 513 - 518
  • [39] TI-POOLING: transformation-invariant pooling for feature learning in Convolutional Neural Networks
    Laptev, Dmitry
    Savinov, Nikolay
    Buhmann, Joachim M.
    Pollefeys, Marc
    2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, : 289 - 297
  • [40] Crowd density estimation based on convolutional neural networks with mixed pooling
    Zhang, Li
    Zheng, Hong
    Zhang, Ying
    Zhang, Dongming
    JOURNAL OF ELECTRONIC IMAGING, 2017, 26 (05)