Central Attention Mechanism for Convolutional Neural Networks

Authors
Geng, Y.X. [1 ]
Wang, L. [2 ]
Wang, Z.Y. [3 ]
Wang, Y.G. [1 ]
Affiliations
[1] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan 114051, China
[2] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan 114051, China
[3] Automation Design Institute, Metallurgical Engineering Technology Co., Ltd., Dalian 116000, China
Keywords
Tensors
DOI: not available
Abstract
Channel attention has significantly enhanced model performance, but the average pooling step that channel attention uses to collect feature information into representative values introduces skewness, degrading the performance of the network architecture. Motivated by the central limit theorem, we hypothesize that strip-shaped average pooling, which preserves the spatial position information of the feature map, generates a one-dimensional tensor whose entries serve as representative feature values while mitigating this skewness. Building on this idea, this study introduces a novel attention mechanism, the Central Attention Mechanism (CAM). Instead of applying average pooling directly to produce channel representative values, CAM employs star-stripe average pooling to normalize multiple feature representative values into a single value: strip-shaped average pooling collects data along each spatial direction into one-dimensional tensors, and star-stripe average pooling fuses these direction-wise representative values. The resulting per-channel representative value is then activated to generate channel attention that reweights the input features. Our attention approach is flexible and can be seamlessly incorporated into various traditional network structures. Through rigorous testing, we demonstrate that our attention strategy applies to a wide range of computer vision tasks and outperforms previous attention techniques. © (2024), (International Association of Engineers). All rights reserved.
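The pipeline the abstract describes — strip-shaped pooling along each spatial direction, fusion of the directional representative values into one per-channel value, and activation to reweight the channels — can be sketched as follows. This is a minimal NumPy illustration under assumptions: the paper's exact "star-stripe" fusion rule and activation are not specified here, so a simple mean of the two directional representatives and a sigmoid are used as stand-ins, and the function name `central_attention` is hypothetical.

```python
import numpy as np

def central_attention(x):
    """Hedged sketch of the Central Attention Mechanism (CAM) pipeline.

    x: feature map of shape (C, H, W).
    Returns a channel-reweighted feature map of the same shape.
    """
    C, H, W = x.shape
    # Strip-shaped average pooling: collapse one spatial axis at a time,
    # yielding a one-dimensional tensor per channel for each direction.
    horiz = x.mean(axis=2)   # (C, H): average over W for each row strip
    vert = x.mean(axis=1)    # (C, W): average over H for each column strip
    # "Star-stripe" fusion (assumed form): normalize the directional
    # representative values into a single scalar per channel.
    rep = 0.5 * (horiz.mean(axis=1) + vert.mean(axis=1))   # (C,)
    # Activate the per-channel representative value to obtain attention
    # weights in (0, 1) (sigmoid assumed here).
    attn = 1.0 / (1.0 + np.exp(-rep))
    # Reweight the input features channel-wise.
    return x * attn[:, None, None]
```

Because the strip pooling averages over whole rows and columns rather than the full map at once, each representative value aggregates fewer, spatially coherent samples before fusion, which is the step the abstract ties to the central limit theorem.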
Pages: 1642-1648