Central Attention Mechanism for Convolutional Neural Networks

Cited by: 0
Authors
Geng, Y.X. [1 ]
Wang, L. [2 ]
Wang, Z.Y. [3 ]
Wang, Y.G. [1 ]
Affiliations
[1] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan, 114051, China
[2] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan, 114051, China
[3] Automation Design Institute, Metallurgical Engineering Technology Co., Ltd., Dalian, 116000, China
Keywords
Tensors
DOI
Not available
Abstract
Channel attention has significantly improved model performance. In conventional channel attention, average pooling is used to aggregate feature information into channel representative values, but this pooling step introduces skewness that degrades the performance of the network architecture. Drawing on the central limit theorem, we hypothesize that strip-shaped average pooling, which takes the spatial position information of the feature map into account, can produce a one-dimensional tensor whose pooled values serve as feature representatives while mitigating this skewness. By incorporating the idea of the central limit theorem into the channel attention pipeline, this study introduces a novel attention mechanism, the Central Attention Mechanism (CAM). Instead of generating channel representative values directly by average pooling, central attention employs star-stripe average pooling to normalize multiple feature representative values into a single representative value: strip-shaped average pooling collects feature information into one-dimensional tensors, while star-stripe average pooling derives feature representative values along different spatial directions. The representative value of each channel is then activated to generate channel attention for the complementary input features. The proposed mechanism is flexible and can be seamlessly incorporated into a variety of classical network structures. Extensive experiments demonstrate the effectiveness of the attention strategy, which applies to a wide range of computer vision tasks and outperforms previous attention techniques. © 2024, International Association of Engineers. All rights reserved.
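The abstract does not include an implementation, so the following PyTorch sketch is only an illustrative reading of the described pipeline. It assumes that "strip-shaped average pooling" means per-channel pooling along the horizontal and vertical spatial axes, and that "star-stripe average pooling" normalizes the resulting directional representative values into a single per-channel value before a sigmoid gating step. The class name CentralAttention, the reduction parameter, and the learnable direction weights are hypothetical and are not taken from the paper.

import torch
import torch.nn as nn


class CentralAttention(nn.Module):
    """Hypothetical sketch of a CAM-style channel attention block."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        hidden = max(channels // reduction, 8)
        # Learnable weights for merging the directional representative values
        # (an assumption; the paper's star-stripe rule may differ).
        self.dir_weight = nn.Parameter(torch.ones(2) / 2)
        # Bottleneck that maps the per-channel representative value to a
        # channel attention weight in (0, 1).
        self.excite = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Strip-shaped average pooling: collapse one spatial axis at a time,
        # yielding one-dimensional tensors of shape (N, C, H, 1) and (N, C, 1, W).
        h_strip = x.mean(dim=3, keepdim=True)  # average over width
        w_strip = x.mean(dim=2, keepdim=True)  # average over height
        # Directional representative values, reduced to (N, C, 1, 1).
        h_rep = h_strip.mean(dim=2, keepdim=True)
        w_rep = w_strip.mean(dim=3, keepdim=True)
        # "Star-stripe" merge (assumed): weighted normalization of the
        # directional values into a single representative value per channel.
        w = torch.softmax(self.dir_weight, dim=0)
        rep = w[0] * h_rep + w[1] * w_rep
        # Activate the representative value and rescale the input channels.
        return x * self.excite(rep)


if __name__ == "__main__":
    feat = torch.randn(2, 64, 32, 32)
    out = CentralAttention(64)(feat)
    print(out.shape)  # torch.Size([2, 64, 32, 32])

As with SE-style channel attention, a module of this kind would typically be inserted after a convolutional block and rescales that block's output feature map channel-wise, which matches the abstract's claim that the mechanism can be dropped into various traditional network structures.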
Pages: 1642-1648
Related Papers (50 in total)
  • [31] Detecting phishing websites through improving convolutional neural networks with Self-Attention mechanism
    Said, Yahia
    Alsheikhy, Ahmed A.
    Lahza, Husam
    Shawly, Tawfeeq
    AIN SHAMS ENGINEERING JOURNAL, 2024, 15 (04)
  • [32] Protein-Protein Interaction Site Prediction Based on Attention Mechanism and Convolutional Neural Networks
    Li, Yuguang
    Lu, Shuai
    Ma, Qiang
    Nan, Xiaofei
    Zhang, Shoutao
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2023, 20 (06) : 3820 - 3829
  • [33] Enhancing substance identification by Raman spectroscopy using deep neural convolutional networks with an attention mechanism
    Xie, Yuhao
    Wang, Zilong
    Chen, Qiang
    Tang, Heshan
    Huang, Jie
    Liang, Pei
    ANALYTICAL METHODS, 2024, 16 (34) : 5793 - 5801
  • [34] Attention Visualization of Gated Convolutional Neural Networks with Self Attention in Sentiment Analysis
    Yanagimto, Hidekazu
    Hashimoto, Kiyota
    Okada, Makoto
    2018 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND DATA ENGINEERING (ICMLDE 2018), 2018, : 77 - 82
  • [35] Convolutional Neural Network and Attention Mechanism for Bone Age Prediction
    Mahayossanunt, Yanisa
    Thannamitsomboon, Titichaya
    Keatmanee, Chadaporn
    2019 IEEE ASIA PACIFIC CONFERENCE ON CIRCUITS AND SYSTEMS (APCCAS 2019), 2019, : 249 - 252
  • [36] Speech Separation Using Convolutional Neural Network and Attention Mechanism
    Yuan, Chun-Miao
    Sun, Xue-Mei
    Zhao, Hu
    DISCRETE DYNAMICS IN NATURE AND SOCIETY, 2020, 2020
  • [37] Crop Pest Recognition in Real Agricultural Environment Using Convolutional Neural Networks by a Parallel Attention Mechanism
    Zhao, Shengyi
    Liu, Jizhan
    Bai, Zongchun
    Hu, Chunhua
    Jin, Yujie
    FRONTIERS IN PLANT SCIENCE, 2022, 13
  • [38] Real world image tampering localization combining the self-attention mechanism and convolutional neural networks
    Zhong H.
    Bian S.
    Wang C.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51 (01): : 135 - 146
  • [39] Enhancing robotic grasping with attention mechanism and advanced UNet architectures in generative grasping convolutional neural networks
    Rasheed, Mayada Abdalsalam
    Jasim, Wesam M.
    Farhan, Rabahnori
    ALEXANDRIA ENGINEERING JOURNAL, 2024, 102 : 149 - 158
  • [40] Temporal Convolutional Attention Neural Networks for Time Series Forecasting
    Lin, Yang
    Koprinska, Irena
    Rana, Mashud
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,