A feature-wise attention module based on the difference with surrounding features for convolutional neural networks

Cited: 9
Authors
Tan, Shuo [1 ]
Zhang, Lei [1 ]
Shu, Xin [1 ]
Wang, Zizhou [1 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Machine Intelligence Lab, Chengdu 610065, Peoples R China
Keywords
feature-wise attention; surround suppression; image classification; convolutional neural networks
DOI
10.1007/s11704-022-2126-1
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Attention mechanisms have become a widely researched method for improving the performance of convolutional neural networks (CNNs). Most research focuses on designing channel-wise and spatial-wise attention modules but neglects the unique information carried by each individual feature, which is critical for deciding both "what" and "where" to focus. In this paper, a feature-wise attention module is proposed that assigns an attention weight to every feature of the input feature map. Specifically, the module is inspired by the well-known phenomenon of surround suppression in neuroscience and consists of two sub-modules: a Minus-Square-Add (MSA) operation and a group of learnable nonlinear mapping functions. The MSA imitates surround suppression and defines an energy function that can be applied to each feature to measure its importance. The group of nonlinear functions refines the energies calculated by the MSA into more reasonable values. Together, these two sub-modules capture feature-wise attention well. Moreover, owing to their simple structure and few parameters, the proposed module can easily be integrated into almost any CNN. To verify its performance and effectiveness, experiments were conducted on the CIFAR-10, CIFAR-100, CINIC-10, and Tiny-ImageNet datasets. The experimental results demonstrate that the proposed module is flexible and effective in improving the performance of CNNs.
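The abstract names the two sub-modules but gives no formulas. The following PyTorch sketch is a rough illustration of one plausible reading only, in which each feature's "surround" is approximated by the spatial mean of its channel and the learnable nonlinear mapping is a scaled, shifted sigmoid; the class name FeatureWiseAttention and every modeling choice below are assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class FeatureWiseAttention(nn.Module):
    """Hypothetical sketch of a feature-wise attention module.

    The exact MSA formula and the group of learnable nonlinear mapping
    functions are not specified in the abstract; this is one plausible
    reading, not the authors' implementation.
    """

    def __init__(self, eps: float = 1e-4):
        super().__init__()
        # Stand-ins for the "learnable nonlinear mapping functions"
        # (assumption: a single scaled, shifted sigmoid).
        self.gamma = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.zeros(1))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W). Approximate each feature's "surround" by the
        # spatial mean of its channel (assumption).
        mu = x.mean(dim=(2, 3), keepdim=True)
        # Minus, Square: each feature's energy is its squared difference
        # from the surround, echoing surround suppression.
        energy = (x - mu) ** 2
        # Add/normalize so energies are comparable across channels.
        energy = energy / (energy.mean(dim=(2, 3), keepdim=True) + self.eps)
        # Refine energies into per-feature attention weights in (0, 1).
        attn = torch.sigmoid(self.gamma * energy + self.beta)
        return x * attn

if __name__ == "__main__":
    # Drop-in usage: output shape matches input, so the module slots
    # between existing CNN layers without other changes.
    x = torch.randn(8, 64, 32, 32)
    print(FeatureWiseAttention()(x).shape)  # torch.Size([8, 64, 32, 32])
```

Because the module is parameter-light (two scalars in this sketch) and shape-preserving, it matches the abstract's claim that it can be inserted into almost any CNN.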
Pages: 10
Related Papers
50 records in total
  • [21] Adversarial Attack on Object Detection via Object Feature-Wise Attention and Perturbation Extraction
    Xue, Wei
    Xia, Xiaoyan
    Wan, Pengcheng
    Zhong, Ping
    Zheng, Xiao
TSINGHUA SCIENCE AND TECHNOLOGY, 2025, 30 (03): 1174 - 1189
  • [22] FAR-Net: Feature-Wise Attention-Based Relation Network for Multilabel Jujube Defect Classification
    Xu, Xiaohang
    Zheng, Hong
    You, Changhui
    Guo, Zhongyuan
    Wu, Xiongbin
    SENSORS, 2021, 21 (02) : 1 - 17
  • [23] Face Morphing Attack Detection and Localization Based on Feature-Wise Supervision
    Qin, Le
    Peng, Fei
    Long, Min
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2022, 17 : 3649 - 3662
  • [24] TAME: Attention Mechanism Based Feature Fusion for Generating Explanation Maps of Convolutional Neural Networks
    Ntrougkas, Mariano
    Gkalelis, Nikolaos
    Mezaris, Vasileios
    2022 IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM), 2022, : 58 - 65
  • [25] FDAM: full-dimension attention module for deep convolutional neural networks
    Cai, Silin
    Wang, Changping
    Ding, Jiajun
    Yu, Jun
    Fan, Jianping
    INTERNATIONAL JOURNAL OF MULTIMEDIA INFORMATION RETRIEVAL, 2022, 11 (04) : 599 - 610
  • [26] Semantic Face Segmentation Using Convolutional Neural Networks With a Supervised Attention Module
    Hizukuri, Akiyoshi
    Hirata, Yuto
    Nakayama, Ryohei
    IEEE ACCESS, 2023, 11 : 116892 - 116902
  • [28] SimAM: A Simple, Parameter-Free Attention Module for Convolutional Neural Networks
    Yang, Lingxiao
    Zhang, Ru-Yuan
    Li, Lida
    Xie, Xiaohua
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [29] HAM: Hybrid attention module in deep convolutional neural networks for image classification
    Li, Guoqiang
    Fang, Qi
    Zha, Linlin
    Gao, Xin
    Zheng, Nenggan
    PATTERN RECOGNITION, 2022, 129
  • [30] RETRACTION: Channel-Wise Correlation Calibrates Attention Module for Convolutional Neural Networks (Retraction of Vol 2022, art no 2000170, 2022)
    Lu, Z.
    Dong, Y.
    Li, J.
    Lu, Z.
    He, P.
    Ru, H.
    JOURNAL OF SENSORS, 2023, 2023