On the Integration of Self-Attention and Convolution

Cited by: 169
|
Authors
Pan, Xuran [1 ]
Ge, Chunjiang [1 ]
Lu, Rui [1 ]
Song, Shiji [1 ]
Chen, Guanfu [2 ]
Huang, Zeyi [2 ]
Huang, Gao [1 ,3 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, BNRist, Beijing, Peoples R China
[2] Huawei Technol Ltd, Shenzhen, Peoples R China
[3] Beijing Acad Artificial Intelligence, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
DOI
10.1109/CVPR52688.2022.00089
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Convolution and self-attention are two powerful techniques for representation learning, and they are usually regarded as two peer approaches that are distinct from each other. In this paper, we show that there exists a strong underlying relation between them, in the sense that the bulk of the computation in these two paradigms is in fact performed by the same operation. Specifically, we first show that a traditional convolution with kernel size k x k can be decomposed into k^2 individual 1 x 1 convolutions, followed by shift and summation operations. Then, we interpret the projections of queries, keys, and values in the self-attention module as multiple 1 x 1 convolutions, followed by the computation of attention weights and the aggregation of values. Therefore, the first stage of both modules comprises the same operation. More importantly, the first stage accounts for the dominant computational complexity (quadratic in the channel size) compared to the second stage. This observation naturally leads to an elegant integration of these two seemingly distinct paradigms, i.e., a mixed model that enjoys the benefits of both self-Attention and Convolution (ACmix), while incurring minimal computational overhead compared to its pure convolution or self-attention counterparts. Extensive experiments show that our model achieves consistently improved results over competitive baselines on image recognition and downstream tasks. Code and pre-trained models will be released at https://github.com/LeapLabTHU/ACmix and https://gitee.com/mindspore/models.
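The decomposition described in the abstract can be made concrete with a short sketch. The following is a minimal illustration in PyTorch (not the authors' released implementation; the function name conv_as_shifted_1x1 and all variable names are hypothetical) of how a k x k convolution with "same" zero padding can be rewritten as k^2 individual 1 x 1 convolutions followed by shift and summation:

# A minimal sketch (not the authors' released code): a k x k convolution
# rewritten as k^2 individual 1 x 1 convolutions followed by shift and
# summation, as described in the abstract. All names here are hypothetical.
import torch
import torch.nn.functional as F

def conv_as_shifted_1x1(x, w):
    """Reproduce F.conv2d(x, w, padding=k // 2) via 1 x 1 convs + shift + sum.

    x: input of shape (N, C_in, H, W); w: weights of shape (C_out, C_in, k, k),
    with k odd so that "same" zero padding is well defined.
    """
    k = w.shape[-1]
    pad = k // 2
    n, _, h, wd = x.shape
    xp = F.pad(x, (pad, pad, pad, pad))  # zero-pad the input once up front
    out = torch.zeros(n, w.shape[0], h, wd)
    for i in range(k):
        for j in range(k):
            # Stage 1: a 1 x 1 convolution using the (i, j)-th kernel slice.
            kernel_1x1 = w[:, :, i, j].unsqueeze(-1).unsqueeze(-1)
            partial = F.conv2d(xp, kernel_1x1)  # (N, C_out, H + 2*pad, W + 2*pad)
            # Stage 2: shift (realized as a crop of the padded map) and sum.
            out = out + partial[:, :, i:i + h, j:j + wd]
    return out

# Sanity check: the decomposition matches the standard convolution.
x = torch.randn(2, 4, 8, 8)
w = torch.randn(6, 4, 3, 3)
assert torch.allclose(conv_as_shifted_1x1(x, w), F.conv2d(x, w, padding=1), atol=1e-5)

The sanity check confirms the two formulations agree up to floating-point error. The inner 1 x 1 convolutions are the same operation that computes the query, key, and value projections in self-attention, which is what lets ACmix share this expensive first stage between the two paths.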
Pages: 805-815
Page count: 11
Related Papers
50 records in total
  • [1] Global Self-Attention as a Replacement for Graph Convolution
    Hussain, Md Shamim
    Zaki, Mohammed J.
    Subramanian, Dharmashankar
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 655 - 665
  • [2] Integration of Multi-Head Self-Attention and Convolution for Person Re-Identification
    Zhou, Yalei
    Liu, Peng
    Cui, Yue
    Liu, Chunguang
    Duan, Wenli
    SENSORS, 2022, 22 (16)
  • [3] Stock Prediction Method Combining Graph Convolution and Convolution Self-Attention
    Tian, Hongli
    Cui, Yao
    Yan, Huiqiang
COMPUTER ENGINEERING AND APPLICATIONS, 2024, 60 (04) : 192 - 199
  • [4] CellCentroidFormer: Combining Self-attention and Convolution for Cell Detection
    Wagner, Royden
    Rohr, Karl
    MEDICAL IMAGE UNDERSTANDING AND ANALYSIS, MIUA 2022, 2022, 13413 : 212 - 222
  • [5] ISC-TRANSUNET: MEDICAL IMAGE SEGMENTATION NETWORK BASED ON THE INTEGRATION OF SELF-ATTENTION AND CONVOLUTION
    Li, Fang
    Pei, Siyu
    Zhang, Ziqun
    Yang, Fuming
    JOURNAL OF MECHANICS IN MEDICINE AND BIOLOGY, 2023, 23 (09)
  • [6] UniFormer: Unifying Convolution and Self-Attention for Visual Recognition
    Li, Kunchang
    Wang, Yali
    Zhang, Junhao
    Gao, Peng
    Song, Guanglu
    Liu, Yu
    Li, Hongsheng
    Qiao, Yu
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (10) : 12581 - 12600
  • [7] Integrated convolution and self-attention for improving peptide toxicity prediction
    Jiao, Shihu
    Ye, Xiucai
    Sakurai, Tetsuya
    Zou, Quan
    Liu, Ruijun
    BIOINFORMATICS, 2024, 40 (05)
  • [8] Lightweight Smoke Recognition Based on Deep Convolution and Self-Attention
    Zhao, Yang
    Wang, Yigang
    Jung, Hoi-Kyung
    Jin, Yongqiang
    Hua, Dan
    Xu, Sen
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [9] CMAT: Integrating Convolution Mixer and Self-Attention for Visual Tracking
    Wang, Jun
    Yin, Peng
    Wang, Yuanyun
    Yang, Wenhui
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 326 - 338
  • [10] The implicit mathematical reasoning model combining self-attention and convolution
    Yao, Zhuangkai
    Zeng, Bi
    Hu, Huiting
    Wei, Pengfei
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2023, 45 (01) : 975 - 988