CMASR: Lightweight image super-resolution with cluster and match attention

Citations: 0
Authors
Huang, Detian [1 ,2 ]
Lin, Mingxin [1 ]
Liu, Hang [1 ]
Zeng, Huanqiang [1 ,2 ]
Affiliations
[1] Huaqiao Univ, Coll Engn, Quanzhou 362021, Fujian, Peoples R China
[2] Quanzhou Digital Inst, Quanzhou 362021, Fujian, Peoples R China
Funding
National Natural Science Foundation of China; National Key R&D Program of China
Keywords
Image super-resolution; Transformer; Token clustering; Axial self-attention
DOI
10.1016/j.imavis.2025.105457
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The Transformer has recently achieved impressive success in image super-resolution due to its ability to model long-range dependencies with multi-head self-attention (MHSA). However, most existing MHSA variants attend only to dependencies among individual tokens and ignore those among token clusters containing several tokens, which prevents the Transformer from adequately exploring global features. Moreover, the Transformer neglects local features, which inevitably hinders accurate detail reconstruction. To address these issues, we propose a lightweight image super-resolution method with cluster and match attention (CMASR). Specifically, a token Clustering block is designed to divide input tokens into token clusters of different sizes with depthwise separable convolution. Subsequently, we propose an efficient axial matching self-attention (AMSA) mechanism, which introduces an axial matrix to extract local features, including axial similarities and symmetries. Further, by combining AMSA and Window Self-Attention, we construct a Hybrid Self-Attention block that captures the dependencies among token clusters of different sizes, sufficiently extracting both axial local features and global features. Extensive experiments demonstrate that the proposed CMASR outperforms state-of-the-art methods at a lower computational cost (i.e., fewer parameters and FLOPs).
Pages: 10
Related Papers (50 in total)
  • [41] Lightweight network with masks for light field image super-resolution based on swin attention
    Wang, Xingzheng
    Wu, Shaoyong
    Li, Jiahui
    Wu, Jianbin
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (33) : 79785 - 79804
  • [42] Lightweight multi-scale aggregated residual attention networks for image super-resolution
    Pang, Shurong
    Chen, Zhe
    Yin, Fuliang
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 : 4797 - 4819
  • [43] Balanced Spatial Feature Distillation and Pyramid Attention Network for Lightweight Image Super-resolution
    Gendy, Garas
    Sabor, Nabil
    Hou, Jingchao
    He, Guanghui
    NEUROCOMPUTING, 2022, 509 : 157 - 166
  • [44] Lightweight Image Super-Resolution Based on Shuffle Group Convolution and Sparse Global Attention
    Li, Xiang
    Zhang, Juan
    LASER & OPTOELECTRONICS PROGRESS, 2024, 61 (04)
  • [45] Densely Connected Transformer With Linear Self-Attention for Lightweight Image Super-Resolution
    Zeng, Kun
    Lin, Hanjiang
    Yan, Zhiqiang
    Fang, Jinsheng
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [46] LCRCA: image super-resolution using lightweight concatenated residual channel attention networks
    Peng, Changmeng
    Shu, Pei
    Huang, Xiaoyang
    Fu, Zhizhong
    Li, Xiaofeng
    APPLIED INTELLIGENCE, 2022, 52 (09) : 10045 - 10059
  • [47] Lightweight Single Image Super-Resolution With Multi-Scale Spatial Attention Networks
    Soh, Jae Woong
    Cho, Nam Ik
    IEEE ACCESS, 2020, 8 : 35383 - 35391
  • [48] Enhanced local multi-windows attention network for lightweight image super-resolution
    Lv, Yanheng
    Pan, Lulu
    Xu, Ke
    Li, Guo
    Zhang, Wenbo
    Li, Lingxiao
    Lei, Le
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2025, 250
  • [49] Lightweight Image Super-Resolution Network Based on Regional Complementary Attention and Multi-dimensional Attention
    Zhou, D.
    Wang, W.
    Ma, Y.
    Gao, D.
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2022, 35 (07) : 625 - 636
  • [50] Adaptive Attention Network for Image Super-resolution
    Chen, Y.-M.
    Zhou, D.-W.
    Zidonghua Xuebao/Acta Automatica Sinica, 2022, 48 (08) : 1950 - 1960