Multi-scale strip-shaped convolution attention network for lightweight image super-resolution

Cited: 0
Authors
Xu, Ke [1 ]
Pan, Lulu [1 ]
Peng, Guohua [1 ]
Zhang, Wenbo [1 ]
Lv, Yanheng [1 ]
Li, Guo [1 ]
Li, Lingxiao [1 ]
Lei, Le [1 ]
Affiliations
[1] Northwestern Polytechnical University, School of Mathematics and Statistics, Xi'an 710129, People's Republic of China
Funding
National Natural Science Foundation of China;
Keywords
Image super-resolution; Strip convolution; Attention mechanism; Lightweight; Convolutional neural network;
DOI
10.1016/j.image.2024.117166
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Lightweight convolutional neural networks for Single Image Super-Resolution (SISR) have exhibited remarkable performance improvements in recent years. These models achieve excellent performance by relying on attention mechanisms that incorporate square-shaped convolutions to enhance feature representation. However, these approaches still suffer from the redundancy introduced by square-shaped convolutional kernels and overlook multi-scale information. In this paper, we propose a novel attention mechanism called Multi-scale Strip-shaped convolution Attention (MSA), which applies three sets of differently sized depth-wise separable strip convolution kernels in parallel to replace redundant square-shaped convolution attention and extract multi-scale features. We also generalize MSA to other lightweight neural network models, and experimental results show that MSA outperforms other convolution-based attention mechanisms. Building upon MSA, we propose an Efficient Feature Extraction Block (EFEB), a lightweight block for SISR. Finally, based on EFEB, we propose a lightweight image super-resolution neural network named Multi-scale Strip-shaped convolution Attention Network (MSAN). Experiments demonstrate that MSAN outperforms existing state-of-the-art lightweight SR methods with fewer parameters and lower computational complexity.
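The record itself contains no implementation details, but the mechanism the abstract describes, namely parallel depth-wise strip convolutions of several sizes whose fused response re-weights the input features, can be sketched in a few lines of PyTorch. The sketch below is illustrative only: the kernel sizes (7, 11, 21), the 1x1 fusion convolution, and the element-wise gating are assumptions, not the paper's actual configuration.

# Minimal PyTorch sketch of the Multi-scale Strip-shaped convolution Attention (MSA)
# idea from the abstract: three parallel branches of depth-wise strip convolutions
# replace a square-kernel attention and capture multi-scale context.
# Kernel sizes (7, 11, 21) and the 1x1 fusion/gating are illustrative assumptions.
import torch
import torch.nn as nn


class StripBranch(nn.Module):
    """One branch: a 1xk followed by a kx1 depth-wise strip convolution."""
    def __init__(self, channels: int, k: int):
        super().__init__()
        self.conv_h = nn.Conv2d(channels, channels, (1, k), padding=(0, k // 2), groups=channels)
        self.conv_v = nn.Conv2d(channels, channels, (k, 1), padding=(k // 2, 0), groups=channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv_v(self.conv_h(x))


class MSA(nn.Module):
    """Multi-scale strip-shaped convolution attention (illustrative sketch)."""
    def __init__(self, channels: int, kernel_sizes=(7, 11, 21)):
        super().__init__()
        self.branches = nn.ModuleList(StripBranch(channels, k) for k in kernel_sizes)
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1)  # point-wise fusion

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sum the multi-scale strip responses, fuse them with a 1x1 convolution,
        # and use the result as an attention map that re-weights the input features.
        attn = sum(branch(x) for branch in self.branches)
        attn = self.fuse(attn)
        return attn * x


if __name__ == "__main__":
    feats = torch.randn(1, 48, 64, 64)   # (batch, channels, height, width)
    print(MSA(48)(feats).shape)          # torch.Size([1, 48, 64, 64])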
Pages: 15
Related Papers
50 records in total
  • [1] A lightweight multi-scale channel attention network for image super-resolution
    Li, Wenbin
    Li, Juefei
    Li, Jinxin
    Huang, Zhiyong
    Zhou, Dengwen
    NEUROCOMPUTING, 2021, 456 : 327 - 337
  • [2] Lightweight Multi-Scale Asymmetric Attention Network for Image Super-Resolution
    Zhang, Min
    Wang, Huibin
    Zhang, Zhen
    Chen, Zhe
    Shen, Jie
    MICROMACHINES, 2022, 13 (01)
  • [3] Lightweight multi-scale distillation attention network for image super-resolution
    Tang, Yinggan
    Hu, Quanwei
    Bu, Chunning
    KNOWLEDGE-BASED SYSTEMS, 2025, 309
  • [4] Multi-scale convolutional attention network for lightweight image super-resolution
    Xie, Feng
    Lu, Pei
    Liu, Xiaoyong
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2023, 95
  • [5] Multi-scale attention network for image super-resolution
    Wang, Li
    Shen, Jie
    Tang, E.
    Zheng, Shengnan
    Xu, Lizhong
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2021, 80
  • [6] Image Super-Resolution Reconstruction Based on Lightweight Multi-Scale Channel Attention Network
    Zhou, D.-W.
    Li, W.-B.
    Li, J.-X.
    Huang, Z.-Y.
    TIEN TZU HSUEH PAO/ACTA ELECTRONICA SINICA, 2022, 50 (10): 2336 - 2346
  • [7] Lightweight multi-scale residual networks with attention for image super-resolution
    Liu, Huan
    Cao, Feilong
    Wen, Chenglin
    Zhang, Qinghua
    KNOWLEDGE-BASED SYSTEMS, 2020, 203
  • [8] An image super-resolution network based on multi-scale convolution fusion
    Yang, Xin
    Zhu, Yitian
    Guo, Yingqing
    Zhou, Dake
    VISUAL COMPUTER, 2022, 38 (12): : 4307 - 4317
  • [9] LMSN: a lightweight multi-scale network for single image super-resolution
    Zou, Yiye
    Yang, Xiaomin
    Albertini, Marcelo Keese
    Hussain, Farhan
    MULTIMEDIA SYSTEMS, 2021, 27 : 845 - 856