Single image super-resolution via global aware external attention and multi-scale residual channel attention network

Cited: 2
|
Authors
Liu, Mingming [1 ,2 ]
Li, Sui [2 ,3 ]
Liu, Bing [2 ,3 ]
Yang, Yuxin [2 ,3 ]
Liu, Peng [4 ]
Zhang, Chen [2 ,3 ]
Affiliations
[1] Jiangsu Vocat Inst Architectural Technol, Sch Intelligent Mfg, Xuzhou 221000, Jiangsu, Peoples R China
[2] China Univ Min & Technol, Sch Comp Sci & Technol, Xuzhou 221116, Jiangsu, Peoples R China
[3] Minist Educ, Mine Digitizat Engn Res Ctr, Xuzhou, Peoples R China
[4] Natl Joint Engn Lab Internet Appl Technol Mines, Xuzhou 221008, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Single image super-resolution; Deep feature extraction structure; Deep-connected multi-scale residual attention block; Local aware channel attention; Global aware external attention; INTERPOLATION;
DOI
10.1007/s13042-023-02030-1
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, deep convolutional neural networks (CNNs) have shown significant advantages in improving the performance of single image super-resolution (SISR). To build an efficient network, multi-scale convolution is commonly incorporated into CNN-based SISR methods to extract features with different receptive fields. However, existing multi-scale SISR approaches do not fully exploit the feature correlations within the same sample, impeding further improvement of reconstruction performance. In addition, the correlations between different samples are left unexplored. To address these problems, this paper proposes a deep-connected multi-scale residual attention network (DMRAN) that exploits both the feature correlations within the same sample and the correlations between different samples. Specifically, we propose a deep-connected multi-scale residual attention block (DMRAB) to take full advantage of multi-scale and hierarchical features, which effectively learns the local interdependencies between channels by adaptively adjusting the channel features. Meanwhile, a global aware external attention (GAEA) is introduced to boost SISR performance by learning the correlations between all samples. Furthermore, we develop a deep feature extraction structure (DFES), which seamlessly combines stacked deep-connected multi-scale residual attention groups (DMRAG) with GAEA to learn deep feature representations incrementally. Extensive experiments on public benchmark datasets show the superiority of our DMRAN over state-of-the-art SISR methods.
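The GAEA module described above builds on the general idea of external attention: instead of computing self-attention within one sample, features attend to small learnable memory units that are shared across the whole dataset, which is how correlations between different samples can be captured. The following is a minimal NumPy sketch of generic external attention with double normalization, not the authors' exact GAEA module; the memory sizes, shapes, and function name are illustrative assumptions.

```python
import numpy as np

def external_attention(x, mk, mv):
    """Sketch of generic external attention.

    x:  (n, d) input features (n spatial positions, d channels)
    mk: (s, d) learnable external key memory, shared across all samples
    mv: (s, d) learnable external value memory, shared across all samples
    """
    attn = x @ mk.T                                   # (n, s) similarity to external keys
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)           # softmax over memory slots
    attn /= attn.sum(axis=0, keepdims=True) + 1e-9    # second (column) normalization
    return attn @ mv                                  # (n, d) features re-expressed via shared memory

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))   # 16 feature vectors, 8 channels
mk = rng.standard_normal((4, 8))   # 4 external memory slots
mv = rng.standard_normal((4, 8))
y = external_attention(x, mk, mv)
print(y.shape)  # (16, 8)
```

Because `mk` and `mv` are trained parameters rather than projections of the current input, they act as a compact memory of the whole training set, which is what lets this style of attention model cross-sample correlations at linear cost in the number of positions.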
Pages: 2309-2321
Page count: 13
Related Papers
50 records in total
  • [41] Multi-Scale Pixel-Attention Feedback Link Network for Single Image Super-Resolution
    Ge, Yanliang
    Tan, Shuang
    Bi, Hongbo
    Sun, Xiaoxiao
    PATTERN RECOGNITION AND IMAGE ANALYSIS, 2022, 32 (02) : 393 - 401
  • [42] Multi-scale non-local attention network for image super-resolution
    Wu, Xue
    Zhang, Kaibing
    Hu, Yanting
    He, Xin
    Gao, Xinbo
    SIGNAL PROCESSING, 2024, 218
  • [43] Efficient residual attention network for single image super-resolution
    Fangwei Hao
    Taiping Zhang
    Linchang Zhao
    Yuanyan Tang
    Applied Intelligence, 2022, 52 : 652 - 661
  • [44] Efficient residual attention network for single image super-resolution
    Hao, Fangwei
    Zhang, Taiping
    Zhao, Linchang
    Tang, Yuanyan
    APPLIED INTELLIGENCE, 2022, 52 (01) : 652 - 661
  • [45] Quaternion attention multi-scale widening network for endoscopy image super-resolution
    Lin, Junyu
    Huang, Guoheng
    Huang, Jun
    Yuan, Xiaochen
    Zeng, Yiwen
    Shi, Cheng
    PHYSICS IN MEDICINE AND BIOLOGY, 2023, 68 (07):
  • [46] Lightweight Single Image Super-Resolution With Multi-Scale Spatial Attention Networks
    Soh, Jae Woong
    Cho, Nam Ik
    IEEE ACCESS, 2020, 8 : 35383 - 35391
  • [47] Image super-resolution reconstruction with multi-scale attention fusion
    Chen, Chun-yi
    Wu, Xin-yi
    Hu, Xiao-juan
    Yu, Hai-yang
    CHINESE OPTICS, 2023, 16 (05) : 1034 - 1044
  • [48] TBNet: Stereo Image Super-Resolution with Multi-Scale Attention
    Zhu, Jiyang
    Han, Xue
    JOURNAL OF CIRCUITS SYSTEMS AND COMPUTERS, 2023, 32 (18)
  • [49] Image super-resolution via channel attention and spatial attention
    Enmin Lu
    Xiaoxiao Hu
    Applied Intelligence, 2022, 52 : 2260 - 2268
  • [50] Image super-resolution via channel attention and spatial attention
    Lu, Enmin
    Hu, Xiaoxiao
    APPLIED INTELLIGENCE, 2022, 52 (02) : 2260 - 2268