Multi-Attention Residual Network for Image Super Resolution

Cited by: 2
Authors
Chang, Qing [1 ]
Jia, Xiaotian [1 ]
Lu, Chenhao [1 ]
Ye, Jian [1 ]
Affiliations
[1] East China Univ Sci & Technol, Coll Informat Sci & Engn, Shanghai 200237, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Super resolution; deep convolutional neural network; multi-attention residual network; global hierarchical feature fusion; SUPERRESOLUTION;
DOI
10.1142/S021800142254009X
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, many studies have shown that deep convolutional neural networks (CNNs) can achieve superior performance in image super resolution (SR). The majority of current CNN-based SR methods rely on deeper architectures to obtain excellent performance. However, as networks grow deeper and wider, the hierarchical features of low-resolution (LR) images cannot be exploited effectively. Moreover, most models cannot discriminate between different types of information and instead treat them equally, which limits their representational capacity. In this study, we propose the multi-attention residual network (MARN) to address these problems. Specifically, we propose a new multi-attention residual block (MARB), which combines an attention mechanism with a multi-scale residual structure. At the beginning of each residual block, the channel-wise importance of image features is adaptively recalibrated by the attention mechanism. Then, convolutional kernels of different sizes adaptively extract multi-attention features at different scales. At the end of each block, local multi-attention feature fusion is applied to obtain more effective hierarchical features. After the outputs of all MARBs are obtained, global hierarchical feature fusion jointly fuses all hierarchical features for image reconstruction. Extensive experiments show that our model outperforms most state-of-the-art methods.
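The abstract describes the MARB design (channel attention at the block input, parallel multi-scale convolutions, local feature fusion, and a residual connection) but the record contains no code. Below is a minimal PyTorch sketch of one such block, assuming a squeeze-and-excitation style channel attention and 3x3/5x5 convolution branches; all module names, kernel sizes, and hyperparameters are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style recalibration of per-channel importance."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # Global average pooling -> two 1x1 convs -> per-channel weights in (0, 1).
        return x * self.fc(self.pool(x))


class MARB(nn.Module):
    """Hypothetical multi-attention residual block:
    channel attention -> parallel multi-scale convs -> local fusion -> residual add."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.attention = ChannelAttention(channels)
        self.branch3 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)  # 3x3 scale
        self.branch5 = nn.Conv2d(channels, channels, kernel_size=5, padding=2)  # 5x5 scale
        self.act = nn.ReLU(inplace=True)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)  # local feature fusion

    def forward(self, x):
        a = self.attention(x)  # recalibrate channel importance at the block input
        multi = torch.cat(
            [self.act(self.branch3(a)), self.act(self.branch5(a))], dim=1
        )
        return x + self.fuse(multi)  # residual connection around the block


if __name__ == "__main__":
    block = MARB(channels=64)
    y = block(torch.randn(1, 64, 48, 48))
    print(y.shape)  # torch.Size([1, 64, 48, 48]) -- same spatial size as the input
```

A full MARN would stack several such blocks and, per the abstract, concatenate the outputs of all blocks for global hierarchical feature fusion before the reconstruction stage; that part is omitted from this sketch.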
Pages: 20
Related Papers
50 records in total
  • [1] Multi-attention augmented network for single image super-resolution
    Chen, Rui
    Zhang, Heng
    Liu, Jixin
    [J]. PATTERN RECOGNITION, 2022, 122
  • [2] Single Image Super Resolution via Multi-Attention Fusion Recurrent Network
    Kou, Qiqi
    Cheng, Deqiang
    Zhang, Haoxiang
    Liu, Jingjing
    Guo, Xin
    Jiang, He
    [J]. IEEE ACCESS, 2023, 11 : 98653 - 98665
  • [3] Gated Multi-Attention Feedback Network for Medical Image Super-Resolution
    Shang, Jianrun
    Zhang, Xue
    Zhang, Guisheng
    Song, Wenhao
    Chen, Jinyong
    Li, Qilei
    Gao, Mingliang
    [J]. ELECTRONICS, 2022, 11 (21)
  • [4] Multi-Attention Ghost Residual Fusion Network for Image Classification
    Jia, Xiaofen
    Du, Shengjie
    Guo, Yongcun
    Huang, Yourui
    Zhao, Baiting
    [J]. IEEE ACCESS, 2021, 9 : 81421 - 81431
  • [5] A Multi-Attention Feature Distillation Neural Network for Lightweight Single Image Super-Resolution
    Zhang, Yongfei
    Lin, Xinying
    Yang, Hong
    He, Jie
    Qing, Linbo
    He, Xiaohai
    Li, Yi
    Chen, Honggang
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2024, 2024
  • [6] Multi-attention fusion transformer for single-image super-resolution
    Li, Guanxing
    Cui, Zhaotong
    Li, Meng
    Han, Yu
    Li, Tianping
    [J]. SCIENTIFIC REPORTS, 2024, 14 (01):
  • [7] Image super-resolution with multi-scale fractal residual attention network
    Song, Xiaogang
    Liu, Wanbo
    Liang, Li
    Shi, Weiwei
    Xie, Guo
    Lu, Xiaofeng
    Hei, Xinhong
    [J]. COMPUTERS & GRAPHICS-UK, 2023, 113 : 21 - 31
  • [8] Residual shuffle attention network for image super-resolution
    Li, Xuanyi
    Shao, Zhuhong
    Li, Bicao
    Shang, Yuanyuan
    Wu, Jiasong
    Duan, Yuping
    [J]. MACHINE VISION AND APPLICATIONS, 2023, 34 (05)
  • [9] Residual shuffle attention network for image super-resolution
    Li, Zhiwei
    Zhang, Yaping
    Yang, Yuwei
    [J]. JOURNAL OF PHYSICS: CONFERENCE SERIES, 2021, 2025 (01)