Attention augmented multi-scale network for single image super-resolution

Cited by: 13
Authors
Xiong, Chengyi [1 ,2 ]
Shi, Xiaodi [1 ]
Gao, Zhirong [3 ]
Wang, Ge [4 ]
Affiliations
[1] South Cent Univ Nationalities, Sch Elect & Informat Engn, Wuhan 430074, Peoples R China
[2] South Cent Univ Nationalities, Hubei Key Lab Intelligent Wireless Commun, Wuhan 430074, Peoples R China
[3] South Cent Univ Nationalities, Sch Comp Sci, Wuhan 430074, Peoples R China
[4] Rensselaer Polytech Inst, Dept Biomed Engn, Troy, NY 12180 USA
Funding
National Natural Science Foundation of China; Fundamental Research Funds for the Central Universities;
Keywords
Single image super-resolution; Attention mechanism; Multi-scale convolution; Feature recalibration and aggregation; Local hierarchical feature fusion; Global hierarchical feature fusion; CONVOLUTIONAL NETWORK;
DOI
10.1007/s10489-020-01869-z
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Multi-scale convolution can be used in a deep neural network (DNN) to extract, in parallel, a set of features with different receptive fields, which helps reduce network depth and training difficulty. The attention mechanism likewise offers great advantages in strengthening the representational power of a DNN. In this paper, we propose an attention augmented multi-scale network (AAMN) for single image super-resolution (SISR), in which deep features from different scales are discriminatively aggregated to improve performance. Specifically, the statistics of features at different scales are first computed by a global average pooling operation and then used as guidance to learn the optimal weight allocation for the subsequent feature recalibration and aggregation. Meanwhile, we adopt feature fusion at two levels to further boost reconstruction power: intra-group local hierarchical feature fusion (LHFF) and inter-group global hierarchical feature fusion (GHFF). Extensive experiments on public standard datasets demonstrate the superiority of our AAMN over state-of-the-art models in terms of not only quantitative and qualitative evaluation but also model complexity and efficiency.
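The abstract describes pooling per-scale statistics and learning gating weights for recalibration and aggregation, but gives no code. Below is a minimal pure-Python sketch of a squeeze-and-excitation-style gating over multi-scale branches, assuming the recalibration works in that spirit: global average pooling per branch, a linear layer plus sigmoid producing per-branch gates, and gated summation. The function names (`global_avg_pool`, `recalibrate`), the single-channel feature maps, and the single fixed linear layer are all illustrative simplifications, not the paper's actual (end-to-end learned, multi-channel) implementation.

```python
import math

def global_avg_pool(feature_map):
    """Squeeze step: reduce one H x W feature map (list of rows) to its mean."""
    total = sum(sum(row) for row in feature_map)
    count = len(feature_map) * len(feature_map[0])
    return total / count

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def recalibrate(branches, weights):
    """Excitation + recalibration sketch.

    `branches`: list of per-scale feature maps, each H x W.
    `weights`: one row per branch of a (here fixed, illustrative) linear
    layer; in the paper these parameters would be learned end-to-end.
    Each branch is rescaled by a sigmoid gate driven by the pooled
    statistics of all branches, then the branches are summed.
    """
    stats = [global_avg_pool(b) for b in branches]              # squeeze
    gates = [sigmoid(sum(w * s for w, s in zip(wrow, stats)))
             for wrow in weights]                               # excitation
    h, w = len(branches[0]), len(branches[0][0])
    fused = [[sum(gates[k] * branches[k][i][j] for k in range(len(branches)))
              for j in range(w)] for i in range(h)]             # recalibrate + aggregate
    return gates, fused

# Illustrative usage: two 2x2 single-channel branches.
branches = [[[1.0, 3.0], [5.0, 7.0]],   # branch 0, pooled mean 4.0
            [[2.0, 2.0], [2.0, 2.0]]]   # branch 1, pooled mean 2.0
gates, fused = recalibrate(branches, [[0.5, 0.0], [0.0, 0.5]])
```

With the hypothetical weights above, branch 0 receives gate `sigmoid(2.0)` and branch 1 receives `sigmoid(1.0)`, so the branch with stronger pooled statistics contributes more to the aggregated output, which is the discriminative-aggregation idea the abstract describes.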
Pages: 935 - 951
Page count: 17
Related papers
50 records in total
  • [1] Attention augmented multi-scale network for single image super-resolution
    Xiong, Chengyi
    Shi, Xiaodi
    Gao, Zhirong
    Wang, Ge
    [J]. APPLIED INTELLIGENCE, 2021, 51 : 935 - 951
  • [2] Multi-scale attention network for image super-resolution
    Wang, Li
    Shen, Jie
    Tang, E.
    Zheng, Shengnan
    Xu, Lizhong
    [J]. JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2021, 80
  • [3] Single image super-resolution based on multi-scale dense attention network
    Gao, Farong
    Wang, Yong
    Yang, Zhangyi
    Ma, Yuliang
    Zhang, Qizhong
    [J]. SOFT COMPUTING, 2023, 27 (06) : 2981 - 2992
  • [4] Multi-attention augmented network for single image super-resolution
    Chen, Rui
    Zhang, Heng
    Liu, Jixin
    [J]. PATTERN RECOGNITION, 2022, 122
  • [5] Attention-enhanced multi-scale residual network for single image super-resolution
    Sun, Yubin
    Qin, Jiongming
    Gao, Xuliang
    Chai, Shuiqin
    Chen, Bin
    [J]. SIGNAL IMAGE AND VIDEO PROCESSING, 2022, 16 (05) : 1417 - 1424
  • [6] Dual-attention guided multi-scale network for single image super-resolution
    Wen, Juan
    Zha, Lei
    [J]. APPLIED INTELLIGENCE, 2022, 52 (11) : 12258 - 12271
  • [7] Single image super-resolution via multi-scale residual channel attention network
    Cao, Feilong
    Liu, Huan
    [J]. NEUROCOMPUTING, 2019, 358 : 424 - 436