Inception residual attention network for remote sensing image super-resolution

Cited: 2
Authors
Lei, Pengcheng [1 ]
Liu, Cong [1 ]
Affiliation
[1] Univ Shanghai Sci & Technol, Sch Opt & Comp Engn, Shanghai 200093, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
HALLUCINATION;
DOI
10.1080/01431161.2020.1800129
CLC Number
TP7 [Remote Sensing Technology]
Discipline Codes
081102; 0816; 081602; 083002; 1404
Abstract
Enhancing the spatial resolution of remote sensing images is an important problem. Many image super-resolution (SR) techniques have been proposed for this purpose, and in recent years the deep convolutional neural network (CNN) has been the most effective approach. However, we observe that most CNN-based SR methods treat low-frequency and high-frequency areas equally, which hinders the recovery of high-frequency information. In this paper, we propose a network named the inception residual attention network (IRAN) to address this problem. Specifically, we propose a spatial attention module that lets the network adaptively learn the importance of different spatial areas, so that it pays more attention to areas rich in high-frequency information. Furthermore, we present an inception module that fuses local multilevel features, providing richer information for reconstructing detailed textures. To evaluate the effectiveness of the proposed method, extensive experiments are performed on the UCMerced-LandUse dataset; the results show that the proposed method outperforms current state-of-the-art methods in both visual quality and objective metrics.
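As a rough, non-authoritative illustration of the two components the abstract names, here is a minimal PyTorch sketch pairing a spatial attention module (a learned per-pixel importance mask) with an inception-style multi-scale fusion block. All class names, layer widths, kernel sizes, and the residual wiring are assumptions for illustration; the paper's actual IRAN architecture may differ.

```python
# Minimal sketch, NOT the authors' code: module names, widths, and kernel
# sizes below are illustrative assumptions.
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Learn a per-pixel importance map so high-frequency areas get more weight."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.attend = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 1, kernel_size=1),
            nn.Sigmoid(),  # one weight in [0, 1] per spatial location
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.attend(x)  # broadcast the mask over all channels


class InceptionFusion(nn.Module):
    """Fuse local multilevel features with parallel convolutions of varied scale."""

    def __init__(self, channels: int):
        super().__init__()
        branch = channels // 4
        self.b1 = nn.Conv2d(channels, branch, kernel_size=1)
        self.b3 = nn.Conv2d(channels, branch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(channels, branch, kernel_size=5, padding=2)
        self.b7 = nn.Conv2d(channels, branch, kernel_size=7, padding=3)
        self.fuse = nn.Conv2d(branch * 4, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        multi = torch.cat([self.b1(x), self.b3(x), self.b5(x), self.b7(x)], dim=1)
        return x + self.fuse(multi)  # residual path keeps the input features


if __name__ == "__main__":
    feats = torch.randn(1, 64, 48, 48)  # a 64-channel feature map
    block = nn.Sequential(InceptionFusion(64), SpatialAttention(64))
    print(block(feats).shape)  # torch.Size([1, 64, 48, 48])
```

Stacking such a block inside residual groups and reweighting features before the upsampling/reconstruction stage is the general pattern the abstract describes.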
Pages: 9565-9587
Page count: 23
Related Papers (50 records)
  • [1] Remote Sensing Image Super-Resolution With Residual Split Attention Mechanism
    Chen, Xitong
    Wu, Yuntao
    Lu, Tao
    Kong, Quan
    Wang, Jiaming
    Wang, Yu
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2023, 16: 1-13
  • [2] Remote Sensing Image Super-Resolution via Residual-Dense Hybrid Attention Network
    Yu, Bo
    Lei, Bin
    Guo, Jiayi
    Sun, Jiande
    Li, Shengtao
    Xie, Guangshuai
    [J]. REMOTE SENSING, 2022, 14 (22)
  • [3] Global sparse attention network for remote sensing image super-resolution
    Hu, Tao
    Chen, Zijie
    Wang, Mingyi
    Hou, Xintong
    Lu, Xiaoping
    Pan, Yuanyuan
    Li, Jianqing
    [J]. KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [4] Deep Residual Squeeze and Excitation Network for Remote Sensing Image Super-Resolution
    Gu, Jun
    Sun, Xian
    Zhang, Yue
    Fu, Kun
    Wang, Lei
    [J]. REMOTE SENSING, 2019, 11 (15)
  • [5] Remote Sensing Image Super-Resolution Based on Dense Channel Attention Network
    Ma, Yunchuan
    Lv, Pengyuan
    Liu, Hao
    Sun, Xuehong
    Zhong, Yanfei
    [J]. REMOTE SENSING, 2021, 13 (15)
  • [6] Residual shuffle attention network for image super-resolution
    Li, Xuanyi
    Shao, Zhuhong
    Li, Bicao
    Shang, Yuanyuan
    Wu, Jiasong
    Duan, Yuping
    [J]. MACHINE VISION AND APPLICATIONS, 2023, 34 (05)
  • [7] A Two-Branch Multiscale Residual Attention Network for Single Image Super-Resolution in Remote Sensing Imagery
    Patnaik, Allen
    Bhuyan, M. K.
    MacDorman, Karl F.
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 6003 - 6013
  • [8] RIMS: Residual-Inception Multiscale Image Super-Resolution Network
    Muhammad, Wazir
    Bhutto, Zuhaibuddin
    Shah, Jalal
    Shaikh, Murtaza Hussain
    Shah, Syed Ali Raza
    Butt, Shah Muhammad
    Masroor, Salman
    Hussain, Ayaz
    [J]. INTERNATIONAL JOURNAL OF COMPUTER SCIENCE AND NETWORK SECURITY, 2022, 22 (01): 588-592
  • [9] RFCNet: Remote Sensing Image Super-Resolution Using Residual Feature Calibration Network
    Xue, Yuan
    Li, Liangliang
    Wang, Zheyuan
    Jiang, Chenchen
    Liu, Minqin
    Wang, Jiawen
    Sun, Kaipeng
    Ma, Hongbing
    [J]. TSINGHUA SCIENCE AND TECHNOLOGY, 2023, 28 (03): 475-485