MSAN: Multiscale self-attention network for pansharpening

Cited: 0
Authors
Lu, Hangyuan [1 ,2 ]
Yang, Yong [3 ]
Huang, Shuying [4 ]
Liu, Rixian [1 ,2 ]
Guo, Huimin [5 ]
Affiliations
[1] Jinhua Univ Vocat Technol, Coll Informat Engn, Jinhua 321007, Peoples R China
[2] Jinhua Univ Vocat Technol, Key Lab Crop Harvesting Equipment Technol Zhejiang, Jinhua 321007, Peoples R China
[3] Tiangong Univ, Sch Comp Sci & Technol, Tianjin 300387, Peoples R China
[4] Tiangong Univ, Sch Software, Tianjin 300387, Peoples R China
[5] Xiamen Univ, Sch Informat, Xiamen 361005, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Pansharpening; Multiscale; Self-attention; Swin Transformer; Fusion; Images;
DOI
10.1016/j.patcog.2025.111441
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Effective extraction of spectral-spatial features from multispectral (MS) and panchromatic (PAN) images is critical for high-quality pansharpening. However, existing deep learning methods often overlook local misalignment and struggle to integrate local and long-range features effectively, resulting in spectral and spatial distortions. To address these challenges, this paper proposes a refined detail injection model that adaptively learns injection coefficients using long-range features. Building upon this model, a multiscale self-attention network (MSAN) is proposed, consisting of a feature extraction branch and a self-attention mechanism branch. In the former branch, a two-stage multiscale convolution network is designed to fully extract detail features with multiple receptive fields. In the latter branch, a streamlined Swin Transformer (SST) is proposed to efficiently generate multiscale self-attention maps by learning the correlation between local and long-range features. To better preserve spectral-spatial information, a revised Swin Transformer block is proposed by incorporating spectral and spatial attention within the block. The obtained self-attention maps from SST serve as the injection coefficients to refine the extracted details, which are then injected into the upsampled MS image to produce the final fused image. Experimental validation demonstrates the superiority of MSAN over traditional and state-of-the-art methods, with competitive efficiency. The code of this work will be released on GitHub once the paper is accepted.
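The record does not include an implementation; as a rough illustration of the detail-injection formulation the abstract describes (fused image = upsampled MS image + injection coefficients x extracted details), the following minimal PyTorch sketch is included. The module names, channel sizes, and the simplified convolutional attention head standing in for the streamlined Swin Transformer branch are assumptions for illustration, not the authors' released code.

# Minimal sketch of the detail-injection formulation from the abstract:
# fused = upsampled MS + (learned injection coefficients) * (extracted details).
# All module names, channel counts, and the simplified attention branch are
# assumptions; the paper's actual SST/MSAN architecture is not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiscaleDetailBranch(nn.Module):
    """Stand-in for the two-stage multiscale convolution branch (assumed kernels 3/5/7)."""
    def __init__(self, in_ch, feat_ch=32):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv2d(in_ch, feat_ch, k, padding=k // 2) for k in (3, 5, 7)]
        )
        self.fuse = nn.Conv2d(3 * feat_ch, in_ch - 1, 3, padding=1)  # output = MS bands

    def forward(self, x):
        feats = [torch.relu(conv(x)) for conv in self.convs]
        return self.fuse(torch.cat(feats, dim=1))


class InjectionCoefficientBranch(nn.Module):
    """Placeholder for the streamlined Swin Transformer branch: a plain convolutional
    head producing per-pixel, per-band injection coefficients in (0, 1)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, out_ch, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)


class DetailInjectionFusion(nn.Module):
    def __init__(self, ms_bands=4):
        super().__init__()
        in_ch = ms_bands + 1                      # concatenated upsampled MS + PAN
        self.details = MultiscaleDetailBranch(in_ch)
        self.coeffs = InjectionCoefficientBranch(in_ch, ms_bands)

    def forward(self, ms_lr, pan):
        ms_up = F.interpolate(ms_lr, size=pan.shape[-2:], mode="bicubic",
                              align_corners=False)
        x = torch.cat([ms_up, pan], dim=1)
        d = self.details(x)                        # extracted detail features
        a = self.coeffs(x)                         # learned injection coefficients
        return ms_up + a * d                       # refined detail injection


if __name__ == "__main__":
    model = DetailInjectionFusion(ms_bands=4)
    ms = torch.rand(1, 4, 64, 64)                  # low-resolution MS patch
    pan = torch.rand(1, 1, 256, 256)               # matching PAN patch
    print(model(ms, pan).shape)                    # torch.Size([1, 4, 256, 256])

The design point mirrored here is that the attention branch outputs per-pixel, per-band coefficients that scale the extracted details before they are added to the upsampled MS image, rather than injecting the details uniformly.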
Pages: 17