Remote Sensing Scene Classification Based on Semantic-Aware Fusion Network

Cited: 0
Authors
Song, Wanying [1 ,2 ]
Zhang, Yingying [1 ]
Wang, Chi [1 ]
Jiang, Yinyin [1 ]
Wu, Yan [3 ]
Zhang, Peng [3 ]
Affiliations
[1] Xian Univ Sci & Technol, Sch Commun & Informat Engn, Xian 710054, Peoples R China
[2] Xidian Univ, Sch Elect Engn, Xian 710071, Peoples R China
[3] Xidian Univ, Sch Elect Engn, Xian 710071, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Discriminative features; feature fusion; remote sensing scene classification (RSSC); semantic-aware;
DOI
10.1109/LGRS.2024.3470773
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry];
Subject Classification Codes
0708 ; 070902 ;
Abstract
Remote sensing scene classification (RSSC) based on convolutional neural networks (CNNs) is generally limited by complex background interference and the difficulty of identifying key targets in an image. Thus, this letter proposes a semantic-aware fusion network for RSSC, abbreviated as SAF-Net, to better construct discriminative features and fuse them effectively for classification. The proposed SAF-Net, which employs a ResNet50 pretrained on the ImageNet dataset as its backbone, mainly consists of a semantic-aware module and a multilayer feature fusion module (MFFM). The semantic-aware module utilizes a spatial enhanced module (SEM) and a covariance channel attention module (CCAM) to accurately capture discriminative semantic features. It can precisely identify and extract the essential semantic elements in an image, such as distinct object types and their spatial distributions. Then, the MFFM uses the features learned by the semantic-aware module to guide the other layers for effective feature fusion through a self-attention mechanism. This not only enriches the feature representation of SAF-Net but also ensures the effective fusion of semantic information. Extensive comparisons and ablation experiments on remote sensing datasets demonstrate the effectiveness of the proposed SAF-Net and verify that it can greatly improve classification performance.
Pages: 5
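The abstract above names only the building blocks of SAF-Net (a pretrained ResNet50 backbone, a semantic-aware module built from an SEM and a CCAM, and an MFFM that fuses multilayer features through self-attention), not their exact designs. The following is a minimal PyTorch sketch of one plausible arrangement under those assumptions: the SEM, CCAM, and MFFM bodies, the 256-D projection size, and the 45-class head (e.g., for NWPU-RESISC45) are illustrative placeholders, not the letter's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50, ResNet50_Weights  # torchvision >= 0.13


class SEM(nn.Module):
    """Spatial enhanced module (illustrative): a simple spatial gate."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        # Pool along the channel axis, then predict a per-pixel gate.
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CCAM(nn.Module):
    """Covariance channel attention (illustrative): channel weights derived
    from second-order (channel-covariance) statistics of the feature map."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        b, c, h, w = x.shape
        feat = x.flatten(2) - x.flatten(2).mean(dim=2, keepdim=True)  # (B, C, HW)
        cov = torch.bmm(feat, feat.transpose(1, 2)) / (h * w - 1)     # (B, C, C)
        weights = self.fc(cov.mean(dim=2))   # summarize each channel's covariances
        return x * weights.view(b, c, 1, 1)


class MFFM(nn.Module):
    """Multilayer feature fusion (illustrative): the semantic descriptor is the
    query of a multi-head attention block over the shallower descriptors."""
    def __init__(self, dim=256, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, semantic, shallow):
        tokens = torch.stack(shallow, dim=1)        # (B, L, dim)
        fused, _ = self.attn(semantic.unsqueeze(1), tokens, tokens)
        return fused.squeeze(1) + semantic          # residual fusion


class SAFNet(nn.Module):
    def __init__(self, num_classes=45, dim=256, pretrained=True):
        super().__init__()
        weights = ResNet50_Weights.IMAGENET1K_V1 if pretrained else None
        b = resnet50(weights=weights)
        self.stem = nn.Sequential(b.conv1, b.bn1, b.relu, b.maxpool)
        self.layers = nn.ModuleList([b.layer1, b.layer2, b.layer3, b.layer4])
        self.sem, self.ccam = SEM(), CCAM(2048)
        # Project each stage's global-average-pooled feature to a common dim.
        self.proj = nn.ModuleList([nn.Linear(c, dim) for c in (256, 512, 1024, 2048)])
        self.mffm = MFFM(dim)
        self.classifier = nn.Linear(dim, num_classes)

    def forward(self, x):
        x, feats = self.stem(x), []
        for layer in self.layers:
            x = layer(x)
            feats.append(x)
        feats[-1] = self.ccam(self.sem(feats[-1]))  # semantic-aware deepest feature
        desc = [p(F.adaptive_avg_pool2d(f, 1).flatten(1))
                for p, f in zip(self.proj, feats)]
        return self.classifier(self.mffm(desc[-1], desc[:-1]))


if __name__ == "__main__":
    logits = SAFNet(num_classes=45, pretrained=False)(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 45])
```

The sketch follows the abstract's data flow: the deepest ResNet50 feature is refined by the SEM and CCAM, and the resulting semantic descriptor guides the fusion of the shallower stages through an attention query.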
Related Papers
50 records in total
  • [1] SAGN: Semantic-Aware Graph Network for Remote Sensing Scene Classification
    Yang, Yuqun
    Tang, Xu
    Cheung, Yiu-Ming
    Zhang, Xiangrong
    Jiao, Licheng
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 1011 - 1025
  • [2] Remote Sensing Scene Classification Based on Multibranch Fusion Attention Network
    Shi, Jiacheng
    Liu, Wei
    Shan, Haoyu
    Li, Erzhu
    Li, Xing
    Zhang, Lianpeng
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [3] Semantic-Aware Fusion Network Based on Super-Resolution
    Xu, Lingfeng
    Zou, Qiang
    SENSORS, 2024, 24 (11)
  • [4] Semantic-aware scene recognition
    Lopez-Cifuentes, Alejandro
    Escudero-Vinolo, Marcos
    Bescos, Jesus
    Garcia-Martin, Alvaro
    PATTERN RECOGNITION, 2020, 102
  • [5] Deep semantic-aware remote sensing image deblurring
    Song, Zhenbo
    Zhang, Zhenyuan
    Fang, Feiyi
    Fan, Zhaoxin
    Lu, Jianfeng
    SIGNAL PROCESSING, 2023, 211
  • [6] Multilayer Feature Fusion Network for Scene Classification in Remote Sensing
    Xu, Kejie
    Huang, Hong
    Li, Yuan
    Shi, Guangyao
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2020, 17 (11) : 1894 - 1898
  • [7] Semantic-Aware Dehazing Network With Adaptive Feature Fusion
    Zhang, Shengdong
    Ren, Wenqi
    Tan, Xin
    Wang, Zhi-Jie
    Liu, Yong
    Zhang, Jingang
    Zhang, Xiaoqin
    Cao, Xiaochun
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (01) : 454 - 467
  • [8] Semantic-aware visual scene representation
    Parseh, Mohammad Javad
    Rahmanimanesh, Mohammad
    Keshavarzi, Parviz
    Azimifar, Zohreh
    INTERNATIONAL JOURNAL OF MULTIMEDIA INFORMATION RETRIEVAL, 2022, 11 (04) : 619 - 638
  • [9] ATTENTION BASED NETWORK FOR REMOTE SENSING SCENE CLASSIFICATION
    Liu, Shaoteng
    Wang, Qi
    Li, Xuelong
    IGARSS 2018 - 2018 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2018, : 4740 - 4743