Texture-Aware Self-Attention Model for Hyperspectral Tree Species Classification

Cited by: 4
Authors
Li, Nanying [1,2]
Jiang, Shuguo [1,2]
Xue, Jiaqi [1,2]
Ye, Songxin [1,2]
Jia, Sen [1,2]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software Engn, Guangdong Hong Kong Macau Joint Lab Smart Cities, Minist Nat Resources, Shenzhen 518060, Peoples R China
[2] Shenzhen Univ, Key Lab Geoenvironm Monitoring Coastal Zone, Minist Nat Resources, Shenzhen 518060, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Carbon sinks; hyperspectral images (HSIs); spectral-spatial attention module; tree species classification; TRANSFORMER; NETWORKS; LEVEL;
DOI
10.1109/TGRS.2023.3344787
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline Codes
0708 ; 070902 ;
Abstract
Forests play an irreplaceable role as carbon sinks. However, carbon sink capacity differs markedly between tree species, so scientific and accurate identification of surface forest vegetation is key to achieving the dual-carbon goal. Due to the disordered distribution of trees, varied crown geometry, and the difficulty of labeling tree species, traditional methods represent complex spatial-spectral structures poorly; how to quickly and accurately capture the key, subtle features that distinguish tree species therefore remains an open problem. To address these issues, a texture-aware self-attention model (TASAM) is proposed that improves spatial contrast and overcomes spectral variability, achieving accurate tree species classification from hyperspectral images (HSIs). In our model, a nested spatial pyramid module is first constructed to extract the multiview, multiscale features that highlight the distinction between tree species and the surrounding background. In addition, a cross spectral-spatial attention module is designed that captures joint spatial-spectral features over the entire image domain. Gabor features are introduced as an auxiliary signal that guides the self-attention to focus autonomously on latent spatial texture, extracting more appropriate and accurate information and further enhancing the separation between target and background. Verification experiments on three tree species hyperspectral datasets show that the proposed method obtains finer and more accurate tree species classification under the condition of limited labeled samples. The method can effectively solve tree species classification in complex forest structures and can meet the application requirements of HSI-based tree species diversity monitoring, forestry resource investigation, and forestry carbon sink analysis.
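The abstract's core idea — using Gabor texture responses to guide where self-attention looks in a hyperspectral patch — can be sketched as follows. This is a minimal illustration, not the paper's TASAM architecture: the identity Q/K/V projections, the single-scalar texture summary, and the additive texture bias are all simplifying assumptions made for this sketch.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(size=7, sigma=2.0, theta=0.0, lam=4.0, gamma=0.5):
    """Real part of a 2-D Gabor filter: an oriented texture detector."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr) ** 2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def texture_guided_attention(spectra, texture, alpha=1.0):
    """Scaled dot-product self-attention over pixels with a texture bias.

    spectra: (n, d) spectral vectors, one token per pixel
    texture: (n,) scalar Gabor response per pixel
    Pixels with similar texture responses attend to each other more strongly.
    """
    n, d = spectra.shape
    q = k = v = spectra                                   # identity projections (sketch only)
    bias = -alpha * np.abs(texture[:, None] - texture[None, :])
    logits = q @ k.T / np.sqrt(d) + bias
    logits -= logits.max(axis=-1, keepdims=True)          # numerical stability
    w = np.exp(logits)
    w /= w.sum(axis=-1, keepdims=True)                    # rows sum to 1
    return w @ v, w

rng = np.random.default_rng(0)
cube = rng.standard_normal((9, 9, 16))                    # toy 9x9 HSI patch, 16 bands

# Gabor energy of the band-averaged image, max-pooled over 4 orientations
gray = cube.mean(axis=-1)
resp = np.max([np.abs(convolve(gray, gabor_kernel(theta=t), mode="reflect"))
               for t in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)], axis=0)

# out: (81, 16) texture-aware attended spectral features
out, attn = texture_guided_attention(cube.reshape(-1, 16), resp.ravel())
```

In the sketch, the bias simply rewards texture similarity; the paper instead learns how the Gabor responses steer the attention, but the mechanism — texture features modulating the attention map rather than being concatenated as extra bands — is the same.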
Pages: 15