Dual-Branch Adaptive Convolutional Transformer for Hyperspectral Image Classification

Times Cited: 1
|
Authors
Wang, Chuanzhi [1 ]
Huang, Jun [1 ]
Lv, Mingyun [1 ]
Wu, Yongmei [1 ]
Qin, Ruiru [1 ]
Affiliations
[1] Beihang Univ, Sch Aeronaut Sci & Engn, Beijing 100191, Peoples R China
Keywords
hyperspectral image classification; adaptive multi-head self-attention; convolutional neural networks; transformers; RESIDUAL NETWORK; ATTENTION;
DOI
10.3390/rs16091615
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science];
Discipline Code
08 ; 0830 ;
Abstract
In hyperspectral image (HSI) classification, convolutional neural networks (CNNs) and transformer architectures have each contributed to considerable advances. CNNs offer strong local feature representation, whereas transformers excel at learning global features, making the two complementary. Nevertheless, both architectures are limited by static receptive fields, which hinders their accuracy in delineating subtle boundary discrepancies. To mitigate these limitations, we introduce a novel dual-branch adaptive convolutional transformer (DBACT) network architecture featuring an adaptive multi-head self-attention mechanism. The architecture begins with a triadic parallel stem structure for shallow feature extraction and reduction of the spectral dimension. A global branch with adaptive receptive fields then performs high-level global feature extraction, while a local branch with a cross-attention module contributes detailed local information that enriches the global representation. This integration combines the strengths of both branches, capturing representative spatial-spectral features from HSI. Comprehensive evaluation on three benchmark datasets shows that the DBACT model achieves superior classification performance compared with state-of-the-art models.
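As a rough illustration of the dual-branch design described in the abstract, the following is a minimal PyTorch sketch and not the authors' DBACT implementation: the TriadicStem and DualBranchClassifier names, the channel sizes, and the plain transformer encoder and cross-attention layers are illustrative assumptions standing in for the adaptive multi-head self-attention and adaptive receptive fields that the paper proposes. The global tokens query the local tokens through cross-attention, mirroring the abstract's description of the local branch enriching the global perspective.

```python
# Minimal sketch of a dual-branch convolutional-transformer HSI classifier.
# All module names, sizes, and the fusion scheme are assumptions for illustration,
# not the reference DBACT implementation.
import torch
import torch.nn as nn


class TriadicStem(nn.Module):
    """Three parallel convolutions for shallow features and spectral-dimension reduction."""
    def __init__(self, in_bands: int, dim: int):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(nn.Conv2d(in_bands, dim, k, padding=k // 2),
                          nn.BatchNorm2d(dim), nn.ReLU())
            for k in (1, 3, 5)
        ])
        self.fuse = nn.Conv2d(3 * dim, dim, kernel_size=1)

    def forward(self, x):                        # x: (B, bands, H, W)
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))


class DualBranchClassifier(nn.Module):
    def __init__(self, in_bands: int, num_classes: int, dim: int = 64,
                 heads: int = 4, depth: int = 2):
        super().__init__()
        self.stem = TriadicStem(in_bands, dim)
        # Global branch: a plain transformer encoder as a stand-in for the
        # adaptive multi-head self-attention described in the abstract.
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dim_feedforward=2 * dim, batch_first=True)
        self.global_branch = nn.TransformerEncoder(layer, num_layers=depth)
        # Local branch: lightweight convolutional block for fine spatial detail.
        self.local_branch = nn.Sequential(
            nn.Conv2d(dim, dim, 3, padding=1), nn.BatchNorm2d(dim), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, padding=1), nn.BatchNorm2d(dim), nn.ReLU(),
        )
        # Cross-attention: global tokens query the local tokens.
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):                        # x: (B, bands, H, W) HSI patch
        f = self.stem(x)                         # (B, dim, H, W)
        tokens = f.flatten(2).transpose(1, 2)    # (B, H*W, dim)
        g_tok = self.global_branch(tokens)       # global spatial-spectral context
        l_tok = self.local_branch(f).flatten(2).transpose(1, 2)
        fused, _ = self.cross_attn(query=g_tok, key=l_tok, value=l_tok)
        return self.head(fused.mean(dim=1))      # logits for the centre-pixel class


if __name__ == "__main__":
    model = DualBranchClassifier(in_bands=200, num_classes=16)
    logits = model(torch.randn(2, 200, 9, 9))    # e.g. 9x9 patches, 200 bands
    print(logits.shape)                          # torch.Size([2, 16])
```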
Pages: 20