CT-UNet: Context-Transfer-UNet for Building Segmentation in Remote Sensing Images

Cited by: 17
Authors
Liu, Sheng [1 ]
Ye, Huanran [1 ]
Jin, Kun [1 ]
Cheng, Haohao [1 ]
Affiliations
[1] Zhejiang Univ Technol, Inst Comp Sci & Technol, Hangzhou 310023, Zhejiang, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
Remote sensing images; Building segmentation; U-Net; Context information; Attention models;
DOI
10.1007/s11063-021-10592-w
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the proliferation of remote sensing images, segmenting buildings accurately in such images has become a critical challenge. First, most networks have poor recognition ability on high-resolution images, which leads to blurred boundaries in the segmented building maps. Second, the similarity between buildings and background causes intra-class inconsistency. To address these two problems, we propose a UNet-based network named Context-Transfer-UNet (CT-UNet). Specifically, we design a Dense Boundary Block: the Dense Block exploits a feature-reuse mechanism to refine features and improve recognition capability, while the Boundary Block introduces low-level spatial information to alleviate the fuzzy-boundary problem. Then, to handle intra-class inconsistency, we construct a Spatial Channel Attention Block, which combines contextual spatial information and selects more discriminative features along both the spatial and channel dimensions. Finally, we propose an improved loss function that incorporates an evaluation indicator into the training objective. Based on the proposed CT-UNet, we achieve 85.33% mean IoU on the Inria dataset, 91.00% mean IoU on the WHU dataset and an 83.92% F1-score on the Massachusetts dataset. These results outperform our baseline (U-Net ResNet-34) by 3.76%, exceed Web-Net by 2.24% and surpass HFSA-Unet by 2.17%.
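The abstract states that the improved loss adds an evaluation indicator to the training objective but does not give the exact formulation. Below is a minimal sketch, assuming the indicator is a differentiable (soft) IoU term added to binary cross-entropy; the class name BCESoftIoULoss and the weighting factor iou_weight are hypothetical illustrations, not taken from the paper.

import torch
import torch.nn as nn

class BCESoftIoULoss(nn.Module):
    # Hypothetical combined loss: binary cross-entropy plus a soft-IoU penalty.
    # The exact CT-UNet formulation is not given in the abstract; this is an assumption.
    def __init__(self, iou_weight=1.0, eps=1e-6):
        super().__init__()
        self.bce = nn.BCEWithLogitsLoss()
        self.iou_weight = iou_weight
        self.eps = eps

    def forward(self, logits, target):
        bce = self.bce(logits, target)
        prob = torch.sigmoid(logits)
        # Soft IoU computed per sample on predicted probabilities, then averaged.
        dims = (1, 2, 3)
        intersection = (prob * target).sum(dims)
        union = (prob + target - prob * target).sum(dims)
        soft_iou = (intersection + self.eps) / (union + self.eps)
        return bce + self.iou_weight * (1.0 - soft_iou.mean())

# Usage on dummy building masks (batch of 2, single channel, 256x256).
if __name__ == "__main__":
    loss_fn = BCESoftIoULoss(iou_weight=1.0)
    logits = torch.randn(2, 1, 256, 256)                      # raw network outputs
    target = torch.randint(0, 2, (2, 1, 256, 256)).float()    # binary ground truth
    print(loss_fn(logits, target).item())

Maximizing the soft-IoU term directly optimizes the same quantity used for evaluation (mean IoU), which is the stated purpose of adding an evaluation indicator to the loss.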
Pages: 4257-4277
Number of pages: 21
Related Papers
50 records total
  • [31] GateFormer: Gate Attention UNet With Transformer for Change Detection of Remote Sensing Images
    Li, Li-Li
    You, Zhi-Hui
    Chen, Si-Bao
    Huang, Li-Li
    Tang, Jin
    Luo, Bin
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 871 - 883
  • [32] MS-UNet: A multi-scale UNet with feature recalibration approach for automatic liver and tumor segmentation in CT images
    Kushnure, Devidas T.
    Talbar, Sanjay N.
    COMPUTERIZED MEDICAL IMAGING AND GRAPHICS, 2021, 89
  • [33] DASGC-Unet: An Attention Network for Accurate Segmentation of Liver CT Images
    Zhang, Xiaoqian
    Chen, Yufeng
    Pu, Lei
    He, Youdong
    Zhou, Ying
    Sun, Huaijiang
    NEURAL PROCESSING LETTERS, 2023, 55 (09) : 12289 - 12308
  • [34] Pulmonary CT Images Segmentation using CNN and UNet Models of Deep Learning
    Shaziya, Humera
    Shyamala, K.
    2020 IEEE PUNE SECTION INTERNATIONAL CONFERENCE (PUNECON), 2020, : 195 - 201
  • [36] Improved UNet++ for Tree Rings Segmentation of Chinese Fir CT Images
    Liu, Shuai
    Ge, Zhedong
    Liu, Xiaotong
    Gao, Yisheng
    Li, Yang
    Li, Mengfei
    Computer Engineering and Applications, 2024, 60 (05) : 232 - 239
  • [37] SD-UNet: A Novel Segmentation Framework for CT Images of Lung Infections
    Yin, Shuangcai
    Deng, Hongmin
    Xu, Zelin
    Zhu, Qilin
    Cheng, Junfeng
    ELECTRONICS, 2022, 11 (01)
  • [38] CFM-UNet: A Joint CNN and Transformer Network via Cross Feature Modulation for Remote Sensing Images Segmentation
    Wang, Min
    Wang, Peidong
    Journal of Geodesy and Geoinformation Science, 2023, 6 (04) : 40 - 47
  • [39] Water body segmentation in remote sensing images based on multi-scale fusion attention module improved UNet
    Shi, Tian-Tan
    Guo, Zhong-Hua
    Yan, Xiang
    Wei, Shi-Qin
    CHINESE JOURNAL OF LIQUID CRYSTALS AND DISPLAYS, 2023, 38 (03) : 397 - 408
  • [40] Res50-SimAM-ASPP-Unet: A Semantic Segmentation Model for High-Resolution Remote Sensing Images
    Cai, Jiajing
    Shi, Jinmei
    Leau, Yu-Beng
    Meng, Shangyu
    Zheng, Xiuyan
    Zhou, Jinghe
    IEEE Access, 2024, 12 : 192301 - 192316