Building damage detection based on multi-source adversarial domain adaptation

Cited: 3
Authors
Wang, Xiang [1 ]
Li, Yundong [1 ]
Lin, Chen [1 ]
Liu, Yi [1 ]
Geng, Shuo [1 ]
Affiliation
[1] North China Univ Technol, Informat Sci & Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
remote sensing imagery; building damage detection; domain adaptation; multi-source domain; adapted source domain; transfer learning;
DOI
10.1117/1.JRS.15.036503
CLC number
X [Environmental Science, Safety Science];
Discipline classification code
08; 0830;
Abstract
Building damage assessment plays an essential role in post-disaster rescue operations. Because labeled samples are difficult to obtain in a timely manner after a disaster, transfer learning has attracted increasing attention. However, the use of different sensors causes considerable discrepancies not only between historical and current scenes but also among historical scenes, which can degrade transfer performance. Therefore, a multi-source adversarial domain adaptation (MADA) method is proposed in this paper for post-disaster building assessment. The method consists of two phases. First, imagery from several historical scenes is transformed into the style of the current scene using a CycleGAN model equipped with a classifier that enforces class invariance; the transformed images are then fused to form an adapted source domain. Second, features of the adapted source and target domains are aligned via adversarial discriminative domain adaptation. MADA enhances the quality of the transformed images, fully exploits the relevant information in historical scenes, mitigates interference among historical images, and improves transfer efficiency from historical scenes to the current disaster scene. Two experiments, using Hurricane Sandy, Irma, and Maria datasets as the multi-source and target domains, validate MADA's effectiveness. Results show that its classification performance surpasses that of competing methods. (c) 2021 Society of Photo-Optical Instrumentation Engineers (SPIE)
Pages: 16
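To make the two-phase pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of the second phase only (ADDA-style adversarial feature alignment), assuming phase 1 (CycleGAN style transfer with a class-invariance classifier) has already produced an adapted source dataset. All module names, architectures, loaders, and hyperparameters below are illustrative assumptions, not the paper's implementation.

# Minimal sketch of MADA phase 2: adversarial alignment between the
# adapted source domain and the target domain (ADDA-style).
# Hypothetical networks and settings; not the authors' code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Small CNN feature extractor (illustrative architecture)."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Predicts whether a feature came from the adapted source or the target."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))
    def forward(self, f):
        return self.net(f)

def align(src_encoder, tgt_encoder, disc, src_loader, tgt_loader, epochs=10):
    """ADDA-style alignment: the source encoder stays frozen; the target
    encoder (typically initialized from the source encoder) is trained to
    fool the domain discriminator."""
    bce = nn.BCEWithLogitsLoss()
    opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
    opt_t = torch.optim.Adam(tgt_encoder.parameters(), lr=1e-4)
    for _ in range(epochs):
        for (xs, _), (xt, _) in zip(src_loader, tgt_loader):
            # 1) Discriminator step: adapted-source features -> 1, target -> 0.
            with torch.no_grad():
                fs = src_encoder(xs)
            ft = tgt_encoder(xt).detach()
            d_loss = bce(disc(fs), torch.ones(fs.size(0), 1)) + \
                     bce(disc(ft), torch.zeros(ft.size(0), 1))
            opt_d.zero_grad(); d_loss.backward(); opt_d.step()
            # 2) Target-encoder step: make target features look source-like.
            ft = tgt_encoder(xt)
            g_loss = bce(disc(ft), torch.ones(ft.size(0), 1))
            opt_t.zero_grad(); g_loss.backward(); opt_t.step()

After alignment, the classifier trained on the adapted source domain would be applied on top of the aligned target encoder to label post-disaster imagery, consistent with the transfer setup described in the abstract.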