Multi-Step Feature Fusion for Natural Disaster Damage Assessment on Satellite Images

Cited by: 1
Authors
Zarski, Mateusz [1]
Miszczak, Jaroslaw A. [1]
Affiliations
[1] Polish Academy of Sciences, Institute of Theoretical and Applied Informatics, PL-44100 Gliwice, Poland
Source
IEEE ACCESS, 2024, Vol. 12
Keywords
Disasters; Fuses; Buildings; Satellites; Satellite images; Transformers; Feature extraction; Computer vision; Machine learning; Remote sensing; Damage state assessment; Time series; Prediction; CNN
DOI
10.1109/ACCESS.2024.3459424
Chinese Library Classification
TP [automation technology, computer technology]
Subject Classification Code
0812
Abstract
Quick and accurate assessment of the damage state of buildings after natural disasters is crucial for properly targeted rescue and subsequent recovery operations, which can have a major impact on the safety of victims and the cost of disaster recovery. The quality of this process can be significantly improved by harnessing machine learning methods in computer vision. This paper presents a novel damage assessment method that uses an original multi-step feature fusion network to classify the damage state of buildings from pre- and post-disaster large-scale satellite images. We introduce a novel convolutional neural network (CNN) module that fuses features between the pre- and post-disaster images at multiple network levels, in both the horizontal and vertical directions of the CNN. We also propose an additional network element, the Fuse Module, which adapts any CNN model to the task of image-pair classification. We use open, large-scale datasets (IDA-BD and xView2) to verify that the proposed method improves on existing state-of-the-art architectures, and report an accuracy increase of over 3 percentage points for the Vision Transformer model.
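To make the fusion idea concrete, the following is a minimal PyTorch sketch of a two-branch network that fuses pre- and post-disaster features at several depths. The abstract names the Fuse Module but not its internals, so the concatenation-plus-1x1-convolution design, the toy three-stage backbone, and the names PairFusionNet, channels, and num_classes are illustrative assumptions rather than the authors' published implementation; the four-class output follows the xView2 damage scale (no damage, minor, major, destroyed).

import torch
import torch.nn as nn


class FuseModule(nn.Module):
    # Fuses pre- and post-disaster feature maps at one network level.
    # Hypothetical design: channel concatenation followed by a 1x1 convolution.
    def __init__(self, channels):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, pre, post):
        return self.fuse(torch.cat([pre, post], dim=1))


class PairFusionNet(nn.Module):
    # Two shared-weight CNN branches whose features are fused at several
    # depths ("multi-step"); per-level descriptors feed a damage classifier.
    def __init__(self, channels=(16, 32, 64), num_classes=4):
        super().__init__()
        stages, fuses, in_ch = [], [], 3
        for out_ch in channels:
            stages.append(nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=2, padding=1),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            ))
            fuses.append(FuseModule(out_ch))
            in_ch = out_ch
        self.stages = nn.ModuleList(stages)
        self.fuses = nn.ModuleList(fuses)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(sum(channels), num_classes)

    def forward(self, pre, post):
        descriptors = []
        for stage, fuse in zip(self.stages, self.fuses):
            pre, post = stage(pre), stage(post)   # shared-weight branches
            fused = fuse(pre, post)               # fuse the pair at this depth
            descriptors.append(self.pool(fused).flatten(1))
        return self.head(torch.cat(descriptors, dim=1))


# Usage: classify one pre/post satellite image pair (batch of 1, 3 channels).
model = PairFusionNet()
pre = torch.randn(1, 3, 128, 128)
post = torch.randn(1, 3, 128, 128)
logits = model(pre, post)  # shape: (1, 4)

Here each level's fused descriptor is globally pooled and concatenated before classification, which is one simple way to realize fusion at multiple network levels; the paper's actual horizontal/vertical fusion scheme may differ.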
Pages: 140072-140081
Page count: 10