Enhancing GPR Multisource Reverse Time Migration With a Feature Pyramid Attention Network

Cited by: 0
Authors
Wang, Xiangyu [1 ]
Chen, Junhong [1 ]
Yuan, Guiquan [1 ]
He, Qin [1 ,2 ]
Liu, Hai [1 ]
Affiliations
[1] Guangzhou Univ, Sch Civil Engn & Transportat, Guangzhou 510006, Guangdong, Peoples R China
[2] Guangdong Prov Acad Bldg Res Grp Co Ltd, Guangzhou, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Imaging; Encoding; Computational efficiency; Crosstalk; Accuracy; Radar imaging; Deep learning; Crosstalk artifact suppression; feature pyramid attention network (FPANet); ground-penetrating radar (GPR); multisource reverse time migration (RTM);
DOI
10.1109/TGRS.2024.3426606
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry];
Subject Classification Codes
0708 ; 070902 ;
Abstract
The reverse time migration (RTM) algorithm is widely recognized in ground-penetrating radar (GPR) imaging for its high-resolution capability. However, the algorithm requires multiple forward-modeling runs, making it computationally intensive and inefficient. This article presents a workflow designed to improve computational efficiency while preserving the accuracy of RTM imaging. This is achieved by a source-encoding strategy that applies random polarity reversals and random time shifts to combine individual sources into a supergather, which is then treated as a single independent excitation source. The encoding suppresses crosstalk artifacts among the integrated excitation sources within the supergather during wave propagation, which would otherwise degrade imaging accuracy. A feature pyramid attention network (FPANet) is then applied to further suppress the residual multisource crosstalk artifacts, thereby enhancing the overall RTM imaging quality. Evaluations on synthetic GPR data demonstrate that the algorithm improves computational efficiency without sacrificing imaging accuracy, confirming its effectiveness. Results on both laboratory and field GPR data further demonstrate its broad applicability. In summary, the proposed workflow achieves a 2x-5x speedup without compromising imaging quality.
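The source-encoding step described in the abstract can be illustrated with a minimal sketch. The function below (a hypothetical helper, not the authors' code) blends individual shot gathers into one supergather by assigning each source a random +/-1 polarity and a random time shift, the two randomizations the abstract names; the array shapes and the uniform-integer shift model are assumptions for illustration.

```python
import numpy as np

def encode_supergather(shot_gathers, max_shift, rng=None):
    """Blend shot gathers into one supergather via random-polarity,
    random-time-shift source encoding.

    shot_gathers : ndarray, shape (n_sources, n_traces, n_samples)
    max_shift    : maximum random time shift, in samples
    """
    rng = np.random.default_rng(rng)
    n_src, n_trc, n_smp = shot_gathers.shape
    # Random +/-1 polarity per source decorrelates cross-terms between sources.
    polarity = rng.choice([-1.0, 1.0], size=n_src)
    # Random delay per source further randomizes the phase of crosstalk energy.
    shifts = rng.integers(0, max_shift + 1, size=n_src)
    # Pad the time axis so the largest shift still fits.
    supergather = np.zeros((n_trc, n_smp + max_shift))
    for i in range(n_src):
        supergather[:, shifts[i]:shifts[i] + n_smp] += polarity[i] * shot_gathers[i]
    return supergather, polarity, shifts
```

Migrating one such supergather replaces n_sources separate RTM runs with a single run, which is the source of the reported speedup; the residual crosstalk left by this blending is what the FPANet stage is then trained to suppress.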
Pages: 1-1
Page count: 12