Improving Performance of Seismic Fault Detection by Fine-Tuning the Convolutional Neural Network Pre-Trained with Synthetic Samples

Cited: 15
Authors
Yan, Zhe [1 ]
Zhang, Zheng [1 ]
Liu, Shaoyong [1 ]
Affiliation
[1] China University of Geosciences, Institute of Geophysics & Geomatics, Wuhan 430074, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
fault detection; deep learning; U-net; transfer learning; reservoir; sparse
DOI
10.3390/en14123650
Chinese Library Classification
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Discipline Classification Code
0807; 0820
Abstract
Fault interpretation is an important part of seismic structural interpretation and reservoir characterization. In the conventional approach, faults are detected as discontinuities or abrupt changes in reflections and are manually tracked in post-stack seismic data, which is time-consuming. To improve efficiency, a variety of automatic fault detection methods have been proposed, among which deep learning-based methods have received widespread attention. However, deep learning techniques require a large number of labeled seismic samples for training. Although synthetic seismic data can be generated in sufficient quantity with accurate labels, a gap between synthetic and real data remains. To overcome this drawback, we apply a transfer learning strategy to improve the performance of deep learning-based automatic fault detection. We first pre-train a deep neural network with synthetic seismic data and then retrain the network with real seismic samples, using a random sample consensus (RANSAC) method to obtain the real samples and generate their labels automatically. Three real 3D examples demonstrate that the fault detection accuracy of the pre-trained network can be greatly improved by retraining with a small number of real seismic samples.
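The workflow in the abstract reduces to two training stages applied to the same network. The sketch below illustrates this in PyTorch under stated assumptions: TinyUNet and the random-tensor data loaders are hypothetical stand-ins for the authors' U-net and their synthetic/real seismic patch datasets, not the paper's actual code.

```python
# Minimal sketch of the two-stage transfer-learning workflow: pre-train on
# synthetic seismic patches, then fine-tune on a few real patches.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Toy encoder-decoder standing in for the paper's U-net (illustration only)."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=2),
            nn.Conv2d(16, 1, 3, padding=1))  # logits for per-pixel fault probability

    def forward(self, x):
        return self.dec(self.enc(x))

def train(model, loader, epochs, lr):
    """One training stage: binary fault / non-fault segmentation."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:          # x: seismic patch, y: binary fault mask
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()

def dummy_loader(n_batches):
    """Random patches standing in for labeled seismic training data."""
    return [(torch.randn(4, 1, 64, 64),
             torch.randint(0, 2, (4, 1, 64, 64)).float())
            for _ in range(n_batches)]

model = TinyUNet()

# Stage 1: pre-train on abundant synthetic samples with accurate labels.
train(model, dummy_loader(50), epochs=5, lr=1e-3)

# Stage 2: retrain on a small number of real samples (labels produced
# automatically, e.g. by the paper's RANSAC procedure) at a lower learning
# rate so the pre-trained weights are only gently adapted to real data.
train(model, dummy_loader(5), epochs=5, lr=1e-4)
```

The lower learning rate in the second stage is a common fine-tuning choice: it lets the network adapt to the statistics of real data without destroying the features learned from the synthetic pre-training set.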
Pages: 13