Automatic horizon tracking method based on knowledge self-distillation

Cited by: 0
Authors
Yang, Mengqiong [1 ]
Xu, Huiqun [1 ]
Peng, Zhen [1 ]
Wang, Peng [1 ]
Affiliations
[1] Yangtze Univ, Coll Geophys & Petr Resources, Wuhan 430100, Peoples R China
Keywords
deep learning; horizon tracking; UNet; teacher model; student models; knowledge self-distillation;
DOI
10.1504/IJOGCT.2023.10056112
Chinese Library Classification
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering];
Subject Classification Code
0807 ; 0820 ;
Abstract
In recent years, deep learning has produced many successful applications in seismic horizon tracking; however, most existing studies rely on increasingly complex network structures to improve tracking accuracy, which significantly increases model complexity and training time. To improve tracking accuracy without increasing model complexity, this paper proposes a knowledge self-distillation method for horizon tracking. In the first step, a teacher model is trained on a UNet to obtain prior weights; in the second step, an untrained UNet of the same architecture serves as the student network, and the teacher model guides the student through knowledge-distillation training to improve knowledge transfer. Tests on synthetic and field data show that the knowledge-distillation approach improves horizon tracking accuracy without increasing model complexity, providing a new method for automatic horizon tracking.
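The abstract describes a two-step procedure: first train a UNet teacher to obtain prior weights, then let the frozen teacher guide an architecturally identical, untrained UNet student. Below is a minimal sketch of that second step, assuming a PyTorch UNet for per-pixel horizon classification and a standard soft-label distillation loss (temperature-scaled KL divergence combined with hard-label cross-entropy); the `temperature`, `alpha`, learning rate, and `UNet` definition are illustrative assumptions, not values taken from the paper.

```python
# Sketch of the second (distillation) step, assuming a PyTorch UNet whose
# teacher weights were obtained in step 1 by ordinary supervised training.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with softened teacher guidance.
    temperature and alpha are illustrative, not the paper's settings."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * hard + (1.0 - alpha) * soft

def train_student(student, teacher, loader, epochs=50, lr=1e-3):
    """Frozen teacher guides an untrained UNet of the same architecture."""
    teacher.eval()  # teacher weights come from step 1 (plain UNet training)
    optim = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for seismic, horizon_labels in loader:  # seismic image, per-pixel horizon classes
            with torch.no_grad():
                t_logits = teacher(seismic)
            s_logits = student(seismic)
            loss = distillation_loss(s_logits, t_logits, horizon_labels)
            optim.zero_grad()
            loss.backward()
            optim.step()
    return student
```

Because teacher and student share the same UNet architecture, the student adds no parameters or inference cost over the baseline; only training carries the extra teacher forward pass.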
Pages: 336-350
Page count: 16