Automatic horizon tracking method based on knowledge self-distillation

Cited by: 0
Authors
Yang, Mengqiong [1 ]
Xu, Huiqun [1 ]
Peng, Zhen [1 ]
Wang, Peng [1 ]
Affiliations
[1] Yangtze University, College of Geophysics and Petroleum Resources, Wuhan 430100, China
Keywords
deep learning; horizon tracking; UNet; teacher model; student model; knowledge self-distillation
DOI
10.1504/IJOGCT.2023.10056112
CLC Classification
TE [Petroleum and natural gas industry]; TK [Energy and power engineering]
Discipline Classification Codes
0807; 0820
Abstract
In recent years, deep learning has been applied successfully to seismic horizon tracking; however, most existing studies rely on increasingly complex network architectures to improve tracking accuracy, which significantly increases model complexity and training time. To improve tracking accuracy without increasing model complexity, this paper proposes a knowledge self-distillation method for horizon tracking. In the first step, a teacher model is trained on a UNet to obtain prior weights; in the second step, an untrained UNet serves as the student network, and the teacher model guides the student network through knowledge-distillation training to improve knowledge transfer. Tests on synthetic and field data show that horizon tracking with the knowledge-distillation method improves tracking accuracy without increasing model complexity, providing a new approach to automatic horizon tracking.
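The two-step scheme summarised in the abstract can be illustrated with a short sketch. The following PyTorch snippet is not the authors' code: TinyUNet is a toy stand-in for the paper's UNet, and distill_loss, TEMPERATURE, and ALPHA are assumed names and hyperparameters for a standard hard-label-plus-soft-label distillation objective.

```python
# Minimal sketch of two-step knowledge self-distillation (assumptions, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUNet(nn.Module):
    """Toy encoder-decoder standing in for the full UNet."""
    def __init__(self, in_ch=1, n_classes=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                                 nn.MaxPool2d(2))
        self.dec = nn.Sequential(nn.ConvTranspose2d(16, 16, 2, stride=2), nn.ReLU(),
                                 nn.Conv2d(16, n_classes, 1))
    def forward(self, x):
        return self.dec(self.enc(x))  # per-pixel horizon logits

TEMPERATURE, ALPHA = 4.0, 0.5  # assumed distillation hyperparameters

def distill_loss(student_logits, teacher_logits, labels):
    """Hard-label cross-entropy plus temperature-softened KL to the teacher."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(F.log_softmax(student_logits / TEMPERATURE, dim=1),
                    F.softmax(teacher_logits / TEMPERATURE, dim=1),
                    reduction="batchmean") * TEMPERATURE ** 2
    return (1 - ALPHA) * hard + ALPHA * soft

# Step 1: train the teacher UNet on (seismic patch, horizon mask) pairs
# with ordinary supervised training, then freeze it as the prior.
teacher = TinyUNet()
teacher.eval()

# Step 2: an untrained UNet of the same architecture becomes the student,
# trained against both the labels and the teacher's soft targets.
student = TinyUNet()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(2, 1, 64, 64)          # dummy seismic patches
y = torch.randint(0, 2, (2, 64, 64))   # dummy horizon masks
with torch.no_grad():
    t_logits = teacher(x)              # teacher guidance (soft targets)
loss = distill_loss(student(x), t_logits, y)
opt.zero_grad(); loss.backward(); opt.step()
```

Because teacher and student share one architecture, the scheme is self-distillation: the accuracy gain comes from the training signal, not from a larger model.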
Pages: 336-350 (16 pages)