Automatic horizon tracking method based on knowledge self-distillation

Cited by: 0
Authors
Yang, Mengqiong [1 ]
Xu, Huiqun [1 ]
Peng, Zhen [1 ]
Wang, Peng [1 ]
Affiliations
[1] Yangtze Univ, Coll Geophys & Petr Resources, Wuhan 430100, Peoples R China
Keywords
deep learning; horizon tracking; UNet; teacher model; student models; knowledge self-distillation
DOI
10.1504/IJOGCT.2023.10056112
CLC classification
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Subject classification
0807; 0820
Abstract
In recent years, deep learning has achieved considerable success in seismic horizon tracking; however, most existing studies improve tracking accuracy by adopting increasingly complex network structures, which significantly increases model complexity and training time. To improve tracking accuracy without increasing model complexity, this paper proposes a knowledge self-distillation method for horizon tracking. In the first step, a teacher model is trained from a UNet to obtain prior weights; in the second step, an untrained UNet of the same architecture serves as the student network, and the teacher model guides it through knowledge distillation training to improve knowledge transfer. Tests on synthetic and field data show that the knowledge distillation approach improves horizon tracking accuracy without increasing model complexity, providing a new method for automatic horizon tracking.
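A minimal sketch of the two-step procedure outlined in the abstract, assuming PyTorch; TinyUNet is a toy stand-in for the paper's UNet, and the loss weighting (alpha) and temperature (T) follow a common knowledge-distillation formulation rather than values reported by the authors:

```python
# Sketch of knowledge self-distillation for horizon tracking (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyUNet(nn.Module):
    """Placeholder segmentation network standing in for the paper's UNet."""
    def __init__(self, in_ch=1, n_classes=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, n_classes, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)  # per-pixel class logits


def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Hard-label cross-entropy plus temperature-softened KL to the teacher."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1.0 - alpha) * soft


# Toy data standing in for seismic patches and per-pixel horizon masks.
seismic = torch.randn(4, 1, 64, 64)
horizon_labels = torch.randint(0, 2, (4, 64, 64))

# Step 1: train the teacher UNet on labelled data (one toy step shown).
teacher = TinyUNet()
opt_t = torch.optim.Adam(teacher.parameters(), lr=1e-3)
opt_t.zero_grad()
F.cross_entropy(teacher(seismic), horizon_labels).backward()
opt_t.step()

# Step 2: an untrained UNet of the same architecture is the student;
# the frozen teacher guides it through the distillation loss.
student = TinyUNet()
opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
teacher.eval()
with torch.no_grad():
    teacher_logits = teacher(seismic)
opt_s.zero_grad()
distillation_loss(student(seismic), teacher_logits, horizon_labels).backward()
opt_s.step()
```

Because teacher and student share the same architecture in the self-distillation setting, the student's capacity, and hence its inference cost, is unchanged; only the training procedure differs.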
Pages: 336-350
Number of pages: 16
Related papers (50 in total)
  • [31] Simultaneous Similarity-based Self-Distillation for Deep Metric Learning
    Roth, Karsten
    Milbich, Timo
    Ommer, Bjorn
    Cohen, Joseph Paul
    Ghassemi, Marzyeh
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [33] Self-Distillation for Improving CTC-Transformer-based ASR Systems
    Moriya, Takafumi
    Ochiai, Tsubasa
    Karita, Shigeki
    Sato, Hiroshi
    Tanaka, Tomohiro
    Ashihara, Takanori
    Masumura, Ryo
    Shinohara, Yusuke
    Delcroix, Marc
    INTERSPEECH 2020, 2020: 546-550
  • [34] Contrastive knowledge-augmented self-distillation approach for few-shot learning
    Zhang, Lixu
    Shao, Mingwen
    Chen, Sijie
    Liu, Fukang
    JOURNAL OF ELECTRONIC IMAGING, 2023, 32 (05)
  • [35] An improved ShuffleNetV2 method based on ensemble self-distillation for tomato leaf diseases recognition
    Ni, Shuiping
    Jia, Yue
    Zhu, Mingfu
    Zhang, Yizhe
    Wang, Wendi
    Liu, Shangxin
    Chen, Yawei
    FRONTIERS IN PLANT SCIENCE, 2025, 15
  • [36] A non-negative feedback self-distillation method for salient object detection
    Chen, Lei
    Cao, Tieyong
    Zheng, Yunfei
    Yang, Jibin
    Wang, Yang
    Wang, Yekui
    Zhang, Bo
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [37] Deep Contrastive Representation Learning With Self-Distillation
    Xiao, Zhiwen
    Xing, Huanlai
    Zhao, Bowen
    Qu, Rong
    Luo, Shouxi
    Dai, Penglin
    Li, Ke
    Zhu, Zonghai
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (01): 3-15
  • [38] Self-Distillation Amplifies Regularization in Hilbert Space
    Mobahi, Hossein
    Farajtabar, Mehrdad
    Bartlett, Peter L.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [39] Self-distillation and self-supervision for partial label learning
    Yu, Xiaotong
    Sun, Shiding
    Tian, Yingjie
    PATTERN RECOGNITION, 2024, 146
  • [40] A Lightweight Pig Face Recognition Method Based on Automatic Detection and Knowledge Distillation
    Ma, Ruihan
    Ali, Hassan
    Chung, Seyeon
    Kim, Sang Cheol
    Kim, Hyongsuk
    APPLIED SCIENCES-BASEL, 2024, 14 (01)