Automatic horizon tracking method based on knowledge self-distillation

Times Cited: 0
Authors
Yang, Mengqiong [1]
Xu, Huiqun [1]
Peng, Zhen [1]
Wang, Peng [1]
Affiliations
[1] Yangtze Univ, Coll Geophys & Petr Resources, Wuhan 430100, Peoples R China
Keywords
deep learning; horizon tracking; UNet; teacher model; student models; knowledge self-distillation
DOI
10.1504/IJOGCT.2023.10056112
Chinese Library Classification (CLC)
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Discipline Codes
0807; 0820
Abstract
In recent years, deep learning has produced many successful applications in seismic horizon tracking; however, most existing studies improve tracking accuracy by adopting ever more complex network structures, which significantly increases model complexity and training time. To improve tracking accuracy without increasing model complexity, this paper proposes a knowledge self-distillation method for horizon tracking. In the first step, a teacher model is trained from a UNet to obtain prior weights; in the second step, an untrained UNet serves as the student network, and the teacher model guides its knowledge-distillation training to improve knowledge transfer. Tests on synthetic and field data show that the knowledge-distillation approach improves horizon tracking accuracy without increasing model complexity, offering a new method for automatic horizon tracking.
Pages: 336-350
Page count: 16
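
To make the two-step procedure described in the abstract concrete, the following is a minimal PyTorch sketch of knowledge self-distillation for horizon segmentation: a teacher UNet is trained first, then a fresh student UNet of the same architecture is trained against both the hard horizon labels and the teacher's soft predictions. The TinyUNet stand-in, the loss weight alpha, and the temperature T are illustrative assumptions; this record does not specify the paper's actual architecture or hyperparameters.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUNet(nn.Module):
    # Toy stand-in for the paper's UNet: same per-pixel logit interface,
    # far fewer layers. Replace with a real UNet in practice.
    def __init__(self, in_ch=1, n_classes=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.dec = nn.Conv2d(16, n_classes, 1)

    def forward(self, x):
        return self.dec(self.enc(x))  # (N, n_classes, H, W) logits

def distill_loss(student, teacher, x, y, alpha=0.5, T=2.0):
    # Combined loss: cross-entropy on the hard horizon labels plus KL
    # divergence to the frozen teacher's temperature-softened predictions
    # (assumed form; alpha and T are not given in this record).
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    ce = F.cross_entropy(s_logits, y)
    kl = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction="batchmean") * T * T
    return alpha * ce + (1.0 - alpha) * kl

# Step 1: the teacher UNet is trained on (seismic patch, horizon mask) pairs
# in the usual supervised way (omitted here); its weights serve as the prior.
teacher = TinyUNet()
teacher.eval()  # frozen during distillation

# Step 2: an untrained UNet of the same architecture becomes the student.
student = TinyUNet()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(4, 1, 64, 64)         # dummy seismic amplitude patches
y = torch.randint(0, 2, (4, 64, 64))  # dummy per-pixel horizon labels
loss = distill_loss(student, teacher, x, y)
opt.zero_grad()
loss.backward()
opt.step()

Because the teacher and student share the same UNet architecture, inference cost is unchanged; only training gains the extra teacher forward pass.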