Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion

Cited: 0
Authors
Fan, Cunhang [1 ]
Chen, Yujie [1 ]
Xue, Jun [1 ]
Kong, Yonghui [1 ]
Tao, Jianhua [2 ,3 ]
Lv, Zhao [1 ]
Affiliations
[1] Anhui Univ, Anhui Prov Key Lab Multimodal Cognit Computat, Sch Comp Sci & Technol, Hefei, Peoples R China
[2] Tsinghua Univ, Dept Automat, Beijing, Peoples R China
[3] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
None
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, knowledge graph completion (KGC) models based on pre-trained language models (PLMs) have shown promising results. However, the large number of parameters and high computational cost of PLMs pose challenges for their application in downstream tasks. This paper proposes a progressive distillation method based on masked generation features for the KGC task, aiming to significantly reduce the complexity of pre-trained models. Specifically, we perform pre-distillation on the PLM to obtain a high-quality teacher model, and compress the PLM network to obtain multi-grade student models. However, traditional feature distillation suffers from the limitation that the teacher model provides only a single representation of the information. To address this problem, we propose masked generation of teacher-student features, which carries richer representation information. Furthermore, there is a significant gap in representation ability between teacher and student. We therefore design a progressive distillation method that distills the student model at each grade level, enabling efficient knowledge transfer from teacher to students. The experimental results demonstrate that the model in the pre-distillation stage surpasses existing state-of-the-art methods. In the progressive distillation stage, the model significantly reduces the number of parameters while maintaining a certain level of performance; in particular, the parameter count of the lowest-grade student model is reduced by 56.7% compared to the baseline.
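The distillation pipeline described in the abstract — masking feature positions, matching student features to teacher features at the masked positions, and chaining the distillation across grades so each student teaches the next — can be sketched as follows. This is a minimal NumPy illustration of the general idea, not the authors' implementation; the function names, the MSE objective, and the assumption that teacher and student share a feature dimension are all simplifying choices made here.

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_feature_loss(teacher_feats, student_feats, mask_ratio=0.3):
    """MSE between teacher and student features at randomly masked positions.

    Both inputs are (seq_len, dim) arrays; a projection layer would be
    needed in practice if the student's hidden size differs.
    """
    seq_len = teacher_feats.shape[0]
    n_mask = max(1, int(seq_len * mask_ratio))
    idx = rng.choice(seq_len, size=n_mask, replace=False)
    diff = teacher_feats[idx] - student_feats[idx]
    return float(np.mean(diff ** 2))

def progressive_distill(feats_per_grade):
    """Chain distillation across grades: each grade's student model
    serves as the teacher for the next, smaller grade."""
    losses = []
    teacher = feats_per_grade[0]  # pre-distilled teacher features
    for student in feats_per_grade[1:]:
        losses.append(masked_feature_loss(teacher, student))
        teacher = student  # this grade's student teaches the next grade
    return losses
```

In a real training loop these losses would be backpropagated into each student in turn; here the functions only show how the masked-position matching and the grade-by-grade hand-off compose.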
Pages: 8380-8388
Page count: 9
Related Papers
50 records in total
  • [31] A Cybersecurity Knowledge Graph Completion Method for Penetration Testing
    Wang, Peng
    Liu, Jingju
    Zhong, Xiaofeng
    Zhou, Shicheng
    ELECTRONICS, 2023, 12 (08)
  • [32] A Cybersecurity Knowledge Graph Completion Method for Scalable Scenarios
    Wang, Peng
    Liu, Jingju
    Yao, Qian
    Xiong, Xinli
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT IV, KSEM 2023, 2023, 14120 : 83 - 98
  • [33] Label Semantic Knowledge Distillation for Unbiased Scene Graph Generation
    Li, Lin
    Xiao, Jun
    Shi, Hanrong
    Wang, Wenxiao
    Shao, Jian
    Liu, An-An
    Yang, Yi
    Chen, Long
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (01) : 195 - 206
  • [34] Attention and feature transfer based knowledge distillation
    Yang, Guoliang
    Yu, Shuaiying
    Sheng, Yangyang
    Yang, Hao
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [35] LLM-based Multi-Level Knowledge Generation for Few-shot Knowledge Graph Completion
    Li, Qian
    Chen, Zhuo
    Ji, Cheng
    Jiang, Shiqi
    Li, Jianxin
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 2135 - 2143
  • [36] An Overview of Research on Knowledge Graph Completion Based on Graph Neural Network
    Yue W.
    Haichun S.
    Data Analysis and Knowledge Discovery, 2024, 8 (03) : 10 - 28
  • [37] A Semantic Filter Based on Relations for Knowledge Graph Completion
    Liang, Zongwei
    Yang, Junan
    Liu, Hui
    Huang, Keju
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 7920 - 7929
  • [38] Embedding based Link Prediction for Knowledge Graph Completion
    Biswas, Russa
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 3221 - 3224
  • [39] Knowledge Graph Completion With Pattern-Based Methods
    Sabet, Maryam
    Pajoohan, Mohammadreza
    Moosavi, Mohammad Reza
    IEEE ACCESS, 2025, 13 : 5815 - 5831
  • [40] A Method to Construct a Masked Knowledge Graph Model using Transformer for Knowledge Graph Reasoning
    Kaneda, Ryoya
    Okada, Makoto
    Mori, Naoki
    2023 IEEE 17TH INTERNATIONAL CONFERENCE ON SEMANTIC COMPUTING, ICSC, 2023, : 298 - 299