Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion

Cited by: 0
Authors
Fan, Cunhang [1 ]
Chen, Yujie [1 ]
Xue, Jun [1 ]
Kong, Yonghui [1 ]
Tao, Jianhua [2 ,3 ]
Lv, Zhao [1 ]
Affiliations
[1] Anhui Univ, Anhui Prov Key Lab Multimodal Cognit Computat, Sch Comp Sci & Technol, Hefei, Peoples R China
[2] Tsinghua Univ, Dept Automat, Beijing, Peoples R China
[3] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, knowledge graph completion (KGC) models based on pre-trained language models (PLMs) have shown promising results. However, the large number of parameters and high computational cost of PLMs pose challenges for their application in downstream tasks. This paper proposes a progressive distillation method based on masked generation features for the KGC task, aiming to significantly reduce the complexity of pre-trained models. Specifically, we perform pre-distillation on the PLM to obtain a high-quality teacher model, and compress the PLM network to obtain multi-grade student models. However, traditional feature distillation suffers from the limitation that the teacher model provides only a single representation of the information. To address this problem, we propose masked generation of teacher-student features, which carry richer representational information. Furthermore, there is a significant gap in representation ability between teacher and student models. We therefore design a progressive distillation method that distills the student model at each grade level, enabling efficient knowledge transfer from teacher to students. The experimental results demonstrate that the model in the pre-distillation stage surpasses existing state-of-the-art methods, and that in the progressive distillation stage the model substantially reduces its parameter count while maintaining a certain level of performance. Specifically, the parameters of the lowest-grade student model are reduced by 56.7% compared with the baseline.
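The two ideas in the abstract, distilling on masked feature positions only and passing knowledge down a chain of progressively smaller students, can be illustrated with a toy numerical sketch. Everything below (the MSE-on-masked-positions objective, the gradient-free "training" update, the function names) is an illustrative assumption, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_feature_loss(teacher_feats, student_feats, mask_ratio=0.3, rng=rng):
    """Toy masked-generation feature loss: MSE computed only on a random
    subset of masked token positions, rather than on all features."""
    n_tokens = teacher_feats.shape[0]
    n_mask = max(1, int(mask_ratio * n_tokens))
    masked = rng.choice(n_tokens, size=n_mask, replace=False)
    diff = teacher_feats[masked] - student_feats[masked]
    return float(np.mean(diff ** 2))

def progressive_distill(grades, steps=200, lr=0.1):
    """Progressive distillation down a chain of grades: each student is
    pulled toward its immediate teacher's features, then serves as the
    teacher for the next (smaller) grade."""
    teacher = grades[0]
    losses = []
    for student in grades[1:]:
        for _ in range(steps):
            # stand-in for gradient training: move features toward teacher
            student += lr * (teacher - student)
        losses.append(masked_feature_loss(teacher, student))
        teacher = student  # this grade becomes the next grade's teacher
    return losses

dim, tokens = 16, 8
teacher = rng.normal(size=(tokens, dim))
students = [rng.normal(size=(tokens, dim)) for _ in range(2)]
losses = progressive_distill([teacher] + students)
print(losses)  # masked feature loss of each grade after its stage
```

The chained update means the representation gap bridged at each stage is teacher-to-grade-1, then grade-1-to-grade-2, which is the motivation the abstract gives for distilling grade by grade instead of jumping directly from the large teacher to the smallest student.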
Pages: 8380-8388
Number of pages: 9
Related Papers
50 records in total
  • [41] Medical Knowledge Graph Completion Based on Word Embeddings
    Gao, Mingxia
    Lu, Jianguo
    Chen, Furong
    INFORMATION, 2022, 13 (04)
  • [42] Domain Knowledge Graph Completion Based on Attribute Hierarchy
    Lan, Ning
    Yang, Shuqun
    PROCEEDINGS OF 2023 7TH INTERNATIONAL CONFERENCE ON ELECTRONIC INFORMATION TECHNOLOGY AND COMPUTER ENGINEERING, EITCE 2023, 2023, : 510 - 515
  • [43] The Impact of Negative Triple Generation Strategies and Anomalies on Knowledge Graph Completion
    Bansal, Iti
    Tiwari, Sudhanshu
    Rivero, Carlos R.
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 45 - 54
  • [44] Neural response generation for task completion using conversational knowledge graph
    Ahmad, Zishan
    Ekbal, Asif
    Sengupta, Shubhashis
    Bhattacharyya, Pushpak
    PLOS ONE, 2023, 18 (02):
  • [45] FedKG: A Knowledge Distillation-Based Federated Graph Method for Social Bot Detection
    Wang, Xiujuan
    Chen, Kangmiao
    Wang, Keke
    Wang, Zhengxiang
    Zheng, Kangfeng
    Zhang, Jiayue
    SENSORS, 2024, 24 (11)
  • [46] Contextualise Entities and Relations: An Interaction Method for Knowledge Graph Completion
    Chen, Kai
    Wang, Ye
    Li, Yitong
    Li, Aiping
    Zhao, Xiaojuan
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT III, 2021, 12893 : 179 - 191
  • [47] Research Progress of Knowledge Graph Completion Based on Knowledge Representation Learning
    Yu, Mengbo
    Du, Jianqiang
    Luo, Jigen
    Nie, Bin
    Liu, Yong
    Qiu, Junyang
    Computer Engineering and Applications, 2023, 59 (18) : 59 - 73
  • [48] CombinE: A Fusion Method Enhanced Model for Knowledge Graph Completion
    Cui, Ziyuan
    Wang, Jinxin
    Guo, Zhongwen
    Wang, Weigang
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 383 - 388
  • [49] MApp-KG: Mobile App Knowledge Graph for Document-Based Feature Knowledge Generation
    Motger, Quim
    Franch, Xavier
    Marco, Jordi
    INTELLIGENT INFORMATION SYSTEMS, CAISE FORUM 2024, 2024, 520 : 129 - 137
  • [50] A method of knowledge distillation based on feature fusion and attention mechanism for complex traffic scenes
    Li, Cui-jin
    Qu, Zhong
    Wang, Sheng-ye
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 124