Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion

Cited by: 0
Authors
Fan, Cunhang [1 ]
Chen, Yujie [1 ]
Xue, Jun [1 ]
Kong, Yonghui [1 ]
Tao, Jianhua [2 ,3 ]
Lv, Zhao [1 ]
Affiliations
[1] Anhui Univ, Anhui Prov Key Lab Multimodal Cognit Computat, Sch Comp Sci & Technol, Hefei, Peoples R China
[2] Tsinghua Univ, Dept Automat, Beijing, Peoples R China
[3] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In recent years, knowledge graph completion (KGC) models based on pre-trained language models (PLMs) have shown promising results. However, the large number of parameters and high computational cost of PLMs pose challenges for their application to downstream tasks. This paper proposes a progressive distillation method based on masked generation features for the KGC task, aiming to significantly reduce the complexity of the pre-trained model. Specifically, we perform pre-distillation on the PLM to obtain a high-quality teacher model, and compress the PLM network to obtain multi-grade student models. Traditional feature distillation, however, is limited by the single form of representation information available in the teacher model. To address this problem, we propose masked generation of teacher-student features, which carries richer representation information. Moreover, there is a significant gap in representation ability between teacher and student, so we design a progressive distillation procedure that distills the student models grade by grade, enabling efficient knowledge transfer from teacher to students. Experimental results show that the model in the pre-distillation stage surpasses existing state-of-the-art methods, and that in the progressive distillation stage the model substantially reduces the number of parameters while maintaining a certain level of performance. Specifically, the lower-grade student model has 56.7% fewer parameters than the baseline.
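The abstract describes the masked generation mechanism only at a high level, so the following is a minimal, hypothetical PyTorch sketch of what a masked feature-generation distillation loss could look like: a fraction of the student's token features is masked, a small generator reconstructs them from the surviving context, and the reconstruction is aligned with the frozen teacher's features. The names FeatureGenerator, distill_step, and mask_ratio are illustrative assumptions, not the authors' released code, and the single self-attention block stands in for whatever generation module the paper actually uses.

```python
# Hypothetical sketch of masked feature-generation distillation (not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureGenerator(nn.Module):
    """Reconstructs masked student features from the surviving (unmasked) context."""

    def __init__(self, dim: int, nhead: int = 8):
        super().__init__()
        # A single self-attention block lets masked positions draw on unmasked ones.
        self.block = nn.TransformerEncoderLayer(d_model=dim, nhead=nhead, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


def distill_step(teacher_feats: torch.Tensor,
                 student_feats: torch.Tensor,
                 generator: FeatureGenerator,
                 mask_ratio: float = 0.5) -> torch.Tensor:
    """One masked-generation distillation loss.

    teacher_feats, student_feats: (batch, seq_len, dim) hidden states from the
    teacher and student PLMs; assumed to share the same feature dimension.
    """
    # Randomly mask a fraction of the student's token positions.
    mask = (torch.rand(student_feats.shape[:2], device=student_feats.device)
            < mask_ratio).unsqueeze(-1)                     # (batch, seq_len, 1)
    masked_student = student_feats.masked_fill(mask, 0.0)
    # Regenerate the masked positions from the remaining context.
    generated = generator(masked_student)
    # Align the regenerated features with the frozen teacher at masked positions only.
    return F.mse_loss(generated * mask, teacher_feats.detach() * mask)


if __name__ == "__main__":
    # Toy usage: random tensors stand in for PLM hidden states.
    teacher = torch.randn(2, 16, 768)
    student = torch.randn(2, 16, 768)
    gen = FeatureGenerator(dim=768)
    loss = distill_step(teacher, student, gen)
    loss.backward()
    print(loss.item())
```

Under one plausible reading of the progressive scheme, a loss of this kind would be applied grade by grade, with each distilled student serving as the teacher for the next, smaller student, so that the representation gap bridged at every step stays small.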
Pages: 8380-8388
Number of pages: 9
Related Papers
50 records in total
  • [21] Knowledge Graph Completion via Multi-feature Learning
    Zhang, Hanwen
    Yao, Juanjuan
    Zhu, Yi'an
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT IV, ICIC 2024, 2024, 14878 : 269 - 280
  • [22] NFFKD: A Knowledge Distillation Method Based on Normalized Feature Fusion Model
    Wang, Zihan
    Xie, Junwei
    Yao, Zhiping
    Kuang, Xu
    Gao, Qinquan
    Tong, Tong
    2022 IEEE THE 5TH INTERNATIONAL CONFERENCE ON BIG DATA AND ARTIFICIAL INTELLIGENCE (BDAI 2022), 2022, : 111 - 116
  • [23] Knowledge graph completion based on graph contrastive attention network
    Liu D.
    Fang Q.
    Zhang X.
    Hu J.
    Qian S.
    Xu C.
    Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2022, 48 (08): : 1428 - 1435
  • [24] Knowledge graph completion method based on quantum embedding and quaternion interaction enhancement
    Li, Linyu
    Zhang, Xuan
    Jin, Zhi
    Gao, Chen
    Zhu, Rui
    Liang, Yuqin
    Ma, Yubing
    INFORMATION SCIENCES, 2023, 648
  • [25] A Cybersecurity Knowledge Graph Completion Method Based on Ensemble Learning and Adversarial Training
    Wang, Peng
    Liu, Jingju
    Hou, Dongdong
    Zhou, Shicheng
    APPLIED SCIENCES-BASEL, 2022, 12 (24):
  • [26] Knowledge graph completion method based on hyperbolic representation learning and contrastive learning
    Zhang, Xiaodong
    Wang, Meng
    Zhong, Xiuwen
    An, Feixu
    EGYPTIAN INFORMATICS JOURNAL, 2023, 24 (04)
  • [27] A Knowledge Graph Completion Method for Telecom Metadata Based on the Spherical Coordinate System
    Zhang, Kaicheng
    Wang, Han
    Yang, Mingchuan
    Li, Xinchi
    Xia, Xiaoqing
    Guo, Zhixia
    IEEE ACCESS, 2022, 10 : 122670 - 122678
  • [28] A knowledge graph completion model based on contrastive learning and relation enhancement method
    Li, LinYu
    Zhang, Xuan
    Ma, YuBin
    Gao, Chen
    Wang, Jishu
    Yu, Yong
    Yuan, Zihao
    Ma, Qiuying
    KNOWLEDGE-BASED SYSTEMS, 2022, 256
  • [29] Follow Your Path: A Progressive Method for Knowledge Distillation
    Shi, Wenxian
    Song, Yuxuan
    Zhou, Hao
    Li, Bohan
    Li, Lei
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2021: RESEARCH TRACK, PT III, 2021, 12977 : 596 - 611
  • [30] From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer
    Xie, Xin
    Zhang, Ningyu
    Li, Zhoubo
    Deng, Shumin
    Chen, Hui
    Xiong, Feiyu
    Chen, Mosha
    Chen, Huajun
    COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 162 - 165