Adversarial Multi-Teacher Distillation for Semi-Supervised Relation Extraction

Cited by: 0
Authors
Li, Wanli [1 ]
Qian, Tieyun [1 ]
Li, Xuhui [2 ]
Zou, Lixin [3 ]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan 430072, Peoples R China
[2] Wuhan Univ, Sch Informat Management, Wuhan 430072, Peoples R China
[3] Wuhan Univ, Sch Cyber Sci & Engn, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Data models; Training; Task analysis; Predictive models; Adaptation models; Training data; Data mining; Adversarial training (AT); knowledge distillation (KD); relation extraction (RE); semi-supervised learning;
DOI
10.1109/TNNLS.2023.3258967
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
The shortage of labeled data has been a long-standing challenge for relation extraction (RE) tasks. Semi-supervised RE (SSRE) is a promising approach that annotates unlabeled samples with pseudolabels and uses them as additional training data. However, some pseudolabels on unlabeled data may be erroneous and introduce misleading knowledge into SSRE models. For this reason, we propose a novel adversarial multi-teacher distillation (AMTD) framework, which combines multi-teacher knowledge distillation with adversarial training (AT), to capture the knowledge in unlabeled data in a refined way. Specifically, we first develop a general knowledge distillation (KD) technique to learn not only from pseudolabels but also from the class distributions predicted by the different models in existing SSRE methods. To improve the robustness of the model, we further empower the distillation process with a language model-based AT technique. Extensive experimental results on two public datasets demonstrate that our framework significantly improves the performance of the base SSRE methods.
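The multi-teacher distillation objective described in the abstract can be sketched in plain Python: each teacher produces a temperature-softened class distribution, the teacher distributions are combined, and the student is trained to match the combined distribution via a KL-divergence loss scaled by T². The function names, the uniform averaging of teachers, and the omission of the adversarial-training component are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer class distribution,
    # exposing the "dark knowledge" in the non-argmax classes.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def multi_teacher_kd_loss(student_logits, teacher_logits_list, temperature=2.0):
    """Distillation loss against an ensemble of teachers.

    Averages the teachers' softened class distributions, then computes
    KL(teacher_avg || student), scaled by T^2 as is conventional in KD.
    """
    p_student = softmax(student_logits, temperature)
    teacher_dists = [softmax(t, temperature) for t in teacher_logits_list]
    k = len(teacher_dists)
    # Uniform average of the teachers' distributions (an assumption here;
    # weighted combinations are equally plausible).
    p_teacher = [sum(d[c] for d in teacher_dists) / k
                 for c in range(len(student_logits))]
    # KL divergence from the ensemble teacher to the student, scaled by T^2.
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student) if pt > 0)
    return temperature ** 2 * kl
```

A student whose logits already match a single teacher's incurs zero loss, while disagreement with the ensemble yields a positive penalty; in training this term would be added to the supervised loss on pseudolabeled data.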
Pages: 11291-11301 (11 pages)
Related Papers (50 total; first 10 shown)
  • [1] Fei, Hongxiao; Tan, Yangying; Huang, Wenti; Long, Jun; Huang, Jincai; Yang, Liu. "A Multi-teacher Knowledge Distillation Framework for Distantly Supervised Relation Extraction with Flexible Temperature." WEB AND BIG DATA, PT II, APWEB-WAIM 2023, 2024, 14332: 103-116.
  • [2] Wang, Qixuan; Zhang, Yanjun; Lu, Jun; Li, Congsheng; Zhang, Yungang. "Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation." PHYSICS IN MEDICINE AND BIOLOGY, 2024, 69 (18).
  • [3] Liu, Lu; Tan, Robby T. "Certainty driven consistency loss on multi-teacher networks for semi-supervised learning." PATTERN RECOGNITION, 2021, 120.
  • [4] Zhao, Shiji; Yu, Jie; Sun, Zhenlong; Zhang, Bo; Wei, Xingxing. "Enhanced Accuracy and Robustness via Multi-teacher Adversarial Distillation." COMPUTER VISION - ECCV 2022, PT IV, 2022, 13664: 585-602.
  • [5] Luo, Fan; Nagesh, Ajay; Sharp, Rebecca; Surdeanu, Mihai. "Semi-supervised teacher-student architecture for relation extraction." NLP@NAACL-HLT 2019 - 3rd Workshop on Structured Prediction for NLP, Proceedings, 2021: 29-37.
  • [6] Liu, Yujing; Wu, Zongqian; Lu, Zhengyu; Wen, Guoqiu; Ma, Junbo; Lu, Guangquan; Zhu, Xiaofeng. "Multi-teacher Self-training for Semi-supervised Node Classification with Noisy Labels." PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023: 2946-2954.
  • [7] Zhang, Xinzheng; Li, Jinlin; Li, Chao; Liu, Guojin. "Multi-Teacher D-S Fusion for Semi-Supervised SAR Ship Detection." REMOTE SENSING, 2024, 16 (15).
  • [8] Plum, Alistair; Ranasinghe, Tharindu; Jones, Spencer; Orasan, Constantin; Mitkov, Ruslan. "Biographical Semi-Supervised Relation Extraction Dataset." PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022: 3121-3130.
  • [9] Qiao, Nianzu; Sun, Changyin; Dong, Lu; Ge, Quanbo. "Semi-Supervised Feature Distillation and Unsupervised Domain Adversarial Distillation for Underwater Image Enhancement." IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (08): 7671-7682.
  • [10] Li, Wanli; Qian, Tieyun; Chen, Xu; Tang, Kejian; Zhan, Shaohui; Zhan, Tao. "Exploit a Multi-head Reference Graph for Semi-supervised Relation Extraction." 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021.