Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision

Cited by: 0
|
Authors
Wan, Zhen [1 ]
Cheng, Fei [1 ]
Liu, Qianying [1 ]
Mao, Zhuoyuan [1 ]
Song, Haiyue [1 ]
Kurohashi, Sadao [1 ]
Affiliations
[1] Kyoto Univ, Kyoto, Japan
Keywords
DOI
N/A
CLC classification
TP18 [Theory of artificial intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, the existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method by leveraging the supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach compared to two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL.
Pages: 2580 - 2585
Page count: 6
Related Papers
50 records in total
  • [1] Synergistic Anchored Contrastive Pre-training for Few-Shot Relation Extraction
    Luo, Da
    Gan, Yanglei
    Hou, Rui
    Lin, Run
    Liu, Qiao
    Cai, Yuxiang
    Gao, Wannian
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 17, 2024, : 18742 - 18750
  • [2] UserBERT: Pre-training User Model with Contrastive Self-supervision
    Wu, Chuhan
    Wu, Fangzhao
    Qi, Tao
    Huang, Yongfeng
    [J]. PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 2087 - 2092
  • [3] Relational distance and document-level contrastive pre-training based relation extraction model
    Dong, Yihao
    Xu, Xiaolong
    [J]. PATTERN RECOGNITION LETTERS, 2023, 167 : 132 - 140
  • [4] DSGAN: Generative Adversarial Training for Distant Supervision Relation Extraction
    Qin, Pengda
    Xu, Weiran
    Wang, William Yang
    [J]. PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 496 - 505
  • [5] A Method of Relation Extraction Using Pre-training Models
    Wang, Yu
    Sun, Yining
    Ma, Zuchang
    Gao, Lisheng
    Xu, Yang
    Wu, Yichen
    [J]. 2020 13TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID 2020), 2020, : 176 - 179
  • [6] Contrastive Pre-Training of GNNs on Heterogeneous Graphs
    Jiang, Xunqiang
    Lu, Yuanfu
    Fang, Yuan
    Shi, Chuan
    [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 803 - 812
  • [7] Adversarial momentum-contrastive pre-training
    Xu, Cong
    Li, Dan
    Yang, Min
    [J]. PATTERN RECOGNITION LETTERS, 2022, 160 : 172 - 179
  • [8] Contrastive Code-Comment Pre-training
    Pei, Xiaohuan
    Liu, Daochang
    Qian, Luo
    Xu, Chang
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2022, : 398 - 407
  • [9] Revisiting Distant Supervision for Relation Extraction
    Jiang, Tingsong
    Liu, Jing
    Lin, Chin-Yew
    Sui, Zhifang
    [J]. PROCEEDINGS OF THE ELEVENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2018), 2018, : 3580 - 3585
  • [10] Global Distant Supervision for Relation Extraction
    Han, Xianpei
    Sun, Le
    [J]. THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 2950 - 2956