Enhance prototypical networks with hybrid attention and confusing loss function for few-shot relation classification

Cited by: 9
Authors
Li, Yibing [1 ,2 ,3 ]
Ma, Zuchang [1 ]
Gao, Lisheng [1 ]
Wu, Yichen [1 ,2 ,4 ]
Xie, Fei [3 ]
Ren, Xiaoye [3 ]
Affiliations
[1] Chinese Acad Sci, Hefei Inst Phys Sci, Inst Intelligent Machines, Anhui Prov Key Lab Med Phys & Technol, Hefei 230031, Peoples R China
[2] Univ Sci & Technol China, Sci Isl Branch Grad Sch, Hefei 230026, Peoples R China
[3] Hefei Normal Univ, Sch Comp Sci & Technol, Hefei 230601, Peoples R China
[4] Anhui Jianzhu Univ, Sch Elect & Informat Engn, Hefei 230601, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Relation classification; Few-shot learning; Hybrid attention; Loss; BERT;
DOI
10.1016/j.neucom.2022.04.067
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Relation classification (RC) is a fundamental task in building knowledge graphs and formalizing semantics. It aims to classify the relation between the head and tail entities in a sentence. Existing RC methods mainly adopt the distant supervision (DS) scheme. However, DS suffers from the long-tail problem and data sparsity. Recently, few-shot learning (FSL) has attracted attention; it addresses the long-tail problem by learning from only a few samples. Prototypical networks perform well in FSL, classifying a relation by distance to class prototypes. However, prototypical networks and their variants do not consider the critical role of entity words. In addition, not all sentences in the support set contribute equally to classifying relations. Furthermore, an entity pair in a sentence may have both a true and a confusing relation, which is difficult for an RC model to distinguish. To address these problems, a new context encoder, BERT_FE, is proposed, which uses the BERT model for pre-training and fuses the information of the head and tail entities by entity word-level attention (WLA). At the same time, sentence-level attention (SLA) is proposed to give more weight to support-set sentences similar to the query instance, improving classification accuracy. A confusing loss function (CLF) is designed to enhance the model's ability to distinguish between true and confusing relations. Experimental results demonstrate that the proposed model (HACLF) outperforms several baseline models. (c) 2022 Elsevier B.V. All rights reserved.
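The core mechanism the abstract describes — weighting support-set sentences by their similarity to the query, forming a prototype per relation, and classifying by distance — can be sketched as follows. This is a minimal illustrative sketch under assumed details, not the paper's implementation: the dot-product attention, the squared Euclidean distance, and all function names are assumptions, and real sentence embeddings would come from an encoder such as BERT.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def weighted_prototype(support, query):
    """Attention-weighted prototype of one relation class.

    support: (K, d) embeddings of the class's K support sentences.
    query:   (d,) embedding of the query sentence.
    Each support sentence is weighted by the softmax of its
    dot-product similarity to the query (sentence-level attention).
    """
    weights = softmax(support @ query)      # (K,)
    return weights @ support                # (d,)

def classify(support_sets, query):
    """Return the index of the relation whose prototype is nearest.

    support_sets: list of (K, d) arrays, one per candidate relation.
    Classification is by smallest squared Euclidean distance,
    as in prototypical networks.
    """
    protos = np.stack([weighted_prototype(s, query) for s in support_sets])
    dists = np.sum((protos - query) ** 2, axis=1)
    return int(np.argmin(dists))
```

With real encoders, the support embeddings would additionally fuse head- and tail-entity information before prototype construction; here the embeddings are treated as given.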
Pages: 362 - 372
Page count: 11
Related Papers
50 in total
  • [1] Hybrid Attention-Based Prototypical Networks for Noisy Few-Shot Relation Classification
    Gao, Tianyu
    Han, Xu
    Liu, Zhiyuan
    Sun, Maosong
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI-19), 2019: 6407 - 6414
  • [2] Hybrid Attention-Based Prototypical Networks for Few-Shot Sound Classification
    Wang, You
    Anderson, David V.
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022: 651 - 655
  • [3] Hybrid Enhancement-based prototypical networks for few-shot relation classification
    Wang, Lei
    Qu, Jianfeng
    Xu, Tianyu
    Li, Zhixu
    Chen, Wei
    Xu, Jiajie
    Zhao, Lei
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2023, 26 (05): 3207 - 3226
  • [4] Hierarchical Attention Prototypical Networks for Few-Shot Text Classification
    Sun, Shengli
    Sun, Qingfeng
    Zhou, Kevin
    Lv, Tengchao
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019), 2019: 476 - 485
  • [5] Few-shot relation classification by context attention-based prototypical networks with BERT
    Hui, Bei
    Liu, Liang
    Chen, Jia
    Zhou, Xue
    Nian, Yuhui
    EURASIP JOURNAL ON WIRELESS COMMUNICATIONS AND NETWORKING, 2020, 2020 (01)
  • [6] Enhance Prototypical Network with Text Descriptions for Few-shot Relation Classification
    Yang, Kaijia
    Zheng, Nantao
    Dai, Xinyu
    He, Liang
    Huang, Shujian
    Chen, Jiajun
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020: 2273 - 2276
  • [7] Few-shot relation classification based on the BERT model, hybrid attention and fusion networks
    Li, Yibing
    Ding, Zenghui
    Ma, Zuchang
    Wu, Yichen
    Wang, Yu
    Zhang, Ruiqi
    Xie, Fei
    Ren, Xiaoye
    APPLIED INTELLIGENCE, 2023, 53 (18): 21448 - 21464