A pattern-aware self-attention network for distant supervised relation extraction

Cited by: 26
Authors
Shang, Yu-Ming [1 ]
Huang, Heyan [1 ,2 ]
Sun, Xin [1 ]
Wei, Wei [3 ]
Mao, Xian-Ling [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Comp Sci, Beijing, Peoples R China
[2] Beijing Engn Res Ctr High Volume Language Informa, Beijing, Peoples R China
[3] Huazhong Univ Sci & Technol, Wuhan, Hubei, Peoples R China
Keywords
distant supervision; relation extraction; pre-trained Transformer; relational pattern; self-attention network;
DOI
10.1016/j.ins.2021.10.047
CLC Classification
TP [Automation technology, computer technology];
Discipline Code
0812;
Abstract
Distant supervised relation extraction is an efficient strategy for finding relational facts in unstructured text without labeled training data. A recent paradigm for building relation extractors uses pre-trained Transformer language models to produce high-quality sentence representations. However, because the original Transformer is weak at capturing local dependencies and phrasal structures, existing Transformer-based methods cannot identify the various relational patterns in sentences. To address this issue, we propose a novel distant supervised relation extraction model that employs a specially designed pattern-aware self-attention network to automatically discover relational patterns for pre-trained Transformers in an end-to-end manner. Specifically, the proposed method assumes that the correlation between two adjacent tokens reflects the probability that they belong to the same pattern. Based on this assumption, a novel self-attention network is designed to generate the probability distribution of all patterns in a sentence. This probability distribution is then applied as a constraint in the first Transformer layer to encourage its attention heads to follow the relational pattern structures. As a result, fine-grained pattern information is enhanced in the pre-trained Transformer without losing global dependencies. Extensive experiments on two popular benchmark datasets demonstrate that our model outperforms state-of-the-art baselines. (c) 2021 Elsevier Inc. All rights reserved.
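The record contains no code, but the mechanism sketched in the abstract, scoring adjacent-token correlations, turning them into a pattern probability distribution, and imposing it as a constraint on the first Transformer layer's attention, can be illustrated concretely. The NumPy sketch below is a minimal, hypothetical rendering of that idea and not the authors' implementation: the cosine-plus-sigmoid link score, the chain-product span probability, and all function names are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def pattern_bias(H, eps=1e-9):
    """Turn adjacent-token correlations into a log-probability attention bias.

    Assumption (from the abstract): the correlation of two adjacent tokens
    reflects the probability that they belong to the same pattern. Here that
    correlation is scored as cosine similarity squashed by a sigmoid, and the
    probability that tokens i..j share one contiguous pattern is taken as the
    product of the link probabilities along the chain between them.
    """
    Hn = H / (np.linalg.norm(H, axis=-1, keepdims=True) + eps)
    adj = (Hn[:-1] * Hn[1:]).sum(axis=-1)    # neighbour cosine sims, shape (L-1,)
    p_link = 1.0 / (1.0 + np.exp(-adj))      # assumed "same pattern" probability
    L = H.shape[0]
    P = np.ones((L, L))
    for i in range(L):
        for j in range(i + 1, L):
            P[i, j] = P[j, i] = P[i, j - 1] * p_link[j - 1]
    return np.log(P + eps)                   # additive bias for attention logits

def pattern_aware_attention(H, Wq, Wk, Wv):
    """One attention head whose logits are constrained by the pattern prior."""
    d = Wq.shape[1]
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    logits = Q @ K.T / np.sqrt(d) + pattern_bias(H)
    return softmax(logits) @ V

# Toy usage: 12 tokens with 16-dimensional representations.
rng = np.random.default_rng(0)
L, d = 12, 16
H = rng.standard_normal((L, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
print(pattern_aware_attention(H, Wq, Wk, Wv).shape)   # (12, 16)
```

Because the bias enters the logits as log P, the post-softmax attention weights are multiplied by the span probability, so the head is steered toward tokens the prior places in the same relational pattern while the global query-key term is retained, matching the abstract's claim that pattern information is added without losing global dependencies.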
Pages: 269-279
Page count: 11
Related Papers (50 records)
  • [1] MSnet: Multi-Head Self-Attention Network for Distantly Supervised Relation Extraction
    Sun, Tingting
    Zhang, Chunhong
    Ji, Yang
    Hu, Zheng
    IEEE ACCESS, 2019, 7 : 54472 - 54482
  • [2] Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
    Liu, Jie
    Chen, Shaowei
    Wang, Bingquan
    Zhang, Jiaxin
    Li, Na
    Xu, Tong
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3787 - 3793
  • [3] Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction
    Li, Yang
    Long, Guodong
    Shen, Tao
    Zhou, Tianyi
    Yao, Lina
    Huo, Huan
    Jiang, Jing
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 8269 - 8276
  • [4] Pattern Filtering Attention for Distant Supervised Relation Extraction via Online Clustering
    Peng, Min
    Liao, Qingwen
    Hu, Weilong
    Tian, Gang
    Wang, Hua
    Zhang, YanChun
    WEB INFORMATION SYSTEMS ENGINEERING - WISE 2019, 2019, 11881 : 310 - 325
  • [5] Distant supervised relation extraction based on residual attention
    Zheng, Zhiyun
    Liu, Yun
    Li, Dun
    Zhang, Xingjin
    FRONTIERS OF COMPUTER SCIENCE, 2022, 16 (06)
  • [6] Distant Supervised Relation Extraction with Hierarchical Attention Mechanism
    Liu, Jianyi
    Chen, Liandong
    Shi, Rui
    Xu, Jie
    Liu, An
    2021 THE 7TH INTERNATIONAL CONFERENCE ON COMMUNICATION AND INFORMATION PROCESSING, ICCIP 2021, 2021, : 44 - 50
  • [7] Self-Attention Enhanced CNNs and Collaborative Curriculum Learning for Distantly Supervised Relation Extraction
    Huang, Yuyun
    Du, Jinhua
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 389 - 398
  • [8] Distant supervised relation extraction with position feature attention and selective bag attention
    Wang, Jiasheng
    Liu, Qiongxin
    NEUROCOMPUTING, 2021, 461 : 552 - 561