A pattern-aware self-attention network for distant supervised relation extraction

Cited by: 30
Authors
Shang, Yu-Ming [1 ]
Huang, Heyan [1 ,2 ]
Sun, Xin [1 ]
Wei, Wei [3 ]
Mao, Xian-Ling [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Comp Sci, Beijing, Peoples R China
[2] Beijing Engn Res Ctr High Volume Language Informa, Beijing, Peoples R China
[3] Huazhong Univ Sci & Technol, Huazhong, Hubei, Peoples R China
Keywords
distant supervision; relation extraction; pre-trained Transformer; relational pattern; self-attention network;
DOI
10.1016/j.ins.2021.10.047
CLC number
TP [automation technology; computer technology];
Discipline code
0812;
Abstract
Distant supervised relation extraction is an efficient strategy for finding relational facts in unstructured text without labeled training data. A recent paradigm for developing relation extractors is to use pre-trained Transformer language models to produce high-quality sentence representations. However, because the original Transformer is weak at capturing local dependencies and phrasal structures, existing Transformer-based methods cannot identify the various relational patterns in sentences. To address this issue, we propose a novel distant supervised relation extraction model that employs a specifically designed pattern-aware self-attention network to automatically discover relational patterns for pre-trained Transformers in an end-to-end manner. Specifically, the proposed method assumes that the correlation between two adjacent tokens reflects the probability that they belong to the same pattern. Based on this assumption, a novel self-attention network is designed to generate the probability distribution of all patterns in a sentence. This probability distribution is then applied as a constraint in the first Transformer layer to encourage its attention heads to follow the relational pattern structures. As a result, fine-grained pattern information is enhanced in the pre-trained Transformer without losing global dependencies. Extensive experimental results on two popular benchmark datasets demonstrate that our model outperforms state-of-the-art baselines. (c) 2021 Elsevier Inc. All rights reserved.
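To make the abstract's core idea concrete — adjacent-token correlations inducing a pattern (span) probability that biases self-attention — here is a minimal NumPy sketch. This is not the authors' implementation: the sigmoid adjacency score, the product-of-adjacencies span probability, and the additive log-bias on attention logits are all assumptions about one plausible realization of the described mechanism.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pattern_aware_attention(H, eps=1e-9):
    """Toy pattern-aware self-attention over token embeddings H of shape (n, d).

    1. adj[i]  = sigmoid(h_i . h_{i+1} / sqrt(d)): probability that tokens
       i and i+1 belong to the same relational pattern.
    2. span probability for (i, j) = product of adj over the path i..j-1,
       computed in log space via cumulative sums.
    3. The log span probability is added to the dot-product attention
       logits, so attention within a contiguous pattern is encouraged
       while global dependencies are only softly down-weighted.
    Returns an (n, n) row-stochastic attention matrix.
    """
    n, d = H.shape
    # Step 1: adjacency probabilities for the n-1 consecutive token pairs.
    adj = 1.0 / (1.0 + np.exp(-np.einsum("id,id->i", H[:-1], H[1:]) / np.sqrt(d)))
    # Step 2: log span probability. cum[k] = sum of log adj up to position k,
    # so log P(i..j same pattern) = cum[max] - cum[min] = -|cum[i] - cum[j]|
    # (cum is non-increasing because each log adj term is <= 0).
    cum = np.concatenate([[0.0], np.cumsum(np.log(adj + eps))])
    span_log = -np.abs(cum[:, None] - cum[None, :])
    # Step 3: bias the scaled dot-product attention logits.
    scores = H @ H.T / np.sqrt(d)
    return softmax(scores + span_log)

# Usage: six random token embeddings of dimension eight.
np.random.seed(0)
A = pattern_aware_attention(np.random.randn(6, 8))
```

In the paper this kind of constraint is applied only in the first Transformer layer, so later layers retain the standard unbiased attention and global dependencies; the sketch shows a single layer only.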
Pages: 269-279 (11 pages)
Related papers
50 total
  • [21] A Customized Attention-Based Long Short-Term Memory Network for Distant Supervised Relation Extraction
    He, Dengchao
    Zhang, Hongjun
    Hao, Wenning
    Zhang, Rui
    Cheng, Kai
    NEURAL COMPUTATION, 2017, 29 (07) : 1964 - 1985
  • [22] Modeling Periodic Pattern with Self-Attention Network for Sequential Recommendation
    Ma, Jun
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Zhao, Lei
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2020), PT III, 2020, 12114 : 557 - 572
  • [23] SELF-ATTENTION RELATION NETWORK FOR FEW-SHOT LEARNING
    Hui, Binyuan
    Zhu, Pengfei
    Hu, Qinghua
    Wang, Qilong
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO WORKSHOPS (ICMEW), 2019, : 198 - 203
  • [24] Normalized and Geometry-Aware Self-Attention Network for Image Captioning
    Guo, Longteng
    Liu, Jing
    Zhu, Xinxin
    Yao, Peng
    Lu, Shichen
    Lu, Hanqing
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2020), 2020, : 10324 - 10333
  • [25] Joint Model of Entity Recognition and Relation Extraction with Self-attention Mechanism
    Liu, Maofu
    Zhang, Yukun
    Li, Wenjie
    Ji, Donghong
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2020, 19 (04)
  • [26] Syntax-Informed Self-Attention Network for Span-Based Joint Entity and Relation Extraction
    Zhang, Haiyang
    Zhang, Guanqun
    Ma, Yue
    APPLIED SCIENCES-BASEL, 2021, 11 (04): : 1 - 16
  • [27] Attention-Based Gated Convolutional Neural Networks for Distant Supervised Relation Extraction
    Li, Xingya
    Chen, Yufeng
    Xu, Jinan
    Zhang, Yujie
    CHINESE COMPUTATIONAL LINGUISTICS, CCL 2019, 2019, 11856 : 246 - 257
  • [28] The function of the self-attention network
    Cunningham, Sheila J.
    COGNITIVE NEUROSCIENCE, 2016, 7 (1-4) : 21 - 22
  • [29] Adversarial Learning for Distant Supervised Relation Extraction
    Zeng, Daojian
    Dai, Yuan
    Li, Feng
    Sherratt, R. Simon
    Wang, Jin
    CMC-COMPUTERS MATERIALS & CONTINUA, 2018, 55 (01): : 121 - 136
  • [30] Distant Supervised Relation Extraction Based On Recurrent Convolutional Piecewise Neural Network
    Haihong, E.
    Zhou, Xiaosong
    Song, Meina
    2019 INTERNATIONAL SYMPOSIUM ON SIGNAL PROCESSING SYSTEMS (SPSS 2019), 2019, : 169 - 175