Joint Entity Relation Extraction Based on LSTM via Attention Mechanism

Cited: 0
Authors
Cao, Xu [1 ]
Shao, Qing [1 ]
Affiliations
[1] Univ Shanghai Sci & Technol, Sch Opt Elect & Comp Engn, Shanghai 200093, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Joint entity relation extraction; Context semantic; Dependency syntax; Feature fusion; Bidirectional Long Short-Term Memory
DOI
10.1007/s13369-023-08306-6
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Entity relation extraction plays a significant role in extracting structured information from unstructured text and serves as a foundational component for many other natural language processing tasks. The pipeline approach separates the entity recognition subtask from the relation classification subtask, which causes error propagation. Recent research therefore tends to combine the two subtasks and to redesign model architectures for joint entity relation extraction. However, these models often capture only surface-level text features, overlooking the deeper semantics and syntax inherent in sentences and thereby discarding valuable knowledge. To address this, we propose a joint entity relation extraction method that integrates context semantics and dependency syntax. A bidirectional long short-term memory (BiLSTM) network extracts the context semantic features of a sentence, a tree-structured LSTM extracts its dependency syntactic features, and the two feature types are then fused with an attention mechanism for joint extraction. Experimental results show that, compared with other models, the Accuracy, Recall, and F1-value of the proposed method increase markedly, demonstrating that the semantic and syntactic information contained in sentences benefits entity relation extraction.
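The abstract outlines the architecture only at a high level. Below is a minimal PyTorch sketch of one plausible reading of it: a BiLSTM encodes context semantics, a Child-Sum Tree-LSTM (Tai et al., 2015) encodes the dependency tree bottom-up, and an attention layer weights the concatenated feature streams before per-token classification. The layer sizes, the fusion-and-classification head, and the helper topological_order are illustrative assumptions, not the paper's published configuration.

```python
# Illustrative sketch only: layer sizes, gating details, and the fusion/
# classification head are assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn


def topological_order(heads):
    """Order node indices leaves-first so children are processed before parents."""
    def depth(i):
        d = 0
        while heads[i] >= 0:
            i, d = heads[i], d + 1
        return d
    return sorted(range(len(heads)), key=lambda i: -depth(i))


class ChildSumTreeLSTMCell(nn.Module):
    """One Child-Sum Tree-LSTM step for a node, given its children's states."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.iou = nn.Linear(in_dim + hid_dim, 3 * hid_dim)  # input/output/update gates
        self.f_x = nn.Linear(in_dim, hid_dim)                # forget gate, word term
        self.f_h = nn.Linear(hid_dim, hid_dim)               # forget gate, per-child term

    def forward(self, x, child_h, child_c):
        h_sum = child_h.sum(dim=0)                           # Child-Sum: add children's states
        i, o, u = torch.chunk(self.iou(torch.cat([x, h_sum])), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x) + self.f_h(child_h))   # one forget gate per child
        c = i * u + (f * child_c).sum(dim=0)
        return o * torch.tanh(c), c


class SemanticSyntaxFusion(nn.Module):
    """BiLSTM context features and Tree-LSTM syntax features, fused by attention."""

    def __init__(self, vocab_size, emb_dim=100, hid_dim=128, n_labels=10):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        self.tree = ChildSumTreeLSTMCell(emb_dim, 2 * hid_dim)
        self.attn = nn.Linear(4 * hid_dim, 1)                # scores each token's fused features
        self.out = nn.Linear(4 * hid_dim, n_labels)

    def forward(self, token_ids, heads):
        # token_ids: (seq_len,); heads[i] = dependency head of token i, -1 for root
        x = self.emb(token_ids)                              # (seq_len, emb_dim)
        sem = self.bilstm(x.unsqueeze(0))[0].squeeze(0)      # (seq_len, 2*hid_dim)

        n, hid = len(heads), sem.size(-1)
        children = [[] for _ in range(n)]
        for i, head in enumerate(heads):
            if head >= 0:
                children[head].append(i)
        h, c = [None] * n, [None] * n
        for i in topological_order(heads):                   # bottom-up over the tree
            ch_h = torch.stack([h[j] for j in children[i]]) if children[i] else torch.zeros(1, hid)
            ch_c = torch.stack([c[j] for j in children[i]]) if children[i] else torch.zeros(1, hid)
            h[i], c[i] = self.tree(x[i], ch_h, ch_c)
        syn = torch.stack(h)                                 # (seq_len, 2*hid_dim)

        fused = torch.cat([sem, syn], dim=-1)                # (seq_len, 4*hid_dim)
        weights = torch.softmax(self.attn(fused).squeeze(-1), dim=0)
        return self.out(weights.unsqueeze(-1) * fused)       # per-token label logits
```

A toy invocation, assuming heads gives each token's dependency head index with -1 marking the root:

```python
model = SemanticSyntaxFusion(vocab_size=1000)
tokens = torch.tensor([4, 17, 233, 8, 91])
heads = [2, 2, -1, 2, 3]           # token 2 is the root of the dependency tree
print(model(tokens, heads).shape)  # torch.Size([5, 10])
```

Concatenation followed by a scalar attention score per token is only one simple fusion choice; the paper's exact attention formulation may differ.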
Pages: 4353-4363
Number of pages: 11
Related Papers
50 records in total
  • [21] Yang H., Yu H., Sun Z., Liu J., Yang H., Zhang S., Sun H., Jiang X., Yu Y. Fishery standard entity relation extraction using dual attention mechanism. Nongye Gongcheng Xuebao/Transactions of the Chinese Society of Agricultural Engineering, 2021, 37(14): 204-212.
  • [22] Zhao, Yawei; Li, Xiang. A Subject-aware Attention Hierarchical Tagger for Joint Entity and Relation Extraction. Advanced Information Systems Engineering (CAiSE 2022), 2022: 270-284.
  • [23] Fang, Chih-Hsien; Chen, Yi-Ling; Yeh, Mi-Yen; Lin, Yan-Shuo. Multi-head Attention with Hint Mechanisms for Joint Extraction of Entity and Relation. Database Systems for Advanced Applications: DASFAA 2021 International Workshops, 2021, 12680: 321-335.
  • [24] Zhao, Wei; Zhao, Shan; Chen, Shuhui; Weng, Tien-Hsiung; Kang, WenJie. Entity and relation collaborative extraction approach based on multi-head attention and gated mechanism. Connection Science, 2022, 34(1): 670-686.
  • [25] Wang, Guoxiang; Liu, Jin; Xie, Jialong; Zhu, Zhenwei; Zhou, Fengyu. Joint multimodal entity-relation extraction based on temporal enhancement and similarity-gated attention. Knowledge-Based Systems, 2024, 304.
  • [26] Wan, Qian; Wei, Luona; Zhao, Shan; Liu, Jie. A Span-based Multi-Modal Attention Network for joint entity-relation extraction. Knowledge-Based Systems, 2023, 262.
  • [27] Zhang, Jingli; Zhou, Wenxuan; Hong, Yu; Yao, Jianmin; Zhang, Min. Using Entity Relation to Improve Event Detection via Attention Mechanism. Natural Language Processing and Chinese Computing, Pt I, 2018, 11108: 171-183.
  • [28] Qi, Haotian; Liu, Weiguang; Liu, Fenghua; Zhu, Weigang; Shan, Fangfang. Entity Relation Joint Extraction Method Based on Insertion Transformers. International Journal of Advanced Computer Science and Applications, 2024, 15(4): 656-664.
  • [29] Geng, Zhiqiang; Zhang, Yanhui; Han, Yongming. Joint entity and relation extraction model based on rich semantics. Neurocomputing, 2021, 429: 132-140.
  • [30] Zheng, Suncong; Hao, Yuexing; Lu, Dongyuan; Bao, Hongyun; Xu, Jiaming; Hao, Hongwei; Xu, Bo. Joint entity and relation extraction based on a hybrid neural network. Neurocomputing, 2017, 257: 59-66.