Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network

Cited by: 1
Authors
Xue, Yafei [1 ,2 ]
Zhu, Jing [1 ,3 ]
Lyu, Jing [2 ]
Affiliations
[1] Hohai Univ, Sch Comp & Informat, Nanjing 211100, Jiangsu, Peoples R China
[2] Nanjing Normal Univ Zhongbei Coll, Dept Informat Sci & Technol, Nanjing 210046, Jiangsu, Peoples R China
[3] Xinjiang Agr Univ, Coll Comp & Informat Engn, Urumqi 830052, Xinjiang, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Natural language processing systems;
DOI
10.1155/2022/1530295
Chinese Library Classification (CLC)
Q [Biological Sciences]
Discipline Classification Codes
07; 0710; 09
Abstract
Entity relation extraction is a key area of information extraction and an important research topic in natural language processing. Building on prior research, this paper proposes a joint extraction model based on a multi-head attention neural network. The model is built on the BERT pre-trained architecture and performs entity and relation extraction jointly, integrating named entity features, term labeling features, and relation features. A multi-head attention mechanism and improved neural structures are added to enhance the model's feature extraction capacity. A parameter study of the multi-head attention shows that the optimal settings are h = 8 and dv = 16, at which the model's classification performance is best. In the experimental analysis, the proposed joint extraction model was compared with traditional text entity relation extraction models, and extraction performance was evaluated in terms of the comprehensive F1 score, precision P, and system time consumed. The experiments show the following. First, on the precision metric, Xception performs best, reaching 87.7%, indicating that the model's feature extraction is enhanced. Second, as the number of training iterations increases, the validation set and training set curves rise to 96% and 98%, respectively, indicating strong generalization ability. Third, the model completes extraction of all test set data in 1005 ms, an acceptable speed. The test results are therefore good, and the model has strong practical value.
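The only architectural specifics the abstract reports are BERT token features feeding a multi-head attention layer with the optimal parameters h = 8 and dv = 16. The following is a minimal PyTorch sketch of that configuration, not the authors' implementation; the class and attribute names (JointExtractionSketch, proj, entity_head, relation_head), the choice of bert-base-chinese, and the mean-pooled relation head are assumptions for illustration only.

    # Minimal sketch (not the paper's code): multi-head attention with
    # h = 8 heads and per-head dimension d_v = 16 over BERT features.
    import torch
    import torch.nn as nn
    from transformers import BertModel

    class JointExtractionSketch(nn.Module):
        def __init__(self, num_entity_tags: int, num_relations: int,
                     h: int = 8, d_v: int = 16):
            super().__init__()
            self.bert = BertModel.from_pretrained("bert-base-chinese")
            d_model = h * d_v                                  # 8 * 16 = 128
            self.proj = nn.Linear(self.bert.config.hidden_size, d_model)
            # PyTorch splits d_model evenly across heads, so each of the
            # h = 8 heads attends in d_model / h = 16 dimensions (d_v = 16).
            self.attn = nn.MultiheadAttention(d_model, num_heads=h,
                                              batch_first=True)
            self.entity_head = nn.Linear(d_model, num_entity_tags)   # per-token tagging
            self.relation_head = nn.Linear(d_model, num_relations)   # sentence-level relation

        def forward(self, input_ids, attention_mask):
            hidden = self.bert(input_ids,
                               attention_mask=attention_mask).last_hidden_state
            x = self.proj(hidden)                              # (batch, seq_len, 128)
            # Self-attention over projected BERT features; padding masked out.
            ctx, _ = self.attn(x, x, x,
                               key_padding_mask=~attention_mask.bool())
            entity_logits = self.entity_head(ctx)             # entity tag scores per token
            relation_logits = self.relation_head(ctx.mean(dim=1))  # pooled relation scores
            return entity_logits, relation_logits

The mean pooling before the relation head is a placeholder; the abstract does not say how the sentence-level relation representation is formed, only that the h = 8, dv = 16 attention configuration gave the best classification results.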
Pages: 11