Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network

Cited by: 1
Authors
Xue, Yafei [1 ,2 ]
Zhu, Jing [1 ,3 ]
Lyu, Jing [2 ]
Affiliations
[1] Hohai Univ, Sch Comp & Informat, Nanjing 211100, Jiangsu, Peoples R China
[2] Nanjing Normal Univ Zhongbei Coll, Dept Informat Sci & Technol, Nanjing 210046, Jiangsu, Peoples R China
[3] Xinjiang Agr Univ, Coll Comp & Informat Engn, Urumqi 830052, Xinjiang, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Natural language processing systems;
DOI
10.1155/2022/1530295
CLC Classification
Q [Biological Sciences]
Subject Classification
07; 0710; 09
Abstract
Entity relation extraction is a key task in information extraction and an important research topic in natural language processing. Building on prior work, this paper proposes a joint extraction model based on a multi-head attention neural network. On top of the BERT pre-trained model architecture, the model extracts textual entities and relations jointly, integrating named entity features, term labeling features, and relation training signals. A multi-head attention mechanism and improved neural structures are added to the model to strengthen its feature extraction capacity. A study of the multi-head attention parameters shows that the optimal setting is h = 8 and d_v = 16, at which the model's classification performance is best. Experimental analysis compared the traditional text entity relation extraction model with the multi-head attention joint extraction model, evaluating extraction performance on the comprehensive evaluation index F1, the precision P, and the system time consumed. The experiments show the following. First, on the accuracy metric, Xception performs best, reaching 87.7%, indicating that the model's feature extraction is enhanced. Second, as the number of iterations increases, the validation set curve and the training set curve rise to 96% and 98%, respectively, indicating strong generalization ability. Third, the model completes extraction over the entire test set in 1005 ms, an acceptable speed. The model therefore tests well and has strong practical value.
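The abstract reports h = 8 heads and a per-head value dimension d_v = 16 as the optimal multi-head attention parameters. The paper's exact architecture is not given here, so the following is only a minimal NumPy sketch of standard scaled dot-product multi-head attention with those parameter values (all function and variable names are illustrative, not from the paper):

```python
import numpy as np

def multi_head_attention(x, w_q, w_k, w_v, w_o, h=8, d_v=16):
    """Scaled dot-product multi-head attention (minimal sketch).

    x:   (seq_len, d_model) token representations, with d_model = h * d_v.
    w_*: (d_model, d_model) projection matrices for queries, keys,
         values, and the final output.
    """
    seq_len, d_model = x.shape
    assert d_model == h * d_v, "d_model must equal h * d_v"

    def project(w):
        # Project, then split into h heads of width d_v: (h, seq_len, d_v).
        return (x @ w).reshape(seq_len, h, d_v).transpose(1, 0, 2)

    q, k, v = project(w_q), project(w_k), project(w_v)

    # Attention scores per head, scaled by sqrt(d_v): (h, seq_len, seq_len).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_v)

    # Row-wise softmax (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    # Weighted sum of values, concatenate heads, apply output projection.
    heads = weights @ v                                   # (h, seq_len, d_v)
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o
```

With h = 8 and d_v = 16 the model width is d_model = 128; each head attends over the sequence independently, and the concatenated head outputs are mixed by the final projection.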
Pages: 11
Related Papers (50 total)
  • [1] Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
    Xue, Yafei
    Zhu, Jing
    Lyu, Jing
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [2] Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network
    Tao, Zhihua
    Ouyang, Chunping
    Liu, Yongbin
    Chung, Tonglee
    Cao, Yixin
    [J]. CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2023, 8 (02) : 468 - 477
  • [3] Multi-head Attention with Hint Mechanisms for Joint Extraction of Entity and Relation
    Fang, Chih-Hsien
    Chen, Yi-Ling
    Yeh, Mi-Yen
    Lin, Yan-Shuo
    [J]. DATABASE SYSTEMS FOR ADVANCED APPLICATIONS: DASFAA 2021 INTERNATIONAL WORKSHOPS, 2021, 12680 : 321 - 335
  • [4] Entity and relation collaborative extraction approach based on multi-head attention and gated mechanism
    Zhao, Wei
    Zhao, Shan
    Chen, Shuhui
    Weng, Tien-Hsiung
    Kang, WenJie
    [J]. CONNECTION SCIENCE, 2022, 34 (01) : 670 - 686
  • [5] Hybrid neural network model based on multi-head attention for English text emotion analysis
    Li, Ping
    [J]. EAI ENDORSED TRANSACTIONS ON SCALABLE INFORMATION SYSTEMS, 2022, 9 (35):
  • [6] Joint entity recognition and relation extraction as a multi-head selection problem
    Bekoulis, Giannis
    Deleu, Johannes
    Demeester, Thomas
    Develder, Chris
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2018, 114 : 34 - 45
  • [8] RMAN: Relational multi-head attention neural network for joint extraction of entities and relations
    Lai, Taiqu
    Cheng, Lianglun
    Wang, Depei
    Ye, Haiming
    Zhang, Weiwen
    [J]. APPLIED INTELLIGENCE, 2022, 52 (03) : 3132 - 3142
  • [9] Network Configuration Entity Extraction Method Based on Transformer with Multi-Head Attention Mechanism
    Yang, Yang
    Qu, Zhenying
    Yan, Zefan
    Gao, Zhipeng
    Wang, Ti
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 78 (01): : 735 - 757
  • [10] MRE: A Military Relation Extraction Model Based on BiGRU and Multi-Head Attention
    Lu, Yiwei
    Yang, Ruopeng
    Jiang, Xuping
    Zhou, Dan
    Yin, Changsheng
    Li, Zizhuo
    [J]. SYMMETRY-BASEL, 2021, 13 (09):