Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network

Cited: 1
Authors
Xue, Yafei [1 ,2 ]
Zhu, Jing [1 ,3 ]
Lyu, Jing [2 ]
Affiliations
[1] Hohai Univ, Sch Comp & Informat, Nanjing 211100, Jiangsu, Peoples R China
[2] Nanjing Normal Univ Zhongbei Coll, Dept Informat Sci & Technol, Nanjing 210046, Jiangsu, Peoples R China
[3] Xinjiang Agr Univ, Coll Comp & Informat Engn, Urumqi 830052, Xinjiang, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Natural language processing systems;
DOI
10.1155/2022/1530295
CLC Classification
Q [Biological Sciences]
Subject Classification
07; 0710; 09
Abstract
Entity relation extraction is a key task in information extraction and an important research topic in natural language processing. Building on prior work, this paper proposes a joint extraction model based on a multi-head attention neural network. The model is built on the BERT pre-trained architecture and extracts textual entities and relations jointly, integrating named entity features, term-labeling features, and relation features. A multi-head attention mechanism and improved neural structures are added to the model to enhance its feature extraction capacity. A study of the multi-head attention parameters shows that the optimal settings are h = 8 and dv = 16, at which the model's classification performance peaks. In the experimental analysis, the proposed joint extraction model is compared with traditional text entity relation extraction models, and extraction quality is evaluated in terms of the comprehensive F1 score, precision P, and system runtime. The experiments show the following. First, on the accuracy metric, Xception performs best, reaching 87.7%, indicating that the model's feature extraction is enhanced. Second, as the number of iterations increases, the validation-set and training-set curves rise to 96% and 98%, respectively, indicating strong generalization ability. Third, the model completes extraction over the entire test set in 1005 ms, an acceptable speed. Overall, the model achieves good test results and has strong practical value.
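To make the reported configuration concrete, the sketch below shows a minimal multi-head self-attention layer in PyTorch using the abstract's h = 8 heads and dv = 16 per-head dimension. This is an illustrative reconstruction, not the authors' code: the model dimension d_model = 128 (chosen so that 8 heads of 16 dimensions concatenate back to d_model), the class name, and the projection layer names are all assumptions.

import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    # Minimal multi-head self-attention sketch with the paper's h = 8, dv = 16.
    def __init__(self, d_model=128, num_heads=8, d_v=16):
        super().__init__()
        assert num_heads * d_v == d_model  # 8 heads x 16 dims concatenate to 128
        self.num_heads, self.d_v = num_heads, d_v
        self.w_q = nn.Linear(d_model, num_heads * d_v)  # query projection
        self.w_k = nn.Linear(d_model, num_heads * d_v)  # key projection
        self.w_v = nn.Linear(d_model, num_heads * d_v)  # value projection
        self.w_o = nn.Linear(num_heads * d_v, d_model)  # output projection

    def forward(self, x):
        # x: (batch, seq_len, d_model), e.g. token embeddings from a BERT encoder
        b, n, _ = x.shape
        # project, then split into heads: (batch, heads, seq_len, d_v)
        q = self.w_q(x).view(b, n, self.num_heads, self.d_v).transpose(1, 2)
        k = self.w_k(x).view(b, n, self.num_heads, self.d_v).transpose(1, 2)
        v = self.w_v(x).view(b, n, self.num_heads, self.d_v).transpose(1, 2)
        # scaled dot-product attention within each head
        attn = (q @ k.transpose(-2, -1) / self.d_v ** 0.5).softmax(dim=-1)
        out = attn @ v                                # (batch, heads, seq_len, d_v)
        out = out.transpose(1, 2).reshape(b, n, -1)   # concatenate the 8 heads
        return self.w_o(out)                          # back to (batch, seq_len, d_model)

# Usage: a batch of 2 sequences of 10 hypothetical 128-dim token embeddings.
x = torch.randn(2, 10, 128)
print(MultiHeadAttention()(x).shape)  # torch.Size([2, 10, 128])

In the joint extraction setting the abstract describes, x would be the BERT token representations, over which the entity and relation classifiers are then applied.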
Pages: 11
Related Papers
50 items total
  • [31] Using Recurrent Neural Network Structure and Multi-Head Attention with Convolution for Fraudulent Phone Text Recognition
    Zhou J.; Xu H.; Zhang Z.; Lu J.; Guo W.; Li Z.
    Computer Systems Science and Engineering, 2023, 46 (02): 2277-2297
  • [32] Multi-information interaction graph neural network for joint entity and relation extraction
    Zhang, Yini; Zhang, Yuxuan; Wang, Zijing; Peng, Huanchun; Yang, Yongsheng; Li, Yuanxiang
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 235
  • [33] Entity Relation Extraction Based on Multi-attention Mechanism and BiGRU Network
    Wang, Lingyun; Xiong, Caiquan; Xu, Wenxiang; Lin, Song
    COMPLEX, INTELLIGENT AND SOFTWARE INTENSIVE SYSTEMS, CISIS-2021, 2021, 278: 47-56
  • [34] An Ensemble of Text Convolutional Neural Networks and Multi-Head Attention Layers for Classifying Threats in Network Packets
    Kim, Hyeonmin; Yoon, Young
    ELECTRONICS, 2023, 12 (20)
  • [35] Joint extraction of entities and relations based on character graph convolutional network and Multi-Head Self-Attention Mechanism
    Meng, Zhao; Tian, Shengwei; Yu, Long; Lv, Yalong
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2021, 33 (02): 349-362
  • [36] A Network Intrusion Detection Model Based on BiLSTM with Multi-Head Attention Mechanism
    Zhang, Jingqi; Zhang, Xin; Liu, Zhaojun; Fu, Fa; Jiao, Yihan; Xu, Fei
    ELECTRONICS, 2023, 12 (19)
  • [37] Financial Volatility Forecasting: A Sparse Multi-Head Attention Neural Network
    Lin, Hualing; Sun, Qiubi
    INFORMATION, 2021, 12 (10)
  • [38] Bilinear Multi-Head Attention Graph Neural Network for Traffic Prediction
    Hu, Haibing; Han, Kai; Yin, Zhizhuo
    ICAART: PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 2, 2022: 33-43
  • [39] Fast Neural Chinese Named Entity Recognition with Multi-head Self-attention
    Qi, Tao; Wu, Chuhan; Wu, Fangzhao; Ge, Suyu; Liu, Junxin; Huang, Yongfeng; Xie, Xing
    KNOWLEDGE GRAPH AND SEMANTIC COMPUTING: KNOWLEDGE COMPUTING AND LANGUAGE UNDERSTANDING, 2019, 1134: 98-110
  • [40] Sentiment Analysis of Text Based on Bidirectional LSTM With Multi-Head Attention
    Long, Fei; Zhou, Kai; Ou, Weihua
    IEEE ACCESS, 2019, 7: 141960-141969