Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network

Cited by: 1
Authors
Xue, Yafei [1 ,2 ]
Zhu, Jing [1 ,3 ]
Lyu, Jing [2 ]
Affiliations
[1] Hohai Univ, Sch Comp & Informat, Nanjing 211100, Jiangsu, Peoples R China
[2] Nanjing Normal Univ Zhongbei Coll, Dept Informat Sci & Technol, Nanjing 210046, Jiangsu, Peoples R China
[3] Xinjiang Agr Univ, Coll Comp & Informat Engn, Urumqi 830052, Xinjiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Natural language processing systems;
DOI
10.1155/2022/1530295
CLC classification number
Q [Biological Sciences];
Subject classification codes
07; 0710; 09;
Abstract
Entity relation extraction is a key area of information extraction and an important research topic in natural language processing. Building on prior research, this paper proposes a joint extraction model based on a multi-head attention neural network. Using the BERT pre-trained model architecture, the model performs text entity and relation extraction jointly, integrating named-entity features, terminology labeling features, and relation features during training. A multi-head attention mechanism and improved neural structures are added to strengthen the model's feature extraction capacity. A study of the multi-head attention parameters shows that the optimal settings are h = 8 and dv = 16, at which the model's classification performance is best. Experiments compare the traditional text entity relation extraction model with the proposed multi-head attention joint extraction model, evaluating extraction quality in terms of the comprehensive evaluation index F1, the accuracy rate P, and system runtime. The experiments show the following. First, on the accuracy metric, Xception performs best, reaching 87.7%, indicating that the model's feature extraction is enhanced. Second, as the number of iterations increases, the validation-set and training-set curves rise to 96% and 98%, respectively, indicating strong generalization ability. Third, the model extracts all data in the test set in 1005 ms, which is an acceptable speed. The test results are therefore good, and the model has strong practical value.
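The abstract's optimal setting (h = 8 heads, per-head value dimension dv = 16) can be illustrated with a minimal NumPy sketch of scaled dot-product multi-head attention. This is a hypothetical illustration, not the paper's implementation: the weight matrices here are random placeholders standing in for learned parameters.

```python
import numpy as np

def multi_head_attention(X, h=8, d_v=16, seed=0):
    """Minimal scaled dot-product multi-head attention.

    X: (seq_len, d_model) input matrix; h heads, each with value dim d_v.
    Weight matrices are random placeholders (learned in a real model).
    """
    seq_len, d_model = X.shape
    d_k = d_model // h                       # per-head query/key dimension
    rng = np.random.default_rng(seed)
    heads = []
    for _ in range(h):
        W_q = rng.standard_normal((d_model, d_k)) * 0.02
        W_k = rng.standard_normal((d_model, d_k)) * 0.02
        W_v = rng.standard_normal((d_model, d_v)) * 0.02
        Q, K, V = X @ W_q, X @ W_k, X @ W_v
        scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) attention scores
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
        heads.append(weights @ V)            # (seq_len, d_v) per-head output
    # Concatenate the h heads (h * d_v columns) and project back to d_model.
    W_o = rng.standard_normal((h * d_v, d_model)) * 0.02
    return np.concatenate(heads, axis=-1) @ W_o
```

With h = 8 and d_v = 16, the concatenated head output has 128 columns before the final projection, matching a BERT-base hidden size of 768 only through the output projection; the dimensions here are illustrative.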
Pages: 11
Related papers
50 records in total
  • [41] Multi-Head Attention Neural Network for Smartphone Invariant Indoor Localization
    Tiku, Saideep
    Gufran, Danish
    Pasricha, Sudeep
    2022 IEEE 12TH INTERNATIONAL CONFERENCE ON INDOOR POSITIONING AND INDOOR NAVIGATION (IPIN 2022), 2022,
  • [42] Relation guided and attention enhanced multi-head selection for relational facts extraction
    Zeng, Daojian
    Zhao, Chao
    Xv, Lu
    Dai, Jianhua
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250
  • [43] An Improved Model for Analyzing Textual Sentiment Based on a Deep Neural Network Using Multi-Head Attention Mechanism
    Sharaf Al-deen, Hashem Saleh
    Zeng, Zhiwen
    Al-sabri, Raeed
    Hekmat, Arash
    APPLIED SYSTEM INNOVATION, 2021, 4 (04)
  • [44] Multi-Attention Cascade Model Based on Multi-Head Structure for Image-Text Retrieval
    Zhang, Haotian
    Wu, Wei
    Zhang, Meng
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [45] A Supervised Multi-Head Self-Attention Network for Nested Named Entity Recognition
    Xu, Yongxiu
    Huang, Heyan
    Feng, Chong
    Hu, Yue
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14185 - 14193
  • [46] A Relational Adaptive Neural Model for Joint Entity and Relation Extraction
    Duan, Guiduo
    Miao, Jiayu
    Huang, Tianxi
    Luo, Wenlong
    Hu, Dekun
    FRONTIERS IN NEUROROBOTICS, 2021, 15
  • [47] A new joint CTC-attention-based speech recognition model with multi-level multi-head attention
    Qin, Chu-Xiong
    Zhang, Wen-Lin
    Qu, Dan
    EURASIP JOURNAL ON AUDIO SPEECH AND MUSIC PROCESSING, 2019, 2019 (01)
  • [49] Temporal Residual Network Based Multi-Head Attention Model for Arabic Handwriting Recognition
    Zouari, Ramzi
    Othmen, Dalila
    Boubaker, Houcine
    Kherallah, Monji
    INTERNATIONAL ARAB JOURNAL OF INFORMATION TECHNOLOGY, 2023, 20 (3A) : 469 - 476
  • [50] Multi-Head Attention-Based Hybrid Deep Neural Network for Aeroengine Risk Assessment
    Li, Jian-Hang
    Gao, Xin-Yue
    Lu, Xiang
    Liu, Guo-Dong
    IEEE ACCESS, 2023, 11 : 113376 - 113389