MRE: A Military Relation Extraction Model Based on BiGRU and Multi-Head Attention

Cited: 9
Authors
Lu, Yiwei [1 ]
Yang, Ruopeng [1 ]
Jiang, Xuping [1 ]
Zhou, Dan [1 ]
Yin, Changsheng [1 ]
Li, Zizhuo [2 ]
Affiliations
[1] Natl Univ Def Technol, Coll Informat & Commun, Wuhan 430019, Peoples R China
[2] Wuhan Univ, Elect Informat Sch, Wuhan 430000, Peoples R China
Source
SYMMETRY-BASEL | 2021, Vol. 13, Issue 9
Keywords
military relation extraction; bi-directional encoder representations from transformers (BERT); BiGRU; multi-head attention; ENTITY;
DOI
10.3390/sym13091742
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07; 0710; 09
Abstract
A great deal of operational information exists in the form of text, so extracting it from unstructured military documents is of great significance for supporting command decision making and operations. Military relation extraction, one of the main tasks of military information extraction, aims to identify the relation between two named entities in unstructured military text. Traditional methods struggle with inadequate hand-crafted features and inaccurate Chinese word segmentation in the military domain, and they fail to exploit the symmetrical entity relations present in military texts. Building on a pre-trained language model, we present a Chinese military relation extraction method that combines a bidirectional gated recurrent unit (BiGRU) with a multi-head attention mechanism (MHATT). Specifically, the embedding layer combines the word embeddings and position embeddings produced by the pre-trained language model; the forward and backward output vectors of the BiGRU are symmetrically spliced to learn the semantic features of the context; and a multi-head attention mechanism is fused in to strengthen the representation of semantic information. Extensive experiments on a military text corpus we built demonstrate the superiority of our method over a traditional non-attention model, an attention model, and an improved attention model, improving the comprehensive F1-score by about 4%.
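For readers who want a concrete picture of the pipeline the abstract describes, below is a minimal PyTorch sketch: a BERT-style embedding layer that sums word and position embeddings, a BiGRU whose forward and backward outputs are spliced into one context vector per token, and a multi-head attention layer feeding a relation classifier. All dimensions, the vocabulary size, and the mean-pooled classification head are illustrative assumptions; the paper's actual hyperparameters and pre-trained encoder are not given in this record.

    import torch
    import torch.nn as nn

    class MilitaryRelationExtractor(nn.Module):
        # Sketch of the BERT-embedding + BiGRU + multi-head attention
        # pipeline from the abstract. All hyperparameters are assumptions.
        def __init__(self, vocab_size=21128, max_len=128, embed_dim=768,
                     gru_hidden=256, num_heads=8, num_relations=10):
            super().__init__()
            # Embedding layer: word embedding + position embedding, as a
            # BERT-style pre-trained language model would produce.
            self.word_embed = nn.Embedding(vocab_size, embed_dim)
            self.pos_embed = nn.Embedding(max_len, embed_dim)
            # BiGRU: forward and backward hidden states are concatenated
            # ("symmetrically spliced") to capture bidirectional context.
            self.bigru = nn.GRU(embed_dim, gru_hidden, batch_first=True,
                                bidirectional=True)
            # Multi-head self-attention over the spliced BiGRU outputs.
            self.mhatt = nn.MultiheadAttention(2 * gru_hidden, num_heads,
                                               batch_first=True)
            # Relation classifier over the pooled sentence representation.
            self.classifier = nn.Linear(2 * gru_hidden, num_relations)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) integer token indices
            positions = torch.arange(token_ids.size(1), device=token_ids.device)
            x = self.word_embed(token_ids) + self.pos_embed(positions)
            h, _ = self.bigru(x)             # (batch, seq_len, 2*gru_hidden)
            a, _ = self.mhatt(h, h, h)       # self-attention over the context
            pooled = a.mean(dim=1)           # average-pool across tokens
            return self.classifier(pooled)   # logits over relation types

    # Usage: score relation types for a batch of two 32-token sentences.
    model = MilitaryRelationExtractor()
    logits = model(torch.randint(0, 21128, (2, 32)))
    print(logits.shape)  # torch.Size([2, 10])

The 2 * gru_hidden width reflects the symmetric splice of the forward and backward GRU states; in practice the toy embedding tables would be replaced by a real pre-trained encoder such as bert-base-chinese, which uses the same word-plus-position embedding scheme.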
Pages: 15
Related Papers
50 records in total
  • [1] DCT based multi-head attention-BiGRU model for EEG source location
    Zhang, Boyuan
    Li, Donghao
    Wang, Dongqing
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 93
  • [2] Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
    Liu, Jie
    Chen, Shaowei
    Wang, Bingquan
    Zhang, Jiaxin
    Li, Na
    Xu, Tong
    [J]. PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3787 - 3793
  • [3] Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network
    Tao, Zhihua
    Ouyang, Chunping
    Liu, Yongbin
    Chung, Tonglee
    Cao, Yixin
    [J]. CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2023, 8 (02) : 468 - 477
  • [4] Entity and relation collaborative extraction approach based on multi-head attention and gated mechanism
    Zhao, Wei
    Zhao, Shan
    Chen, Shuhui
    Weng, Tien-Hsiung
    Kang, WenJie
    [J]. CONNECTION SCIENCE, 2022, 34 (01) : 670 - 686
  • [5] Multi-head Attention with Hint Mechanisms for Joint Extraction of Entity and Relation
    Fang, Chih-Hsien
    Chen, Yi-Ling
    Yeh, Mi-Yen
    Lin, Yan-Shuo
    [J]. DATABASE SYSTEMS FOR ADVANCED APPLICATIONS: DASFAA 2021 INTERNATIONAL WORKSHOPS, 2021, 12680 : 321 - 335
  • [6] Relation Extraction in Biomedical Texts Based on Multi-Head Attention Model With Syntactic Dependency Feature: Modeling Study
    Li, Yongbin
    Hui, Linhu
    Zou, Liping
    Li, Huyang
    Xu, Luo
    Wang, Xiaohua
    Chua, Stephanie
    [J]. JMIR MEDICAL INFORMATICS, 2022, 10 (10)
  • [7] Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
    Xue, Yafei
    Zhu, Jing
    Lyu, Jing
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [8] Multi-label Text Classification Based on BiGRU and Multi-Head Self-Attention Mechanism
    Luo, Tongtong
    Shi, Nan
    Jin, Meilin
    Qin, Aolong
    Tang, Jiacheng
    Wang, Xihan
    Gao, Quanli
    Shao, Lianhe
    [J]. 2024 3RD INTERNATIONAL CONFERENCE ON IMAGE PROCESSING AND MEDIA COMPUTING, ICIPMC 2024, 2024, : 204 - 210
  • [9] Relation guided and attention enhanced multi-head selection for relational facts extraction
    Zeng, Daojian
    Zhao, Chao
    Xv, Lu
    Dai, Jianhua
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250