Combined Self-Attention Mechanism for Chinese Named Entity Recognition in Military

Cited by: 10
|
Authors
Liao, Fei [1 ]
Ma, Liangli [1 ]
Pei, Jingjing [2 ]
Tan, Linshan [2 ]
Affiliations
[1] Naval Univ Engn, Coll Elect Engn, Wuhan 430033, Hubei, Peoples R China
[2] Force 91001, Beijing 100841, Peoples R China
Source
FUTURE INTERNET | 2019 / Vol. 11 / Issue 08
Funding
National Natural Science Foundation of China;
Keywords
military named entity recognition; self-attention mechanism; BiLSTM;
DOI
10.3390/fi11080180
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Number
0812;
Abstract
Military named entity recognition (MNER) is one of the key technologies in military information extraction. Traditional methods for the MNER task rely on cumbersome feature engineering and specialized domain knowledge. To solve this problem, we propose a method that employs a bidirectional long short-term memory (BiLSTM) neural network with a self-attention mechanism to identify military entities automatically. We obtain distributed vector representations of the military corpus through unsupervised learning, and the BiLSTM model combined with the self-attention mechanism is adopted to fully capture the contextual information carried by the character vector sequence. The experimental results show that the self-attention mechanism can effectively improve the performance of the MNER task. The F-scores on military documents and online military texts were 90.15% and 89.34%, respectively, which were better than those of other models.
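A minimal sketch of the architecture the abstract describes, assuming a PyTorch implementation (this is not the authors' released code): character embeddings feed a BiLSTM, a multi-head scaled dot-product self-attention layer re-weights the BiLSTM outputs across the whole sequence, and a linear layer produces per-character tag scores. The vocabulary size, tag set, embedding size, hidden size, and head count below are illustrative assumptions.

```python
# Sketch (assumed PyTorch realization) of the abstract's pipeline:
# character embeddings -> BiLSTM -> self-attention -> per-character tag scores.
import torch
import torch.nn as nn


class BiLSTMSelfAttentionNER(nn.Module):
    def __init__(self, vocab_size, num_tags, embed_dim=128, hidden_dim=128, num_heads=4):
        super().__init__()
        # Character-level embeddings; in the paper these would be pretrained
        # unsupervised on the military corpus and loaded here.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bidirectional LSTM over the character sequence.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Multi-head scaled dot-product self-attention over the BiLSTM outputs
        # (one common realization of a "self-attention mechanism"; the paper's
        # exact formulation may differ).
        self.self_attn = nn.MultiheadAttention(2 * hidden_dim, num_heads, batch_first=True)
        # Per-character tag scores; a CRF layer could be stacked on top instead.
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, char_ids, pad_mask=None):
        # char_ids: (batch, seq_len) integer character indices
        # pad_mask: (batch, seq_len) bool, True at padding positions
        x = self.embedding(char_ids)                       # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)                              # (batch, seq_len, 2*hidden_dim)
        attn_out, _ = self.self_attn(h, h, h, key_padding_mask=pad_mask)
        return self.classifier(attn_out)                   # (batch, seq_len, num_tags)


if __name__ == "__main__":
    model = BiLSTMSelfAttentionNER(vocab_size=5000, num_tags=9)
    chars = torch.randint(1, 5000, (2, 20))                # two sentences of 20 characters
    print(model(chars).shape)                              # torch.Size([2, 20, 9])
```

In practice, the embedding table would be initialized from the vectors learned unsupervised on the military corpus, and a softmax (or CRF) decoding step would turn the per-character scores into entity labels.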
Pages: 11
Related Papers
50 records in total
  • [21] Multidimensional Self-Attention for Aspect Term Extraction and Biomedical Named Entity Recognition
    Song, Xinyu
    Feng, Ao
    Wang, Weikuan
    Gao, Zhengjie
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2020, 2020
  • [22] Named Entity Recognition of Chinese Agricultural Text Based on Attention Mechanism
    Zhao, Pengfei
    Zhao, Chunjiang
    Wu, Huarui
    Wang, Wei
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2021, 52 (01): 185-192
  • [23] Classification Attention for Chinese Named Entity Recognition
    Cong, Kai
    Wang, Yunpeng
    Li, Tao
    Xu, Yanbin
    JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2021, 22 (09) : 1675 - 1686
  • [24] A Supervised Multi-Head Self-Attention Network for Nested Named Entity Recognition
    Xu, Yongxiu
    Huang, Heyan
    Feng, Chong
    Hu, Yue
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14185 - 14193
  • [25] Chinese clinical named entity recognition via multi-head self-attention based BiLSTM-CRF
    An, Ying
    Xia, Xianyun
    Chen, Xianlai
    Wu, Fang-Xiang
    Wang, Jianxin
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2022, 127
  • [26] Named Entity Recognition in Chinese Electronic Medical Record Using Attention Mechanism
    Li, Menglong
    Zhang, Yu
    Huang, Mengxing
    Chen, Jing
    Feng, Wenlong
    2019 INTERNATIONAL CONFERENCE ON INTERNET OF THINGS (ITHINGS) AND IEEE GREEN COMPUTING AND COMMUNICATIONS (GREENCOM) AND IEEE CYBER, PHYSICAL AND SOCIAL COMPUTING (CPSCOM) AND IEEE SMART DATA (SMARTDATA), 2019, : 649 - 654
  • [27] Joint Model of Entity Recognition and Relation Extraction with Self-attention Mechanism
    Liu, Maofu
    Zhang, Yukun
    Li, Wenjie
    Ji, Donghong
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2020, 19 (04)
  • [28] Exploring Named Entity Recognition via MacBERT-BiGRU and Global Pointer with Self-Attention
    Yuan, Chengzhe
    Tang, Feiyi
    Shan, Chun
    Shen, Weiqiang
    Lin, Ronghua
    Mao, Chengjie
    Li, Junxian
    BIG DATA AND COGNITIVE COMPUTING, 2024, 8 (12)
  • [29] Chinese named entity recognition combined active learning with self-training
    Zhong, Zhinong
    National University of Defense Technology, 1600, (36):
  • [30] Chinese Q&A Community Medical Entity Recognition with Character-Level Features and Self-Attention Mechanism
    Han, Pu
    Zhang, Mingtao
    Shi, Jin
    Yang, Jinming
    Li, Xiaoyan
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2021, 29 (01): 55-72