Attention-Based Bi-LSTM for Chinese Named Entity Recognition

Cited by: 1
Authors
Zhang, Kai [1 ]
Ren, Weiping [2 ]
Zhang, Yangsen [1 ]
Affiliations
[1] Beijing Information Science & Technology University, Institute of Intelligent Information Processing, Beijing, China
[2] Beijing Information Science & Technology University, School of Foreign Studies, Beijing, China
Keywords
Attention mechanism; Bi-LSTM; Named entity recognition; Word2vec;
DOI
10.1007/978-3-030-04015-4_56
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
As integral parts of deep learning, the attention mechanism and bidirectional long short-term memory (Bi-LSTM) are widely used in natural language processing (NLP), and their effectiveness is well recognized. This paper applies an attention-based Bi-LSTM approach to the task of Chinese named entity recognition (NER). Using word2vec, we compile vectorized dictionaries and employ Bi-LSTM models to train text vectors, which are then multiplied by the output feature vectors of the attention model. Finally, softmax is used to classify the resulting vectors and produce Chinese NER labels. Experiments under four different configurations examine how the domain relevance of Chinese character vectors, phrase vectors, and vectorized datasets affects the effectiveness of Chinese NER. The experimental results show that precision (P), recall (R), and F1-score (F1) reach 97.51%, 95.33%, and 96.41%, respectively.
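To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch of an attention-based Bi-LSTM tagger: word2vec-style embeddings feed a Bi-LSTM, the Bi-LSTM feature vectors are multiplied by attention weights, and softmax classifies each token into NER tags. This is not the authors' implementation; the class name AttentionBiLSTM, the dimensions, and the additive attention scoring are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionBiLSTM(nn.Module):
    """Bi-LSTM encoder whose outputs are scaled by attention weights,
    followed by a per-token softmax classifier over NER tags."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_tags=7):
        super().__init__()
        # In the paper the embeddings come from a word2vec "vectorized
        # dictionary"; random initialization stands in for that here.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # One attention score per time step (additive-style scoring is an
        # assumption; the abstract does not spell out the scoring function).
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) indices into the vectorized dictionary
        embedded = self.embedding(token_ids)                        # (B, T, E)
        lstm_out, _ = self.bilstm(embedded)                         # (B, T, 2H)
        # Attention weights over the sequence positions.
        weights = torch.softmax(self.attn_score(lstm_out), dim=1)   # (B, T, 1)
        # Multiply the Bi-LSTM feature vectors by the attention weights,
        # mirroring the multiplication described in the abstract.
        attended = lstm_out * weights                               # (B, T, 2H)
        # Softmax classification of each token into NER tags (e.g. BIO labels).
        logits = self.classifier(attended)                          # (B, T, num_tags)
        return F.log_softmax(logits, dim=-1)


if __name__ == "__main__":
    model = AttentionBiLSTM(vocab_size=5000)
    tokens = torch.randint(0, 5000, (2, 20))   # a toy batch of 2 sentences
    print(model(tokens).shape)                 # torch.Size([2, 20, 7])
```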
Pages: 643-652
Number of pages: 10
Related Papers
50 records in total
  • [1] An Attention Based Bi-LSTM DenseNet Model for Named Entity Recognition in English Texts
    VeeraSekharReddy, B.
    Rao, Koppula Srinivas
    Koppula, Neerja
    [J]. WIRELESS PERSONAL COMMUNICATIONS, 2023, 130 (02) : 1435 - 1448
  • [2] Named Entity Recognition for Biomedical Patent Text using Bi-LSTM Variants
    Saad, Farag
    [J]. IIWAS2019: THE 21ST INTERNATIONAL CONFERENCE ON INFORMATION INTEGRATION AND WEB-BASED APPLICATIONS & SERVICES, 2019, : 617 - 621
  • [3] Entity Relationship Extraction Based on Bi-LSTM and Attention Mechanism
    Wei, Ming
    Xu, Zhipeng
    Hu, Jiwei
    [J]. PROCEEDINGS OF 2021 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND INFORMATION SYSTEMS (ICAIIS '21), 2021,
  • [4] Attention-Based Bi-LSTM Network for Abusive Language Detection
    Nelatoori, Kiran Babu
    Kommanti, Hima Bindu
    [J]. IETE JOURNAL OF RESEARCH, 2023, 69 (11) : 7884 - 7892
  • [5] Attention-Based Bi-LSTM Model for Arabic Depression Classification
    Almars, Abdulqader M.
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2022, 71 (02): : 3091 - 3106
  • [6] An Attention-Based BiLSTM-CRF Model for Chinese Clinic Named Entity Recognition
    Wu, Guohua
    Tang, Guangen
    Wang, Zhongru
    Zhang, Zhen
    Wang, Zhen
[J]. IEEE ACCESS, 2019, 7 : 113942 - 113949
  • [7] Chinese Named Entity Recognition in Power Domain Based on Bi-LSTM-CRF
    Zhao, Zhenqiang
    Chen, Zhenyu
    Liu, Jinbo
    Huang, Yunhao
    Gao, Xingyu
    Di, Fangchun
    Li, Lixin
    Ji, Xiaohui
    [J]. 2019 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND PATTERN RECOGNITION (AIPR 2019), 2019, : 176 - 180
  • [8] Enhancer-LSTMAtt: A Bi-LSTM and Attention-Based Deep Learning Method for Enhancer Recognition
    Huang, Guohua
    Luo, Wei
    Zhang, Guiyang
    Zheng, Peijie
    Yao, Yuhua
    Lyu, Jianyi
    Liu, Yuewu
    Wei, Dong-Qing
    [J]. BIOMOLECULES, 2022, 12 (07)
  • [9] An Attention-Based Approach for Mongolian News Named Entity Recognition
    Tan, Mingyan
    Bao, Feilong
    Gao, Guanglai
    Wang, Weihua
    [J]. CHINESE COMPUTATIONAL LINGUISTICS, CCL 2019, 2019, 11856 : 424 - 435