DABC: A Named Entity Recognition Method Incorporating Attention Mechanisms

Citations: 0
Authors
Leng, Fangling [1 ]
Li, Fan [1 ]
Bao, Yubin [1 ]
Zhang, Tiancheng [1 ]
Yu, Ge [1 ]
Affiliations
[1] Northeastern Univ, Sch Comp Sci & Engn, Shenyang 110169, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DeBERTa; multi-attention mechanism; BiLSTM-CRF; named entity recognition;
DOI
10.3390/math12131992
Chinese Library Classification
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
Existing models for extracting features of complex, similar entities make poor use of relative position information and struggle to extract key features. Chinese named entity recognition differs from English in the absence of space delimiters, the pronounced polysemy and homonymy of characters, the diversity of common names, and a greater reliance on complex context and linguistic structure. This paper proposes an entity recognition method based on DeBERTa-Attention-BiLSTM-CRF (DABC). First, the feature extraction capability of the DeBERTa model is used to extract features from the data; next, an attention mechanism further enhances the extracted features; finally, a BiLSTM captures long-distance dependencies in the text, and a CRF layer produces the predicted tag sequence from which the entities in the text are identified. The proposed model is validated on the dataset. Experiments show that the precision (P) of the proposed DABC model reaches 88.167%, the recall (R) reaches 83.121%, and the F1 score reaches 85.024%. Compared with other models, the F1 score improves by 3–5%, verifying the superiority of the model. In the future, the method can be extended to recognize complex entities in more fields.
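The final step of the pipeline described in the abstract, decoding the best tag sequence through the CRF layer, is typically done with the Viterbi algorithm. The sketch below is an illustrative pure-Python implementation under simplifying assumptions (no batching, no start/end transition scores); the function name and signature are chosen for illustration and are not taken from the authors' code.

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence for one sentence.

    emissions:   per-token tag scores, shape (seq_len, n_tags),
                 e.g. the BiLSTM outputs projected to tag space
    transitions: transitions[i][j] = score of moving from tag i to tag j
    """
    n_tags = len(emissions[0])
    # score[j] = best score of any path ending in tag j at the current token
    score = list(emissions[0])
    backpointers = []
    for emit in emissions[1:]:
        step_back = []
        new_score = []
        for j in range(n_tags):
            # Best previous tag for landing on tag j now
            best_i = max(range(n_tags), key=lambda i: score[i] + transitions[i][j])
            step_back.append(best_i)
            new_score.append(score[best_i] + transitions[best_i][j] + emit[j])
        score = new_score
        backpointers.append(step_back)
    # Backtrack from the best final tag to recover the full path
    best_tag = max(range(n_tags), key=lambda j: score[j])
    path = [best_tag]
    for step_back in reversed(backpointers):
        best_tag = step_back[best_tag]
        path.append(best_tag)
    path.reverse()
    return path
```

With a strong penalty for switching tags (e.g. `transitions = [[0, -5], [-5, 0]]`), the decoder keeps the sequence in the initial tag even when a later emission mildly favors another tag, which is exactly how the CRF enforces label-transition consistency on top of the BiLSTM scores.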
Pages: 15
Related Papers
50 records in total
  • [1] Incorporating word-set attention into Chinese named entity recognition method
    Zhong S.-S.
    Chen X.
    Zhao M.-H.
    Zhang Y.-J.
    [J]. Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 2022, 52 (05) : 1098 - 1105
  • [2] Medical Named Entity Recognition Incorporating Word Information and Graph Attention
    Zhao, Zhenzhen
    Dong, Yanru
    Liu, Jing
    Zhang, Junzhong
    Cao, Hui
    [J]. Computer Engineering and Applications, 60 (11) : 147 - 155
  • [3] A Controlled Attention for Nested Named Entity Recognition
    Chen, Yanping
    Huang, Rong
    Pan, Lijun
    Huang, Ruizhang
    Zheng, Qinghua
    Chen, Ping
    [J]. COGNITIVE COMPUTATION, 2023, 15 (01) : 132 - 145
  • [4] CLASSIFICATION ATTENTION FOR CHINESE NAMED ENTITY RECOGNITION
    Cong, Kai
    Wang, Yunpeng
    Li, Tao
    Xu, Yanbin
    [J]. JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2021, 22 (09) : 1675 - 1686
  • [5] Incorporating multi-level CNN and attention mechanism for Chinese clinical named entity recognition
    Kong, Jun
    Zhang, Leixin
    Jiang, Min
    Liu, Tianshan
    [J]. JOURNAL OF BIOMEDICAL INFORMATICS, 2021, 116
  • [6] Incorporating Named Entity Recognition into the Speech Transcription Process
    Hatmi, Mohamed
    Jacquin, Christine
    Morin, Emmanuel
    Meignier, Sylvain
    [J]. 14TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2013), VOLS 1-5, 2013 : 3699 - 3703
  • [7] A Method of Named Entity Recognition for Tigrinya
    Yohannes, Hailemariam Mehari
    Amagasa, Toshiyuki
    [J]. APPLIED COMPUTING REVIEW, 2022, 22 (03) : 56 - 68
  • [8] Radial Basis Function Attention for Named Entity Recognition
    Chen, Jiusheng
    Xu, Xingkai
    Zhang, Xiaoyu
    [J]. ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (01)
  • [9] Ontology Attention Layer for Medical Named Entity Recognition
    Zha, Yue
    Ke, Yuanzhi
    Hu, Xiao
    Xiong, Caiquan
    [J]. APPLIED SCIENCES-BASEL, 2024, 14 (01)