Application of Entity-BERT model based on neuroscience and brain-like cognition in electronic medical record entity recognition

Cited: 3
Authors
Lu, Weijia [1 ,2 ]
Jiang, Jiehui [3 ]
Shi, Yaxiang [4 ]
Zhong, Xiaowei [5 ]
Gu, Jun [6 ]
Huangfu, Lixia [7 ]
Gong, Ming [7 ]
Affiliations
[1] Nantong Univ, Affiliated Hosp, Sci & Technol Dept, Nantong, Peoples R China
[2] Jianghai Hosp Nantong Sutong Sci & Technol Pk, Dept Internal Med, Nantong, Peoples R China
[3] Shanghai Univ, Dept Biomed Engn, Shanghai, Peoples R China
[4] Southeast Univ, Zhongda Hosp, Network Informat Ctr, Nanjing, Peoples R China
[5] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou, Peoples R China
[6] Nantong Univ, Affiliated Hosp, Dept Resp, Nantong, Peoples R China
[7] Nantong Univ, Informat Ctr Dept, Affiliated Hosp, Nantong, Peoples R China
Keywords
BERT; LSTM; cross attention; entity recognition; electronic medical records;
DOI
10.3389/fnins.2023.1259652
CLC Classification Number
Q189 [Neuroscience];
Subject Classification Number
071006;
Abstract
Introduction: In the medical field, electronic medical records contain a large amount of textual information, and its unstructured nature makes data extraction and analysis challenging. Automatic extraction of entity information from electronic medical records has therefore become a significant issue in the healthcare domain.

Methods: To address this problem, this paper proposes a deep learning-based entity information extraction model called Entity-BERT. The model leverages the feature extraction capabilities of deep learning and the pre-trained language representations of BERT (Bidirectional Encoder Representations from Transformers) to automatically learn and recognize various entity types in medical electronic records, including medical terminology, disease names, and drug information, providing more effective support for medical research and clinical practice. Entity-BERT uses a multi-layer neural network and a cross-attention mechanism to process and fuse information at different levels and of different types, resembling the hierarchical, distributed processing of the human brain. The model also employs pre-trained language and sequence models to process and learn from textual data, paralleling human language processing and semantic understanding. Furthermore, Entity-BERT captures contextual information and long-term dependencies, combining them through the cross-attention mechanism to handle the complex and diverse language expressions found in electronic medical records. The paper additionally explores, from the perspective of neuroscience and brain-like cognition, how competitive learning, adaptive regulation, and synaptic plasticity can be used to optimize the model's predictions, automatically adjust its parameters, and achieve adaptive learning and dynamic adjustment.

Results and discussion: Experimental results demonstrate that Entity-BERT achieves outstanding performance on entity recognition tasks in electronic medical records, surpassing existing entity recognition models. This research not only provides more efficient and accurate natural language processing technology for the medical and health field but also introduces new ideas and directions for the design and optimization of deep learning models.
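The fusion step at the heart of the abstract, cross-attention between two feature streams (e.g., BERT token embeddings attending over LSTM hidden states), can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the dimensions are arbitrary, and random arrays stand in for the BERT and LSTM outputs.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values, d_k):
    # queries: (L_q, d); keys_values: (L_kv, d)
    # Each query position attends over the other stream's positions
    scores = queries @ keys_values.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)          # rows sum to 1
    return weights @ keys_values                # weighted fusion of the other stream

# Toy example: fuse two 5-token feature streams of width 8
rng = np.random.default_rng(0)
d = 8
bert_feats = rng.standard_normal((5, d))   # stand-in for BERT token embeddings
lstm_feats = rng.standard_normal((5, d))   # stand-in for LSTM hidden states
fused = cross_attention(bert_feats, lstm_feats, d)
print(fused.shape)  # prints (5, 8)
```

In a full model, `fused` would feed a classification layer that assigns an entity label to each token; here the sketch only shows how one stream's representations are re-expressed in terms of the other's.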
Pages: 19
Related Papers
50 records in total
  • [31] Fusion of SoftLexicon and RoBERTa for Purpose-Driven Electronic Medical Record Named Entity Recognition
    Cui, Xiaohui
    Yang, Yu
    Li, Dongmei
    Qu, Xiaolong
    Yao, Lei
    Luo, Sisi
    Song, Chao
    APPLIED SCIENCES-BASEL, 2023, 13 (24)
  • [32] Constructing a Chinese electronic medical record corpus for named entity recognition on resident admit notes
    Gao, Yan
    Gu, Lei
    Wang, Yefeng
    Wang, Yandong
    Yang, Feng
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2019, 19 (Suppl 2)
  • [34] Medical Text Entity Study based on BERT-BiLSTM-MHA-CRF Model
    Shen, Tongping
    Xu, Huanqing
    Journal of Network Intelligence, 2022, 7 (02): 410-421
  • [35] ChineseCTRE: A Model for Geographical Named Entity Recognition and Correction Based on Deep Neural Networks and the BERT Model
    Zhang, Wei
    Meng, Jingtao
    Wan, Jianhua
    Zhang, Chengkun
    Zhang, Jiajun
    Wang, Yuanyuan
    Xu, Liuchang
    Li, Fei
    ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 2023, 12 (10)
  • [36] Medical Named Entity Recognition Model Based on Knowledge Graph Enhancement
    Lu, Yonghe
    Zhao, Ruijie
    Wen, Xiuxian
    Tong, Xinyu
    Xiang, Dingcheng
    Zhang, Jinxia
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2024, 38 (04)
  • [37] Chinese medical named entity recognition model based on local enhancement
    Chen, Jing
    Xing, Kexuan
    Meng, Weilun
    Guo, Jingfeng
    Feng, Jianzhou
    Tongxin Xuebao/Journal on Communications, 45 (07): 171-183
  • [38] Naming entity recognition of citrus pests and diseases based on the BERT-BiLSTM-CRF model
    Liu, Yafei
    Wei, Siqi
    Huang, Haijun
    Lai, Qin
    Li, Mengshan
    Guan, Lixin
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 234
  • [39] Named Entity Recognition for Chinese Electronic Medical Records Based on Multitask and Transfer Learning
    Guo, Wenming
    Lu, Junda
    Han, Fang
    IEEE ACCESS, 2022, 10: 77375-77382
  • [40] An attention-based deep learning model for clinical named entity recognition of Chinese electronic medical records
    Li, Luqi
    Zhao, Jie
    Hou, Li
    Zhai, Yunkai
    Shi, Jinming
    Cui, Fangfang
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2019, 19 (01)