Application of Entity-BERT model based on neuroscience and brain-like cognition in electronic medical record entity recognition

Cited by: 3
|
Authors
Lu, Weijia [1 ,2 ]
Jiang, Jiehui [3 ]
Shi, Yaxiang [4 ]
Zhong, Xiaowei [5 ]
Gu, Jun [6 ]
Huangfu, Lixia [7 ]
Gong, Ming [7 ]
Affiliations
[1] Nantong Univ, Affiliated Hosp, Sci & Technol Dept, Nantong, Peoples R China
[2] Jianghai Hosp Nantong Sutong Sci & Technol Pk, Dept Internal Med, Nantong, Peoples R China
[3] Shanghai Univ, Dept Biomed Engn, Shanghai, Peoples R China
[4] Southeast Univ, Zhongda Hosp, Network Informat Ctr, Nanjing, Peoples R China
[5] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou, Peoples R China
[6] Nantong Univ, Affiliated Hosp, Dept Resp, Nantong, Peoples R China
[7] Nantong Univ, Informat Ctr Dept, Affiliated Hosp, Nantong, Peoples R China
Keywords
BERT; LSTM; cross attention; entity recognition; electronic medical records;
DOI
10.3389/fnins.2023.1259652
Chinese Library Classification
Q189 [Neuroscience];
Discipline code
071006 ;
Abstract
Introduction: In the medical field, electronic medical records contain large amounts of textual information, and its unstructured nature makes data extraction and analysis challenging. Automatic extraction of entity information from electronic medical records has therefore become a significant issue in the healthcare domain.
Methods: To address this problem, this paper proposes a deep learning-based entity information extraction model called Entity-BERT. The model leverages the powerful feature extraction capabilities of deep learning and the pre-trained language representations of BERT (Bidirectional Encoder Representations from Transformers) to automatically learn and recognize various entity types in electronic medical records, including medical terminology, disease names, and drug information, providing more effective support for medical research and clinical practice. Entity-BERT uses a multi-layer neural network and a cross-attention mechanism to process and fuse information at different levels and of different types, resembling the hierarchical and distributed processing of the human brain. The model also employs pre-trained language and sequence models to process and learn from textual data, paralleling human language processing and semantic understanding. Furthermore, Entity-BERT captures contextual information and long-term dependencies, combining them through the cross-attention mechanism to handle the complex and diverse language expressions found in electronic medical records. The paper additionally explores, from the perspective of neuroscience and brain-like cognition, how competitive learning, adaptive regulation, and synaptic plasticity can be used to optimize the model's predictions, automatically adjust its parameters, and achieve adaptive learning and dynamic adjustment.
Results and discussion: Experimental results demonstrate that Entity-BERT achieves outstanding performance on entity recognition tasks in electronic medical records, surpassing existing entity recognition models. This research provides more efficient and accurate natural language processing technology for the medical and health field and introduces new ideas and directions for the design and optimization of deep learning models.
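The cross-attention fusion of two feature streams described in the abstract can be sketched minimally as follows. This is an illustrative assumption about the mechanism, not the authors' implementation; all function names, shapes, and the choice of numpy are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values):
    """Attend from one feature stream (e.g., sequence-model states)
    over another (e.g., BERT token embeddings), returning a fused
    representation with one vector per query token."""
    d_k = queries.shape[-1]
    scores = queries @ keys_values.T / np.sqrt(d_k)   # (Lq, Lkv)
    weights = softmax(scores, axis=-1)                # rows sum to 1
    return weights @ keys_values                      # (Lq, d)

# Toy example: 5 query tokens, 7 context tokens, 16-dim features.
rng = np.random.default_rng(0)
q = rng.normal(size=(5, 16))
kv = rng.normal(size=(7, 16))
fused = cross_attention(q, kv)
```

In a full NER model along these lines, the fused representation would feed a per-token classifier over entity labels; the scaled dot-product form used here is the standard attention formulation, chosen only to make the fusion step concrete.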
Pages: 19
Related Papers
50 records in total
  • [21] Lexicon Graph Adapter Based BERT Model for Chinese Named Entity Recognition
    Liu, Jie
    Liu, Peipei
    Ren, Yimo
    Wang, Jinfa
    Zhu, Hongsong
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT V, KSEM 2024, 2024, 14888 : 95 - 105
  • [22] Chinese Named Entity Recognition Based on BERT and Lightweight Feature Extraction Model
    Yang, Ruisen
    Gan, Yong
    Zhang, Chenfang
    INFORMATION, 2022, 13 (11)
  • [23] Geotechnical Named Entity Recognition Based on BERT-BiGRU-CRF Model
    Quanyu W.
    Li Z.
    Tu Z.
    Chen G.
    Hu J.
    Chen J.
    Chen J.
    Lv G.
    Diqiu Kexue - Zhongguo Dizhi Daxue Xuebao/Earth Science - Journal of China University of Geosciences, 2023, 48 (08): : 3137 - 3150
  • [24] A Named Entity Recognition Model for Manufacturing Process Based on the BERT Language Model Scheme
    Shrivastava, Manu
    Seri, Kota
    Wagatsuma, Hiroaki
    SOCIAL ROBOTICS, ICSR 2022, PT I, 2022, 13817 : 576 - 587
  • [25] Named Entity Recognition for Chinese Electronic Medical Record by Fusing Semantic and Boundary Information
    Cui S.
    Chen J.
    Li X.
    Dianzi Keji Daxue Xuebao/Journal of the University of Electronic Science and Technology of China, 2022, 51 (04): : 565 - 571
  • [26] A Hybrid Model for Named Entity Recognition on Chinese Electronic Medical Records
    Wang, Yu
    Sun, Yining
    Ma, Zuchang
    Gao, Lisheng
    Xu, Yang
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2021, 20 (02)
  • [27] A BiLSTM-CRF Method to Chinese Electronic Medical Record Named Entity Recognition
    Ji, Bin
    Liu, Rui
    Li, ShaSha
    Tang, JinTao
    Yu, Jie
    Li, Qian
    Xu, WeiSang
    2018 INTERNATIONAL CONFERENCE ON ALGORITHMS, COMPUTING AND ARTIFICIAL INTELLIGENCE (ACAI 2018), 2018,
  • [28] Named Entity Recognition in Chinese Electronic Medical Records Based on CRF
    Liu, Kaixin
    Hu, Qingcheng
    Liu, Jianwei
    Xing, Chunxiao
    2017 14TH WEB INFORMATION SYSTEMS AND APPLICATIONS CONFERENCE (WISA 2017), 2017, : 105 - 110
  • [29] Chinese Medical Named Entity Recognition based on Expert Knowledge and Fine-tuning Bert
    Zhang, Bofeng
    Yao, Xiuhong
    Li, Haiyan
    Aini, Mirensha
    2023 IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH, ICKG, 2023, : 84 - 90
  • [30] ABioNER: A BERT-Based Model for Arabic Biomedical Named-Entity Recognition
    Boudjellal, Nada
    Zhang, Huaping
    Khan, Asif
    Ahmad, Arshad
    Naseem, Rashid
    Shang, Jianyun
    Dai, Lin
    COMPLEXITY, 2021, 2021