Joint Extraction of Clinical Entities and Relations Using Multi-head Selection Method

Cited by: 1
Authors
Fang, Xintao [1 ]
Song, Yuting [2 ]
Maeda, Akira [2 ]
Affiliations
[1] Ritsumeikan Univ, Grad Sch Informat Sci & Engn, Kusatsu, Japan
[2] Ritsumeikan Univ, Coll Informat Sci & Engn, Kusatsu, Japan
Keywords
Clinical record; entity recognition; relation extraction; multi-head selection; pre-trained language model; RECOGNITION;
DOI
10.1109/IALP54817.2021.9675275
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
The extraction of entities and relations from unstructured clinical records has been attracting increasing attention. In addition to the existing traditional methods, deep learning methods have also been proposed for entity and relation extraction. However, previous work on clinical entity and relation extraction did not consider the multiple relations between clinical entities, which often exist in clinical texts. To deal with multiple relations, we propose using a multi-head selection method for clinical entity and relation extraction. As pre-trained language models have been shown to be effective for clinical entity and relation extraction, we integrate a pre-trained language model with a multi-head model to jointly extract clinical entities and relations. The experimental results show that the proposed model is effective for entity and relation extraction on both the i2b2/VA 2010 and n2c2 2018 challenge datasets and outperforms the top-ranking systems in the n2c2 2018 challenge. We also evaluate the impact of four existing pre-trained language models on clinical entity and relation extraction performance. The domain-specific pre-trained language model improves the performance of clinical entity and relation extraction. Between BERT and CharacterBERT, which uses a Character-CNN module instead of BERT's wordpiece system to represent entire words, we find that BERT outperforms CharacterBERT on joint extraction of clinical entities and relations.
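The multi-head selection idea in the abstract scores, for every token, each candidate head token together with each relation label, so one token can participate in several relations at once. The following minimal numpy sketch follows the standard multi-head selection scoring layer (sigmoid over per-pair, per-relation logits); it is not code from this paper, and all names, shapes, and parameter matrices (`U`, `W`, `V`, `b`) are illustrative assumptions:

```python
import numpy as np

def multi_head_selection_scores(H, U, W, V, b):
    """Score every (head token j, relation r) pair for each token i.

    H: (n, d) token encodings (e.g. from a pre-trained LM)
    U, W: (d, k) projections for head and dependent tokens
    V: (r, k) relation classifier weights; b: (k,) bias
    Returns sigmoid probabilities of shape (n, n, r).
    """
    # Broadcast to combine each token i with every candidate head j.
    g = np.tanh((H @ U)[None, :, :] + (H @ W)[:, None, :] + b)  # (n, n, k)
    logits = g @ V.T                                            # (n, n, r)
    # Independent sigmoids (not softmax) let a token have multiple
    # (head, relation) pairs, which is the point of multi-head selection.
    return 1.0 / (1.0 + np.exp(-logits))

# Toy demonstration with random parameters.
rng = np.random.default_rng(0)
n, d, k, r = 5, 8, 6, 3
P = multi_head_selection_scores(
    rng.normal(size=(n, d)),
    rng.normal(size=(d, k)), rng.normal(size=(d, k)),
    rng.normal(size=(r, k)), rng.normal(size=k),
)
print(P.shape)  # (5, 5, 3)
```

At prediction time, every pair with probability above a threshold is emitted, which is how multiple relations per entity are recovered.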
Pages: 99 - 104
Page count: 6
Related papers
50 items total
  • [21] CopyMTL: Copy Mechanism for Joint Extraction of Entities and Relations with Multi-Task Learning
    Zeng, Daojian
    Zhang, Haoran
    Liu, Qianying
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 9507 - 9514
  • [22] A Decoupling and Aggregating Framework for Joint Extraction of Entities and Relations
    Wang, Yao
    Liu, Xin
    Kong, Weikun
    Yu, Hai-Tao
    Racharak, Teeradaj
    Kim, Kyoung-Sook
    Nguyen, Le Minh
    IEEE ACCESS, 2024, 12 : 103313 - 103328
  • [23] Sequence to sequence learning for joint extraction of entities and relations
    Liang, Zeyu
    Du, Junping
    NEUROCOMPUTING, 2022, 501 : 480 - 488
  • [25] Investigating LSTMs for Joint Extraction of Opinion Entities and Relations
    Katiyar, Arzoo
    Cardie, Claire
    PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2016, : 919 - 929
  • [26] Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network
    Tao, Zhihua
    Ouyang, Chunping
    Liu, Yongbin
    Chung, Tonglee
    Cao, Yixin
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2023, 8 (02) : 468 - 477
  • [27] A new interest extraction method based on multi-head attention mechanism for CTR prediction
    Yang, Haifeng
    Yao, Linjing
    Cai, Jianghui
    Wang, Yupeng
    Zhao, Xujun
    KNOWLEDGE AND INFORMATION SYSTEMS, 2023, 65 (08) : 3337 - 3352
  • [28] Network Configuration Entity Extraction Method Based on Transformer with Multi-Head Attention Mechanism
    Yang, Yang
    Qu, Zhenying
    Yan, Zefan
    Gao, Zhipeng
    Wang, Ti
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 78 (01): : 735 - 757
  • [29] Joint Extraction of Entities and Overlapping Relations Using Position-Attentive Sequence Labeling
    Dai, Dai
    Xiao, Xinyan
    Lyu, Yajuan
    Dou, Shan
    She, Qiaoqiao
    Wang, Haifeng
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 6300 - 6308