Enhancing clinical concept extraction with contextual embeddings

Cited by: 145
Authors
Si, Yuqi [1 ]
Wang, Jingqi [1 ]
Xu, Hua [1 ]
Roberts, Kirk [1 ]
Affiliations
[1] Univ Texas Hlth Sci Ctr Houston, Sch Biomed Informat, 7000 Fannin St, Suite E730F, Houston, TX 77030 USA
Funding
US National Institutes of Health (NIH)
Keywords
clinical concept extraction; contextual embeddings; language model; information extraction; word embeddings
DOI
10.1093/jamia/ocz096
Chinese Library Classification (CLC) number
TP [automation technology; computer technology]
Subject classification code
0812
Abstract
Objective: Neural network-based representations ("embeddings") have dramatically advanced natural language processing (NLP) tasks, including clinical NLP tasks such as concept extraction. More recently, advanced embedding methods and representations (e.g., ELMo, BERT) have further pushed the state of the art in NLP, yet there are no common best practices for integrating these representations into clinical tasks. The purpose of this study is therefore to explore the space of possible options for utilizing these new models for clinical concept extraction, including comparing them to traditional word embedding methods (word2vec, GloVe, fastText).
Materials and Methods: Both off-the-shelf, open-domain embeddings and clinical embeddings pretrained on MIMIC-III (Medical Information Mart for Intensive Care III) are evaluated. We explore a battery of embedding methods, spanning traditional word embeddings and contextual embeddings, and compare them on 4 concept extraction corpora: i2b2 2010, i2b2 2012, SemEval 2014, and SemEval 2015. We also analyze the impact of the pretraining time of a large language model such as ELMo or BERT on extraction performance. Last, we present an intuitive way to understand the semantic information encoded by contextual embeddings.
Results: Contextual embeddings pretrained on a large clinical corpus achieve new state-of-the-art performance across all concept extraction tasks. The best-performing model outperforms all prior state-of-the-art methods, with respective F1 measures of 90.25, 93.18 (partial), 80.74, and 81.65.
Conclusions: We demonstrate the potential of contextual embeddings through the state-of-the-art performance these methods achieve on clinical concept extraction. Additionally, we demonstrate that contextual embeddings encode valuable semantic information not accounted for in traditional word representations.
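As a brief illustration of the pipeline the abstract describes, below is a minimal sketch in Python (not the authors' code) of how contextual token embeddings can be extracted from a pretrained BERT checkpoint and pooled back to word level as input features for a downstream concept-extraction (sequence-labeling) tagger. It assumes the HuggingFace transformers and PyTorch libraries; the generic bert-base-uncased checkpoint is a stand-in for the MIMIC-III-pretrained clinical model used in the study, and the mean-pooling step is illustrative rather than the authors' exact architecture.

# Minimal sketch (not the authors' code): extract contextual token embeddings
# from a pretrained BERT checkpoint for use as features in a clinical
# concept-extraction (sequence-labeling) model.
import torch
from transformers import AutoModel, AutoTokenizer

# Stand-in checkpoint; the paper pretrains its own model on MIMIC-III notes.
MODEL_NAME = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

# A pre-tokenized clinical sentence (one string per word).
sentence = ["The", "patient", "denies", "chest", "pain", "on", "exertion", "."]

# BERT uses WordPiece, so each word may be split into several subword pieces.
encoding = tokenizer(sentence, is_split_into_words=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**encoding)

# One vector per subword token from the final layer: (num_subwords, 768).
subword_vectors = outputs.last_hidden_state.squeeze(0)

# Mean-pool subword vectors back to one vector per original word, so they can
# be fed to a downstream tagger (e.g., a Bi-LSTM-CRF) for concept extraction.
word_ids = encoding.word_ids(batch_index=0)  # maps each piece to its word index
word_vectors = []
for idx in range(len(sentence)):
    piece_rows = [i for i, w in enumerate(word_ids) if w == idx]
    word_vectors.append(subword_vectors[piece_rows].mean(dim=0))
word_vectors = torch.stack(word_vectors)

print(word_vectors.shape)  # torch.Size([8, 768]) -- one 768-d vector per word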
Pages: 1297-1304
Page count: 8