Enhancing clinical concept extraction with contextual embeddings

Cited: 145
Authors
Si, Yuqi [1 ]
Wang, Jingqi [1 ]
Xu, Hua [1 ]
Roberts, Kirk [1 ]
Affiliations
[1] Univ Texas Hlth Sci Ctr Houston, Sch Biomed Informat, 7000 Fannin St,Suite E730F, Houston, TX 77030 USA
Funding
US National Institutes of Health;
Keywords
clinical concept extraction; contextual embeddings; language model; INFORMATION EXTRACTION; WORD EMBEDDINGS;
DOI
10.1093/jamia/ocz096
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Objective: Neural network-based representations ("embeddings") have dramatically advanced natural language processing (NLP) tasks, including clinical NLP tasks such as concept extraction. Recently, however, more advanced embedding methods and representations (eg, ELMo, BERT) have further pushed the state of the art in NLP, yet there are no common best practices for how to integrate these representations into clinical tasks. The purpose of this study is therefore to explore the space of possible options for utilizing these new models for clinical concept extraction, including comparing them to traditional word embedding methods (word2vec, GloVe, fastText).
Materials and Methods: Both off-the-shelf, open-domain embeddings and pretrained clinical embeddings from MIMIC-III (Medical Information Mart for Intensive Care III) are evaluated. We explore a battery of embedding methods consisting of traditional word embeddings and contextual embeddings and compare these on 4 concept extraction corpora: i2b2 2010, i2b2 2012, SemEval 2014, and SemEval 2015. We also analyze the impact of the pretraining time of a large language model like ELMo or BERT on extraction performance. Finally, we present an intuitive way to understand the semantic information encoded by contextual embeddings.
Results: Contextual embeddings pretrained on a large clinical corpus achieve new state-of-the-art performance across all concept extraction tasks. The best-performing model outperforms all prior state-of-the-art methods, with respective F1-measures of 90.25, 93.18 (partial), 80.74, and 81.65.
Conclusions: We demonstrate the potential of contextual embeddings through the state-of-the-art performance these methods achieve on clinical concept extraction. Additionally, we demonstrate that contextual embeddings encode valuable semantic information not captured by traditional word representations.
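
The minimal Python sketch below illustrates the core distinction the abstract draws: a contextual model assigns each token a vector that depends on its surrounding sentence, whereas word2vec/GloVe/fastText map a word to one fixed vector. This is not the authors' implementation; it assumes the HuggingFace transformers library, and the checkpoint name "emilyalsentzer/Bio_ClinicalBERT" (a BERT model pretrained on MIMIC-III notes) is used here only as a stand-in for a clinically pretrained model of the kind the paper evaluates.

    import torch
    from transformers import AutoModel, AutoTokenizer

    # Assumed checkpoint: a BERT model pretrained on MIMIC-III clinical notes.
    MODEL_NAME = "emilyalsentzer/Bio_ClinicalBERT"
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModel.from_pretrained(MODEL_NAME)
    model.eval()

    sentence = "The patient denies chest pain but reports shortness of breath."
    inputs = tokenizer(sentence, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)

    # Each WordPiece token receives a context-dependent 768-d vector
    # (BERT-base), which a downstream tagger (eg, a Bi-LSTM-CRF) can
    # consume for concept extraction framed as BIO sequence labeling.
    token_vectors = outputs.last_hidden_state.squeeze(0)  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    for tok, vec in zip(tokens, token_vectors):
        print(f"{tok:>12s}  first dims: {vec[:3].tolist()}")

Running the same word (eg, "discharge") through this model in two different sentences yields two different vectors, which is precisely the property that static embeddings lack and that the paper credits for the performance gains.
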
Pages: 1297-1304
Page count: 8