Extensive evaluation of transformer-based architectures for adverse drug events extraction

Citations: 3
Authors
Scaboro, Simone [1 ]
Portelli, Beatrice [1,2]
Chersoni, Emmanuele [3 ]
Santus, Enrico [4 ]
Serra, Giuseppe [1 ]
Affiliations
[1] Univ Udine, Dept Math Comp Sci & Phys, AILAB Udine, via Sci 206, I-33100 Udine, Friuli Venezia, Italy
[2] Univ Naples Federico II, Dept Biol, Corso Umberto 140, I-80138 Campania, Italy
[3] Hong Kong Polytech Univ, Dept Chinese & Bilingual Studies CBS, Hung Hom, Hong Kong, Peoples R China
[4] Bayer, Whippany, NJ USA
Keywords
Adverse drug events; Transformers; Side effects; Extraction; PHARMACOVIGILANCE; CLASSIFICATION; MENTIONS; CORPUS;
DOI
10.1016/j.knosys.2023.110675
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Adverse Drug Event (ADE) extraction is one of the core tasks in digital pharmacovigilance, especially when applied to informal texts. This task has been addressed by the Natural Language Processing community using large pre-trained language models, such as BERT. Despite the great number of Transformer-based architectures used in the literature, it is unclear which of them performs best and why. Therefore, in this paper we perform an extensive evaluation and analysis of 19 Transformer-based models for ADE extraction on informal texts. We compare the performance of all the considered models on two datasets with increasing levels of informality (forum posts and tweets). We also combine the purely Transformer-based models with two commonly used additional processing layers (CRF and LSTM), and analyze their effect on the models' performance. Furthermore, we use a well-established feature importance technique (SHAP) to correlate the performance of the models with a set of features that describe them: model category (AutoEncoding, AutoRegressive, Text-to-Text), pre-training domain, training from scratch, and model size in number of parameters. At the end of our analyses, we identify a list of take-home messages that can be derived from the experimental data. © 2023 Elsevier B.V. All rights reserved.
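As an illustration of the task framing the abstract describes, ADE extraction is commonly cast as token classification: character-level ADE span annotations are converted to BIO tags, which the Transformer (optionally with a CRF or LSTM head) then predicts per token. The following is a minimal sketch of that span-to-BIO conversion; the helper name and the example sentence are illustrative, not taken from the paper.

```python
# Hypothetical helper (not from the paper): convert character-level ADE
# span annotations into BIO tags, the label scheme a token-classification
# head (plain linear, CRF, or LSTM) would predict.

def spans_to_bio(tokens, spans):
    """tokens: list of (text, start, end) triples;
    spans: list of (start, end) annotated ADE character spans."""
    tags = []
    for _, tok_start, tok_end in tokens:
        tag = "O"  # default: token is outside any ADE mention
        for span_start, span_end in spans:
            if tok_start >= span_start and tok_end <= span_end:
                # first token of the span gets B-, the rest get I-
                tag = "B-ADE" if tok_start == span_start else "I-ADE"
                break
        tags.append(tag)
    return tags

text = "this drug gave me severe headaches"
tokens = [(w, text.index(w), text.index(w) + len(w)) for w in text.split()]
ade_spans = [(18, 34)]  # character span of "severe headaches"
print(spans_to_bio(tokens, ade_spans))
# → ['O', 'O', 'O', 'O', 'B-ADE', 'I-ADE']
```

The resulting tag sequence is what the compared architectures are trained to emit; a CRF layer on top additionally enforces valid transitions (e.g. no `I-ADE` directly after `O`).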
Pages: 17