A general approach for improving deep learning-based medical relation extraction using a pre-trained model and fine-tuning

Cited by: 30
Authors
Chen, Tao [1 ]
Wu, Mingfen [1 ]
Li, Hexi [1 ]
Affiliations
[1] Wuyi Univ, Dept Comp Sci & Engn, Fac Intelligent Mfg, 22 Dongcheng Village, Jiangmen City 529020, Guangdong, Peoples R China
Keywords
DOI
10.1093/database/baz116
Chinese Library Classification
Q [Biological Sciences];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
The automatic extraction of meaningful relations from biomedical literature or clinical records is crucial in various biomedical applications. Most current deep learning approaches for medical relation extraction require large-scale training data to prevent the model from overfitting. We propose using a pre-trained model and a fine-tuning technique to improve these approaches without additional time-consuming human labeling. First, we describe the architecture of Bidirectional Encoder Representations from Transformers (BERT), an approach for pre-training a model on large-scale unstructured text. We then combine BERT with a one-dimensional convolutional neural network (1d-CNN) to fine-tune the pre-trained model for relation extraction. Extensive experiments on three datasets, namely the BioCreative V chemical disease relation corpus, the traditional Chinese medicine literature corpus and the i2b2 2012 temporal relation challenge corpus, show that the proposed approach achieves state-of-the-art results (giving relative improvements of 22.2, 7.77, and 38.5% in F1 score, respectively, compared with a traditional 1d-CNN classifier). The source code is available at https://github.com/chentao1999/MedicalRelationExtraction.
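The abstract describes feeding contextual token representations from a pre-trained BERT encoder into a 1d-CNN for relation classification. Below is a minimal NumPy sketch of just the 1d-CNN head (convolution over a token window, ReLU, global max-pooling, then a softmax classifier); the random `embeddings` array stands in for BERT's output, and all function names, shapes, and parameters here are illustrative assumptions, not taken from the paper's released code.

```python
import numpy as np

def conv1d_max_pool(embeddings, filters, bias):
    """Apply a 1-D convolution over token embeddings, ReLU, then global max-pool.

    embeddings: (seq_len, emb_dim) contextual token vectors (e.g. BERT output)
    filters:    (n_filters, window, emb_dim) convolutional filters
    bias:       (n_filters,) per-filter bias
    Returns a (n_filters,) feature vector.
    """
    n_filters, window, _ = filters.shape
    seq_len = embeddings.shape[0]
    feats = np.full(n_filters, -np.inf)
    for start in range(seq_len - window + 1):
        patch = embeddings[start:start + window]            # (window, emb_dim)
        # Each filter dotted against the patch gives one activation per filter.
        scores = np.tensordot(filters, patch, axes=([1, 2], [0, 1])) + bias
        feats = np.maximum(feats, np.maximum(scores, 0.0))  # ReLU + max-pool
    return feats

def classify_relation(embeddings, filters, bias, W, b):
    """Pooled CNN features -> linear layer -> softmax over relation classes."""
    feats = conv1d_max_pool(embeddings, filters, bias)
    logits = feats @ W + b                                  # (n_classes,)
    exp = np.exp(logits - logits.max())                     # stable softmax
    return exp / exp.sum()

# Toy usage: 12 tokens, 8-dim embeddings, 4 filters of width 3, 5 relation classes.
rng = np.random.default_rng(0)
emb = rng.normal(size=(12, 8))
probs = classify_relation(emb,
                          rng.normal(size=(4, 3, 8)), rng.normal(size=4),
                          rng.normal(size=(4, 5)), rng.normal(size=5))
```

In the paper's setting the whole stack (BERT plus this head) is fine-tuned end-to-end; the sketch only illustrates the forward pass of the classification head.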
Pages: 15
Related Papers (50 in total)
  • [1] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
    Alt, Christoph
    Huebner, Marc
    Hennig, Leonhard
    [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1388 - 1398
  • [2] An efficient ptychography reconstruction strategy through fine-tuning of large pre-trained deep learning model
    Pan, Xinyu
    Wang, Shuo
    Zhou, Zhongzheng
    Zhou, Liang
    Liu, Peng
    Li, Chun
    Wang, Wenhui
    Zhang, Chenglong
    Dong, Yuhui
    Zhang, Yi
    [J]. ISCIENCE, 2023, 26 (12)
  • [3] Improving Pre-Trained Weights through Meta-Heuristics Fine-Tuning
    de Rosa, Gustavo H.
    Roder, Mateus
    Papa, Joao Paulo
    dos Santos, Claudio F. G.
    [J]. 2021 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2021), 2021,
  • [4] Sentiment Analysis Using Pre-Trained Language Model With No Fine-Tuning and Less Resource
    Kit, Yuheng
    Mokji, Musa Mohd
    [J]. IEEE ACCESS, 2022, 10 : 107056 - 107065
  • [5] Virtual Data Augmentation: A Robust and General Framework for Fine-tuning Pre-trained Models
    Zhou, Kun
    Zhao, Wayne Xin
    Wang, Sirui
    Zhang, Fuzheng
    Wu, Wei
    Wen, Ji-Rong
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 3875 - 3887
  • [6] Fine-Tuning Pre-Trained Model to Extract Undesired Behaviors from App Reviews
    Zhang, Wenyu
    Wang, Xiaojuan
    Lai, Shanyan
    Ye, Chunyang
    Zhou, Hui
    [J]. 2022 IEEE 22ND INTERNATIONAL CONFERENCE ON SOFTWARE QUALITY, RELIABILITY AND SECURITY, QRS, 2022, : 1125 - 1134
  • [7] Make Pre-trained Model Reversible: From Parameter to Memory Efficient Fine-Tuning
    Liao, Baohao
    Tan, Shaomu
    Monz, Christof
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [8] HyPe: Better Pre-trained Language Model Fine-tuning with Hidden Representation Perturbation
    Yuan, Hongyi
    Yuan, Zheng
    Tan, Chuanqi
    Huang, Fei
    Huang, Songfang
    [J]. PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 3246 - 3264
  • [9] Monkeypox Virus Detection Using Pre-trained Deep Learning-based Approaches
    Sitaula, Chiranjibi
    Shahi, Tej Bahadur
    [J]. JOURNAL OF MEDICAL SYSTEMS, 2022, 46 (11)