Pre-trained language models with domain knowledge for biomedical extractive summarization

Cited by: 30
Authors
Xie, Qianqian [1]
Bishop, Jennifer Amy [1]
Tiwari, Prayag [2]
Ananiadou, Sophia [1,3]
Affiliations
[1] National Centre for Text Mining, School of Computer Science, The University of Manchester, Manchester, United Kingdom
[2] Department of Computer Science, Aalto University, Espoo, Finland
[3] Alan Turing Institute, London, United Kingdom
DOI
10.1016/j.knosys.2022.109460
Indexed terms
Computational linguistics; Data mining; Text processing
Related papers
50 records in total
  • [1] Biomedical-domain pre-trained language model for extractive summarization
    Du, Yongping
    Li, Qingxiao
    Wang, Lulin
    He, Yanqing
    [J]. KNOWLEDGE-BASED SYSTEMS, 2020, 199
  • [2] Pre-trained Language Models in Biomedical Domain: A Systematic Survey
    Wang, Benyou
    Xie, Qianqian
    Pei, Jiahuan
    Chen, Zhihong
    Tiwari, Prayag
    Li, Zhao
    Fu, Jie
    [J]. ACM COMPUTING SURVEYS, 2024, 56 (03)
  • [3] Continual knowledge infusion into pre-trained biomedical language models
    Jha, Kishlay
    Zhang, Aidong
    [J]. BIOINFORMATICS, 2022, 38 (02) : 494 - 502
  • [4] Evaluating the Summarization Comprehension of Pre-Trained Language Models
    Chernyshev, D. I.
    Dobrov, B. V.
    [J]. LOBACHEVSKII JOURNAL OF MATHEMATICS, 2023, 44 (08) : 3028 - 3039
  • [5] Knowledge Enhanced Pre-trained Language Model for Product Summarization
    Yin, Wenbo
    Ren, Junxiang
    Wu, Yuejiao
    Song, Ruilin
    Liu, Lang
    Cheng, Zhen
    Wang, Sibo
    [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2022, PT II, 2022, 13552 : 263 - 273
  • [6] Knowledge Inheritance for Pre-trained Language Models
    Qin, Yujia
    Lin, Yankai
    Yi, Jing
    Zhang, Jiajie
    Han, Xu
    Zhang, Zhengyan
    Su, Yusheng
    Liu, Zhiyuan
    Li, Peng
    Sun, Maosong
    Zhou, Jie
    [J]. NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 3921 - 3937
  • [7] Modeling Content Importance for Summarization with Pre-trained Language Models
    Xiao, Liqiang
    Wang, Lu
    He, Hao
    Jin, Yaohui
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 3606 - 3611
  • [8] Parameter-Efficient Domain Knowledge Integration from Multiple Sources for Biomedical Pre-trained Language Models
    Lu, Qiuhao
    Dou, Dejing
    Nguyen, Thien Huu
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 3855 - 3865
  • [9] Probing Pre-Trained Language Models for Disease Knowledge
    Alghanmi, Israa
    Espinosa-Anke, Luis
    Schockaert, Steven
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 3023 - 3033