50 items in total
- [5] Evaluating the Summarization Comprehension of Pre-Trained Language Models [J]. Lobachevskii Journal of Mathematics, 2023, 44: 3028-3039
- [6] Knowledge Enhanced Pre-trained Language Model for Product Summarization [J]. Natural Language Processing and Chinese Computing, NLPCC 2022, Pt II, 2022, 13552: 263-273
- [7] Knowledge Inheritance for Pre-trained Language Models [J]. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 3921-3937
- [8] Modeling Content Importance for Summarization with Pre-trained Language Models [J]. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 3606-3611
- [9] Parameter-Efficient Domain Knowledge Integration from Multiple Sources for Biomedical Pre-trained Language Models [J]. Findings of the Association for Computational Linguistics, EMNLP 2021, 2021: 3855-3865
- [10] Probing Pre-Trained Language Models for Disease Knowledge [J]. Findings of the Association for Computational Linguistics, ACL-IJCNLP 2021, 2021: 3023-3033