- [2] Non-Autoregressive Text Generation with Pre-trained Language Models [J]. 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 234-243
- [3] Exploring Pre-trained Language Models for Vocabulary Alignment in the UMLS [J]. Artificial Intelligence in Medicine, Pt I, AIME 2024, 2024, 14844: 273-278
- [6] Leveraging Pre-Trained Language Model for Summary Generation on Short Text [J]. IEEE Access, 2020, 8: 228798-228803
- [7] Automatic Title Generation for Text with Pre-trained Transformer Language Model [J]. 2021 IEEE 15th International Conference on Semantic Computing (ICSC 2021), 2021: 17-24
- [8] Exploring Pre-trained Language Models for Event Extraction and Generation [J]. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 5284-5294
- [9] STYLEDGPT: Stylized Response Generation with Pre-trained Language Models [J]. Findings of the Association for Computational Linguistics, EMNLP 2020, 2020: 1548-1559
- [10] Controllable Generation from Pre-trained Language Models via Inverse Prompting [J]. KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021: 2450-2460