50 records in total
- [4] Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language [J]. SYSTEMS, 2024, 12 (01).
- [5] A survey of transformer-based multimodal pre-trained modals [J]. NEUROCOMPUTING, 2023, 515: 89-106.
- [6] Automatic Question Generation using RNN-based and Pre-trained Transformer-based Models in Low Resource Indonesian Language [J]. INFORMATICA-AN INTERNATIONAL JOURNAL OF COMPUTING AND INFORMATICS, 2022, 46 (07): 103-118.
- [7] Automatic Title Generation for Text with Pre-trained Transformer Language Model [J]. 2021 IEEE 15TH INTERNATIONAL CONFERENCE ON SEMANTIC COMPUTING (ICSC 2021), 2021: 17-24.
- [8] Non-Autoregressive Text Generation with Pre-trained Language Models [J]. 16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021: 234-243.
- [10] Controllable Generation from Pre-trained Language Models via Inverse Prompting [J]. KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021: 2450-2460.