50 entries in total
- [3] Fine-tuning and multilingual pre-training for abstractive summarization task for the Arabic language [J]. ANNALES MATHEMATICAE ET INFORMATICAE, 2023, 57: 24-35
- [4] FACTPEGASUS: Factuality-Aware Pre-training and Fine-tuning for Abstractive Summarization [J]. NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022: 1010-1028
- [6] Pre-training Fine-tuning Data Enhancement Method Based on Active Learning [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS, TRUSTCOM, 2022: 1447-1454
- [7] Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020: 1784-1795
- [8] Supervised pre-training for improved stability in deep reinforcement learning [J]. ICT EXPRESS, 2023, 9(1): 51-56