50 entries in total
- [1] FactGen: Faithful Text Generation by Factuality-aware Pre-training and Contrastive Ranking Fine-tuning. Journal of Artificial Intelligence Research, 2023, 76: 1281-1303
- [2] FACTPEGASUS: Factuality-Aware Pre-training and Fine-tuning for Abstractive Summarization. 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2022), 2022: 1010-1028
- [3] CREATER: CTR-driven Advertising Text Generation with Controlled Pre-Training and Contrastive Fine-Tuning. 2022 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2022), 2022: 9-17
- [4] Bridging the Gap between Pre-Training and Fine-Tuning for Commonsense Generation. 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2023), 2023: 376-383
- [5] Few-Shot Intent Detection via Contrastive Pre-Training and Fine-Tuning. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 1906-1912
- [6] CODE: Contrastive Pre-training with Adversarial Fine-Tuning for Zero-Shot Expert Linking. Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022: 11846-11854
- [8] AlignDet: Aligning Pre-training and Fine-tuning in Object Detection. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 6843-6853
- [9] Improved Fine-Tuning by Better Leveraging Pre-Training Data. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
- [10] Tri-Train: Automatic Pre-Fine Tuning between Pre-Training and Fine-Tuning for SciNER. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 4778-4787