50 entries in total
- [1] CONVFIT: Conversational Fine-Tuning of Pretrained Language Models. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), pp. 1151–1168.
- [2] Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), pp. 7870–7881.
- [3] Fine-Tuning Pretrained Language Models to Enhance Dialogue Summarization in Customer Service Centers. In Proceedings of the 4th ACM International Conference on AI in Finance (ICAIF 2023), pp. 365–373.
- [4] Equi-Tuning: Group Equivariant Fine-Tuning of Pretrained Models. In Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI 2023), Vol. 37, No. 6, pp. 6788–6796.
- [5] Prompting or Fine-tuning? A Comparative Study of Large Language Models for Taxonomy Construction. In 2023 ACM/IEEE International Conference on Model Driven Engineering Languages and Systems Companion (MODELS-C 2023), pp. 588–596.
- [6] An Empirical Evaluation of the Zero-Shot, Few-Shot, and Traditional Fine-Tuning Based Pretrained Language Models for Sentiment Analysis in Software Engineering. IEEE Access, 2024, 12: 109714–109734.
- [9] Two-Stage Fine-Tuning for Improved Bias and Variance for Large Pretrained Language Models. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Long Papers, Vol. 1, pp. 15746–15761.