50 records in total
- [1] Task-guided Disentangled Tuning for Pretrained Language Models. Findings of the Association for Computational Linguistics (ACL 2022), 2022: 3126-3137
- [2] ConvFiT: Conversational Fine-Tuning of Pretrained Language Models. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 1151-1168
- [3] DN at SemEval-2023 Task 12: Low-Resource Language Text Classification via Multilingual Pretrained Language Model Fine-tuning. 17th International Workshop on Semantic Evaluation (SemEval-2023), 2023: 1537-1541
- [5] Low-resource Taxonomy Enrichment with Pretrained Language Models. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 2747-2758
- [6] Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 7870-7881
- [7] Toward Low-Resource Languages Machine Translation: A Language-Specific Fine-Tuning With LoRA for Specialized Large Language Models. IEEE Access, 2025, 13: 46616-46626
- [8] Fine-Tuning Pretrained Language Models to Enhance Dialogue Summarization in Customer Service Centers. Proceedings of the 4th ACM International Conference on AI in Finance (ICAIF 2023), 2023: 365-373
- [9] Noise-Robust Fine-Tuning of Pretrained Language Models via External Guidance. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 12528-12540