50 items in total
- [41] Towards Fine-tuning Pre-trained Language Models with Integer Forward and Backward Propagation. 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2023), 2023: 1912-1921
- [42] Efficient Fine-Tuning for Low-Resource Tibetan Pre-trained Language Models. Artificial Neural Networks and Machine Learning (ICANN 2024), Part VII, 2024, 15022: 410-422
- [43] Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing. Applied Sciences-Basel, 2023, 13 (7)
- [44] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 1388-1398
- [45] Towards Anytime Fine-tuning: Continually Pre-trained Language Models with Hypernetwork Prompts. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 12081-12095
- [46] Adapt and Refine: A Few-Shot Class-Incremental Learner via Pre-Trained Models. Pattern Recognition and Computer Vision (PRCV 2024), Part 1, 2025, 15031: 431-444
- [47] Virtual Data Augmentation: A Robust and General Framework for Fine-tuning Pre-trained Models. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 3875-3887
- [48] A Simple Method to Improve the Performance of Small Pre-trained Language Models on Few-shot Tasks. Proceedings of the 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD 2024), 2024: 1572-1577
- [49] Disfluencies and Fine-Tuning Pre-trained Language Models for Detection of Alzheimer's Disease. INTERSPEECH 2020, 2020: 2162-2166
- [50] Towards Efficient Fine-Tuning of Pre-trained Code Models: An Experimental Study and Beyond. Proceedings of the 32nd ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA 2023), 2023: 39-51