- [41] Fine-Tuning Pre-Trained Language Models with Gaze Supervision. In: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Vol. 2: Short Papers), 2024, pp. 217-224.
- [42] MediSwift: Efficient Sparse Pre-trained Biomedical Language Models. In: Findings of the Association for Computational Linguistics: ACL 2024, 2024, pp. 214-230.
- [43] Structured Pruning for Efficient Generative Pre-trained Language Models. In: Findings of the Association for Computational Linguistics: ACL 2023, 2023, pp. 10880-10895.
- [45] The Impact of Training Methods on the Development of Pre-Trained Language Models. Computación y Sistemas, 2024, 28(1): 109-124.
- [46] Stealing Knowledge from Pre-trained Language Models for Federated Classifier Debiasing. In: Medical Image Computing and Computer Assisted Intervention (MICCAI 2024), Part X, vol. 15010, 2024, pp. 685-695.
- [47] Bi-tuning: Efficient Transfer from Pre-trained Models. In: Machine Learning and Knowledge Discovery in Databases: Research Track (ECML PKDD 2023), Part V, vol. 14173, 2023, pp. 357-373.
- [48] Meta Distant Transfer Learning for Pre-trained Language Models. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021, pp. 9742-9752.
- [49] Empowering Legal Citation Recommendation via Efficient Instruction-Tuning of Pre-trained Language Models. In: Advances in Information Retrieval (ECIR 2024), Part I, vol. 14608, 2024, pp. 310-324.