50 records in total
- [31] Towards Anytime Fine-tuning: Continually Pre-trained Language Models with Hypernetwork Prompts. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 12081-12095.
- [32] Investigating Prompt Learning for Chinese Few-Shot Text Classification with Pre-Trained Language Models. Applied Sciences-Basel, 2022, 12(21).
- [33] A Simple Method to Improve the Performance of Small Pre-trained Language Models on Few-shot Tasks. Proceedings of the 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD 2024), 2024: 1572-1577.
- [35] Pre-training Intent-Aware Encoders for Zero- and Few-Shot Intent Classification. 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023), 2023: 10433-10442.
- [36] Few-Sample Named Entity Recognition for Security Vulnerability Reports by Fine-Tuning Pre-trained Language Models. Deployable Machine Learning for Security Defense (MLHAT 2021), 2021, 1482: 55-78.
- [37] Fine-tuning Pre-trained Models for Robustness under Noisy Labels. Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI 2024), 2024: 3643-3651.
- [38] Exploiting Syntactic Information to Boost the Fine-tuning of Pre-trained Models. 2022 IEEE 46th Annual Computers, Software, and Applications Conference (COMPSAC 2022), 2022: 575-582.
- [39] Embedding Hallucination for Few-Shot Language Fine-tuning. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 5522-5530.
- [40] AlignDet: Aligning Pre-training and Fine-tuning in Object Detection. 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 2023: 6843-6853.