50 records in total
- [1] Matching Pairs: Attributing Fine-Tuned Models to their Pre-Trained Large Language Models. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 1, 2023: 7423-7442
- [2] Towards Understanding Large-Scale Discourse Structures in Pre-Trained and Fine-Tuned Language Models. Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL 2022), 2022: 2376-2394
- [3] Small Pre-trained Language Models Can be Fine-tuned as Large Models via Over-Parameterization. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 1, 2023: 3819-3834
- [4] Stereotype and Skew: Quantifying Gender Bias in Pre-trained and Fine-tuned Language Models. Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 2232-2242
- [5] Discourse Structure Extraction from Pre-Trained and Fine-Tuned Language Models in Dialogues. Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2023), 2023: 2562-2579
- [6] How Should Pre-Trained Language Models Be Fine-Tuned Towards Adversarial Robustness? Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, Vol. 34
- [8] Classification of Conversational Sentences Using an Ensemble Pre-Trained Language Model with the Fine-Tuned Parameter. CMC-Computers Materials & Continua, 2024, 78(02): 1669-1686
- [9] Lawformer: A pre-trained language model for Chinese legal long documents. AI Open, 2021, 2: 79-84
- [10] BERT for Sentiment Analysis: Pre-trained and Fine-Tuned Alternatives. Computational Processing of the Portuguese Language (PROPOR 2022), 2022, 13208: 209-218