- [32] Cross-Domain Authorship Attribution Using Pre-trained Language Models. Artificial Intelligence Applications and Innovations (AIAI 2020), Part I, 2020, 583: 255-266.
- [33] Continual Learning with Pre-Trained Models: A Survey. Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI 2024), 2024: 8363-8371.
- [34] A Survey on Recent Advances in Keyphrase Extraction from Pre-trained Language Models. 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2023), 2023: 2153-2164.
- [35] Enhancing Domain Modeling with Pre-trained Large Language Models: An Automated Assistant for Domain Modelers. Conceptual Modeling (ER 2024), 2025, 15238: 235-253.
- [37] A Study of Pre-trained Language Models in Natural Language Processing. 2020 IEEE International Conference on Smart Cloud (SmartCloud 2020), 2020: 116-121.
- [39] From Cloze to Comprehension: Retrofitting Pre-trained Masked Language Models to Pre-trained Machine Reader. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [40] Analyzing Individual Neurons in Pre-trained Language Models. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 4865-4880.