50 records in total
- [1] Using Noise and External Knowledge to Enhance Chinese Pre-trained Model. 2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI), 2022, pp. 476–480.
- [2] TinyMIM: An Empirical Study of Distilling MIM Pre-trained Models. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023, pp. 3687–3697.
- [3] Distilling Relation Embeddings from Pre-trained Language Models. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021, pp. 9044–9062.
- [4] NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021, pp. 3285–3295.
- [6] GenDistiller: Distilling Pre-trained Language Models Based on an Autoregressive Generative Model. Interspeech 2024, 2024, pp. 3325–3329.
- [7] Knowledge Rumination for Pre-trained Language Models. 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023), 2023, pp. 3387–3404.
- [8] Knowledge Inheritance for Pre-trained Language Models. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022, pp. 3921–3937.
- [9] Toxicity Prediction Using Pre-trained Autoencoder. Proceedings of the 2018 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), 2018, pp. 299–304.
- [10] CollRec: Pre-Trained Language Models and Knowledge Graphs Collaborate to Enhance Conversational Recommendation System. IEEE Access, 2024, 12: 104663–104675.