共 50 条
- [2] Progressive Network Grafting for Few-Shot Knowledge Distillation THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 2541 - 2549
- [3] Black-Box Few-Shot Knowledge Distillation COMPUTER VISION, ECCV 2022, PT XXI, 2022, 13681 : 196 - 211
- [4] Knowledge Distillation Meets Few-Shot Learning: An Approach for Few-Shot Intent Classification Within and Across Domains PROCEEDINGS OF THE 4TH WORKSHOP ON NLP FOR CONVERSATIONAL AI, 2022, : 108 - 119
- [5] EKD: Effective Knowledge Distillation for Few-Shot Sentiment Analysis ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT VII, 2024, 15022 : 164 - 176
- [6] Generalized Few-Shot Node Classification With Graph Knowledge Distillation IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024,
- [7] Few-Shot Learning with Semi-Supervised Transformers for Electronic Health Records MACHINE LEARNING FOR HEALTHCARE CONFERENCE, VOL 182, 2022, 182 : 853 - 873
- [10] Integrating Knowledge Distillation With Learning to Rank for Few-Shot Scene Classification IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60