共 50 条
- [41] Multimodal fusion and knowledge distillation for improved anomaly detection VISUAL COMPUTER, 2024,
- [42] Improved Knowledge Distillation for Crowd Counting on IoT Devices 2023 IEEE INTERNATIONAL CONFERENCE ON EDGE COMPUTING AND COMMUNICATIONS, EDGE, 2023, : 207 - 214
- [43] Personalized Decentralized Federated Learning with Knowledge Distillation ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
- [44] Heterogeneous Knowledge Distillation Using Conceptual Learning IEEE ACCESS, 2024, 12 : 52803 - 52814
- [45] KNOWLEDGE DISTILLATION FOR IMPROVED ACCURACY IN SPOKEN QUESTION ANSWERING 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7793 - 7797
- [46] Boosting Contrastive Learning with Relation Knowledge Distillation THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELVETH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 3508 - 3516
- [47] Multimodal Learning with Incomplete Modalities by Knowledge Distillation KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1828 - 1838