50 entries total
- [1] Poster: Self-Supervised Quantization-Aware Knowledge Distillation. 2023 IEEE/ACM Symposium on Edge Computing (SEC 2023), 2023, pp. 250-252.
- [2] SKILL: Similarity-Aware Knowledge Distillation for Speech Self-Supervised Learning. 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW 2024), 2024, pp. 675-679.
- [3] Hierarchical Self-supervised Augmented Knowledge Distillation. Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021, pp. 1217-1223.
- [6] Self-supervised Knowledge Distillation Using Singular Value Decomposition. Computer Vision - ECCV 2018, Pt. VI, 2018, vol. 11210, pp. 339-354.
- [7] Distill on the Go: Online Knowledge Distillation in Self-Supervised Learning. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2021), 2021, pp. 2672-2681.
- [8] SSSD: Self-Supervised Self Distillation. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023, pp. 2769-2776.
- [9] Knowledge-Aware Self-supervised Educational Resources Recommendation. Web Information Systems and Applications (WISA 2024), 2024, vol. 14883, pp. 524-535.