50 entries total
- [1] SKILL: Similarity-Aware Knowledge Distillation for Speech Self-Supervised Learning. 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW 2024), 2024, pp. 675-679
- [4] Distill on the Go: Online Knowledge Distillation in Self-Supervised Learning. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2021), 2021, pp. 2672-2681
- [5] On-Device Constrained Self-Supervised Speech Representation Learning for Keyword Spotting via Knowledge Distillation. Interspeech 2023, 2023, pp. 1623-1627
- [6] Self-Supervised Contrastive Learning for Camera-to-Radar Knowledge Distillation. 2024 20th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT 2024), 2024, pp. 154-161
- [8] Hierarchical Self-Supervised Augmented Knowledge Distillation. Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021, pp. 1217-1223
- [9] DinoSR: Self-Distillation and Online Clustering for Self-Supervised Speech Representation Learning. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
- [10] COMEDIAN: Self-Supervised Learning and Knowledge Distillation for Action Spotting Using Transformers. 2024 IEEE Winter Conference on Applications of Computer Vision Workshops (WACVW 2024), 2024, pp. 518-528