50 entries in total
- [21] More from Less: Self-supervised Knowledge Distillation for Routine Histopathology Data. Machine Learning in Medical Imaging (MLMI 2023), Part I, 2024, 14348: 454-463
- [22] FitHuBERT: Going Thinner and Deeper for Knowledge Distillation of Speech Self-Supervised Learning. Interspeech 2022, 2022: 3588-3592
- [23] COMEDIAN: Self-Supervised Learning and Knowledge Distillation for Action Spotting using Transformers. IEEE Winter Conference on Applications of Computer Vision Workshops (WACVW 2024), 2024: 518-528
- [24] Overcoming Oscillations in Quantization-Aware Training. International Conference on Machine Learning, Vol 162, 2022
- [26] Initiative-Aware Self-Supervised Learning for Knowledge-Grounded Conversations. SIGIR '21: Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2021: 522-532
- [27] Emotion-Aware Speech Self-Supervised Representation Learning with Intensity Knowledge. Interspeech 2024, 2024: 3180-3184
- [29] Improving Self-supervised Lightweight Model Learning via Hard-Aware Metric Distillation. Computer Vision, ECCV 2022, Part XXXI, 2022, 13691: 295-311
- [30] MAL: Motion-Aware Loss with Temporal and Distillation Hints for Self-Supervised Depth Estimation. IEEE International Conference on Robotics and Automation (ICRA 2024), 2024: 7318-7324