50 records in total
- [3] Cross-Modal Knowledge Distillation with Dropout-Based Confidence. Proceedings of the 2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 2022: 653-657.
- [5] Cons-KD: Dropout-Robust Knowledge Distillation for CTC-Based Automatic Speech Recognition. IEEE Access, 2024, 12: 131136-131146.