50 records in total
- [1] Test-Time Adaptation Toward Personalized Speech Enhancement: Zero-Shot Learning with Knowledge Distillation. 2021 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA), 2021: 176-180
- [2] Zero-Shot Knowledge Distillation in Deep Networks. International Conference on Machine Learning, Vol. 97, 2019
- [4] Improving Zero-Shot Generalization of Learned Prompts via Unsupervised Knowledge Distillation. Computer Vision - ECCV 2024, Pt. LXXXIV, 2025, 15142: 459-477
- [5] Triple-0: Zero-Shot Denoising and Dereverberation on an End-to-End Frozen Anechoic Speech Separation Network. PLOS ONE, 2024, 19(07)
- [6] Robust Test-Time Adaptation for Zero-Shot Prompt Tuning. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 15, 2024: 16714-16722
- [7] Zero-Shot Cross-Lingual Knowledge Transfer in VQA via Multimodal Distillation. IEEE Transactions on Computational Social Systems, 2024: 1-11
- [9] Zero-Shot Visual Sentiment Prediction via Cross-Domain Knowledge Distillation. IEEE Open Journal of Signal Processing, 2024, 5: 177-185
- [10] Towards Zero-Shot Knowledge Distillation for Natural Language Processing. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 6551-6561