50 records in total
- [21] Knowledge Distillation for Neural Transducers from Large Self-Supervised Pre-trained Models [J]. 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 8527-8531
- [22] Self-supervised Learning Based on a Pre-trained Method for the Subtype Classification of Spinal Tumors [J]. Computational Mathematics Modeling in Cancer Analysis, CMMCA 2022, 2022, 13574: 58-67
- [23] Mitigating Backdoor Attacks in Pre-Trained Encoders via Self-Supervised Knowledge Distillation [J]. IEEE Transactions on Services Computing, 2024, 17(5): 2613-2625
- [24] End-to-End Spoken Language Understanding Using Transformer Networks and Self-Supervised Pre-trained Features [J]. 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2021: 7483-7487
- [27] Boosting Self-Supervised Embeddings for Speech Enhancement [J]. Interspeech 2022, 2022: 186-190