50 entries in total
- [41] Better and Faster: Knowledge Transfer from Multiple Self-supervised Learning Tasks via Graph Distillation for Video Classification. In *Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI)*, 2018, pp. 1135–1141.
- [42] Knowledge Graph Self-Supervised Rationalization for Recommendation. In *Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD)*, 2023, pp. 3046–3056.
- [44] A Self-supervised Vision Transformer to Predict Survival from Histopathology in Renal Cell Carcinoma. *World Journal of Urology*, 2023, vol. 41, pp. 2233–2241.
- [46] A Comprehensive Study on Self-supervised Distillation for Speaker Representation Learning. In *2022 IEEE Spoken Language Technology Workshop (SLT)*, 2022, pp. 599–604.
- [47] Self-supervised Character-to-Character Distillation for Text Recognition. In *2023 IEEE/CVF International Conference on Computer Vision (ICCV)*, 2023, pp. 19416–19427.
- [48] Self-supervised Image Hash Retrieval Based on Adversarial Distillation. In *2022 Asia Conference on Algorithms, Computing and Machine Learning (CACML)*, 2022, pp. 732–737.
- [49] DPHuBERT: Joint Distillation and Pruning of Self-Supervised Speech Models. In *Interspeech 2023*, 2023, pp. 62–66.
- [50] Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification. *IEEE Transactions on Geoscience and Remote Sensing*, 2022, vol. 60.