共 50 条
- [11] Knowledge Distillation for Semi-supervised Domain Adaptation OR 2.0 CONTEXT-AWARE OPERATING THEATERS AND MACHINE LEARNING IN CLINICAL NEUROIMAGING, 2019, 11796 : 68 - 76
- [12] MULTI-VIEW CONTRASTIVE LEARNING FOR ONLINE KNOWLEDGE DISTILLATION 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3750 - 3754
- [14] CLDA: Contrastive Learning for Semi-Supervised Domain Adaptation ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
- [15] Supervised Contrastive Learning ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
- [16] SSCL: Semi-supervised Contrastive Learning for Industrial Anomaly Detection PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT IV, 2024, 14428 : 100 - 112
- [17] Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 13492 - 13503
- [18] Balanced Knowledge Distillation with Contrastive Learning for Document Re-ranking PROCEEDINGS OF THE 2023 ACM SIGIR INTERNATIONAL CONFERENCE ON THE THEORY OF INFORMATION RETRIEVAL, ICTIR 2023, 2023, : 247 - 255
- [19] Incorporating Domain Knowledge Graph into Multimodal Movie Genre Classification with Self-Supervised Attention and Contrastive Learning PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 3337 - 3345
- [20] Improving Structural and Semantic Global Knowledge in Graph Contrastive Learning with Distillation ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PT II, PAKDD 2024, 2024, 14646 : 364 - 375