50 entries in total
- [1] Knowledge Distillation Based Compact Model Learning Method for Object Detection. 11th International Conference on ICT Convergence: Data, Network, and AI in the Age of Untact (ICTC 2020), 2020, pp. 1276-1278
- [2] Variational Student: Learning Compact and Sparser Networks in Knowledge Distillation Framework. 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2020, pp. 3247-3251
- [3] Adversarial Knowledge Distillation for a Compact Generator. 2020 25th International Conference on Pattern Recognition (ICPR), 2021, pp. 10636-10643
- [4] Combining Weight Pruning and Knowledge Distillation for CNN Compression. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2021), 2021, pp. 3185-3192
- [7] Knowledge Distillation Vulnerability of DeiT Through CNN Adversarial Attack. Neural Computing & Applications, 2023, 37(12): 7721-7731
- [8] Compact Models for Periocular Verification Through Knowledge Distillation. 2020 International Conference of the Biometrics Special Interest Group (BIOSIG), 2020, P-306