50 entries in total
- [41] Decoupled Multi-teacher Knowledge Distillation based on Entropy [J]. 2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024
- [42] Hybrid Learning with Teacher-student Knowledge Distillation for Recommenders [J]. 20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2020), 2020, : 227 - 235
- [43] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation [J]. PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT VIII, 2024, 14432 : 28 - 41
- [44] Knowledge Distillation via Route Constrained Optimization [J]. 2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 1345 - 1354
- [45] A Virtual Knowledge Distillation via Conditional GAN [J]. IEEE ACCESS, 2022, 10 : 34766 - 34778
- [48] Private Model Compression via Knowledge Distillation [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1190 - +