50 records in total
- [41] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation [J]. Pattern Recognition and Computer Vision, PRCV 2023, Pt VIII, 2024, 14432: 28-41
- [42] Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation [J]. Physics in Medicine and Biology, 2024, 69(18)
- [43] Let All Be Whitened: Multi-Teacher Distillation for Efficient Visual Retrieval [J]. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol 38 No 5, 2024: 4126-4135
- [47] Affective image recognition with multi-attribute knowledge in deep neural networks [J]. Multimedia Tools and Applications, 2024, 83: 18353-18379
- [50] Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation [J]. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR, 2023: 7886-7895