共 50 条
- [23] Mitigating carbon footprint for knowledge distillation based deep learning model compression PLOS ONE, 2023, 18 (05):
- [24] Knowledge Reverse Distillation Based Confidence Calibration for Deep Neural Networks Neural Processing Letters, 2023, 55 : 345 - 360
- [25] Feature Distribution-based Knowledge Distillation for Deep Neural Networks 2022 19TH INTERNATIONAL SOC DESIGN CONFERENCE (ISOCC), 2022, : 75 - 76
- [28] Few Sample Knowledge Distillation for Efficient Network Compression 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2020), 2020, : 14627 - 14635
- [29] Scalable Spintronics-based Bayesian Neural Network for Uncertainty Estimation 2023 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2023,