50 items in total
- [1] Distilling Spikes: Knowledge Distillation in Spiking Neural Networks [J]. 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 4536-4543
- [3] DSNNs: learning transfer from deep neural networks to spiking neural networks [J]. High Technology Letters, 2020, 26(02): 136-144
- [5] Knowledge Distillation for Optimization of Quantized Deep Neural Networks [J]. 2020 IEEE Workshop on Signal Processing Systems (SiPS), 2020: 111-116
- [6] Improving the Interpretability of Deep Neural Networks with Knowledge Distillation [J]. 2018 18th IEEE International Conference on Data Mining Workshops (ICDMW), 2018: 905-912