- [41] Structured Attention Knowledge Distillation for Lightweight Networks. Proceedings of the 33rd Chinese Control and Decision Conference (CCDC 2021), 2021: 1726-1730.
- [47] Knowledge Distillation Approach for Efficient Internal Language Model Estimation. INTERSPEECH 2023, 2023: 1339-1343.
- [48] Plant disease detection based on lightweight CNN model. 2021 4th International Conference on Information and Computer Technologies (ICICT 2021), 2021: 64-68.
- [49] Poster Abstract: Efficient Knowledge Distillation to Train Lightweight Neural Network for Heterogeneous Edge Devices. Proceedings of the 21st ACM Conference on Embedded Networked Sensor Systems (SenSys 2023), 2023: 546-547.
- [50] CerviSegNet-DistillPlus: An Efficient Knowledge Distillation Model for Enhancing Early Detection of Cervical Cancer Pathology. IEEE Access, 2024, 12: 85134-85149.