50 records in total
- [1] Knowledge Distillation for Energy Consumption Prediction in Additive Manufacturing [J]. IFAC-PapersOnLine, 2022, 55(2): 390-395
- [2] Private Model Compression via Knowledge Distillation [J]. Thirty-Third AAAI Conference on Artificial Intelligence / Thirty-First Innovative Applications of Artificial Intelligence Conference / Ninth AAAI Symposium on Educational Advances in Artificial Intelligence, 2019: 1190-+
- [3] Compression of Acoustic Model via Knowledge Distillation and Pruning [J]. 2018 24th International Conference on Pattern Recognition (ICPR), 2018: 2785-2790
- [6] PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation [J]. Interspeech 2021, 2021: 4568-4572
- [7] Optimizing energy consumption in directed energy deposition-based hybrid additive manufacturing: an integrated modelling and experimental approach [J]. The International Journal of Advanced Manufacturing Technology, 2024, 130(9-10): 4835-4844
- [9] Knowledge Distillation Beyond Model Compression [J]. 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 6136-6143
- [10] Methodology and model for predicting energy consumption in manufacturing at multiple scales [J]. 15th Global Conference on Sustainable Manufacturing, 2018, 21: 694-701