A hybrid model compression approach via knowledge distillation for predicting energy consumption in additive manufacturing

Cited by: 2
|
Authors
Li, Yixin [1 ]
Hu, Fu [1 ]
Liu, Ying [1 ]
Ryan, Michael [1 ]
Wang, Ray [2 ]
Affiliations
[1] Cardiff Univ, Inst Mech & Mfg Engn, Sch Engn, Cardiff, Wales
[2] Unicmicro Co Ltd, Guangzhou, Peoples R China
Keywords
Additive manufacturing; deep learning; neural network compression; knowledge distillation; energy consumption prediction; INDUSTRY 4.0; ENSEMBLE; OPTIMIZATION; SUSTAINABILITY; DEMAND;
D O I
10.1080/00207543.2022.2160501
CLC number
T [Industrial Technology];
Subject classification code
08 ;
Abstract
Recently, the high energy consumption of additive manufacturing (AM) has received increased attention. Knowledge distillation (KD) reduces predictive model complexity and computational load by extracting hidden information or highly representative features from energy-relevant data. However, because the distillation process relies on largely predetermined and fixed models, knowledge transfer between teacher and student is restricted. To reduce computational costs while maintaining acceptable performance, a teacher assistant (TA) was introduced into the teacher-student architecture. Firstly, three baseline models were combined into a teacher ensemble to enhance accuracy. Secondly, a teacher assistant (TA) was formed to bridge the capacity gap between the ensemble and the simplified student model, thereby reducing the student model's complexity. Using geometry-based features derived from layer-wise image data, a KD-based predictive model was developed, and its feasibility and effectiveness were evaluated against two independently trained student models. Compared with the independently trained student models, the proposed method achieved the lowest RMSE, MAE, and training time.
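The two-stage teacher → teacher-assistant → student scheme described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes simple linear models of decreasing capacity, a synthetic regression target standing in for energy-consumption data, and a blended distillation loss (weighted MSE against ground truth plus MSE against the larger model's predictions); the function `fit_distilled` and the weighting `alpha` are hypothetical choices for illustration.

```python
import numpy as np

def fit_distilled(X, y, teacher_pred, alpha=0.5, lr=0.1, epochs=2000):
    """Fit a linear model by gradient descent on a blended distillation loss:
    alpha * MSE(pred, ground truth) + (1 - alpha) * MSE(pred, teacher_pred)."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    n = len(y)
    for _ in range(epochs):
        pred = X @ w + b
        # Gradient of the blended squared-error objective w.r.t. predictions.
        err = alpha * (pred - y) + (1 - alpha) * (pred - teacher_pred)
        w -= lr * (2 / n) * (X.T @ err)
        b -= lr * (2 / n) * err.sum()
    return w, b

# Synthetic stand-in for energy data: target depends nonlinearly on 2 features.
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(200, 2))
y = 3 * X[:, 0] + 2 * X[:, 1] ** 2 + rng.normal(scale=0.05, size=200)

# "Teacher": a higher-capacity model (quadratic feature expansion, least squares).
X_quad = np.hstack([X, X**2, X[:, :1] * X[:, 1:]])
A = np.hstack([X_quad, np.ones((200, 1))])
w_t, *_ = np.linalg.lstsq(A, y, rcond=None)
teacher = A @ w_t

# Stage 1: distil the teacher into a mid-capacity teacher assistant
# (fewer features than the teacher, more than the student).
X_ta = np.hstack([X, X**2])
w_a, b_a = fit_distilled(X_ta, y, teacher, alpha=0.5)
assistant = X_ta @ w_a + b_a

# Stage 2: distil the assistant into the compact linear student.
w_s, b_s = fit_distilled(X, y, assistant, alpha=0.5)
student = X @ w_s + b_s
rmse = float(np.sqrt(np.mean((student - y) ** 2)))
```

The intermediate stage mirrors the TA's role in the paper: the student never has to match the full-capacity teacher directly, only the closer-capacity assistant, which is what bridges the capacity gap.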
Pages: 4525 - 4547
Page count: 23
Related Papers
50 records
  • [1] Knowledge Distillation for Energy Consumption Prediction in Additive Manufacturing
    Li, Yixin
    Hu, Fu
    Ryan, Michael
    Wang, Ray
    Liu, Ying
    [J]. IFAC PAPERSONLINE, 2022, 55 (02): : 390 - 395
  • [2] Private Model Compression via Knowledge Distillation
    Wang, Ji
    Bao, Weidong
    Sun, Lichao
    Zhu, Xiaomin
    Cao, Bokai
    Yu, Philip S.
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1190 - +
  • [3] Compression of Acoustic Model via Knowledge Distillation and Pruning
    Li, Chenxing
    Zhu, Lei
    Xu, Shuang
    Gao, Peng
    Xu, Bo
    [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 2785 - 2790
  • [4] A physics-informed knowledge distillation model with spatial-temporal attention for energy consumption pre-assessment in sustainable additive manufacturing
    Wang, Kang
    Fang, Naiyu
    Huang, Zhihao
    Xu, Jinghua
    Zhang, Shuyou
    Qin, Jing
    [J]. SUSTAINABLE ENERGY TECHNOLOGIES AND ASSESSMENTS, 2023, 60
  • [5] Model Compression Algorithm via Reinforcement Learning and Knowledge Distillation
    Liu, Botao
    Hu, Bing-Bing
    Zhao, Ming
    Peng, Sheng-Lung
    Chang, Jou-Ming
    Tsoulos, Ioannis G.
    [J]. MATHEMATICS, 2023, 11 (22)
  • [6] PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation
    Kim, Jangho
    Chang, Simyung
    Kwak, Nojun
    [J]. INTERSPEECH 2021, 2021, : 4568 - 4572
  • [7] Optimizing energy consumption in directed energy deposition-based hybrid additive manufacturing: an integrated modelling and experimental approach
    Md Rabiul Hasan
    Zhichao Liu
    Asif Rahman
    [J]. The International Journal of Advanced Manufacturing Technology, 2024, 130 : 4835 - 4844
  • [8] Optimizing energy consumption in directed energy deposition-based hybrid additive manufacturing: an integrated modelling and experimental approach
    Hasan, Md Rabiul
    Liu, Zhichao
    Rahman, Asif
    [J]. INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2024, 130 (9-10): : 4835 - 4844
  • [9] Knowledge Distillation Beyond Model Compression
    Sarfraz, Fahad
    Arani, Elahe
    Zonooz, Bahram
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 6136 - 6143
  • [10] Methodology and model for predicting energy consumption in manufacturing at multiple scales
    Reimann, Jan
    Wenzel, Ken
    Friedemann, Marko
    Putz, Matthias
    [J]. 15TH GLOBAL CONFERENCE ON SUSTAINABLE MANUFACTURING, 2018, 21 : 694 - 701