Knowledge Distillation for Energy Consumption Prediction in Additive Manufacturing

Cited by: 2
Authors
Li, Yixin [1 ]
Hu, Fu [1 ]
Ryan, Michael [1 ]
Wang, Ray [2 ]
Liu, Ying [1 ]
Affiliations
[1] Cardiff Univ, Sch Engn, Dept Mech Engn, Cardiff CF24 3AA, Wales
[2] Unicmicro Guangzhou Co Ltd, Guangzhou, Guangdong, Peoples R China
Source
IFAC PAPERSONLINE | 2022, Vol. 55, No. 02
Keywords
Additive manufacturing; Knowledge distillation; Energy consumption; Machine learning;
DOI
10.1016/j.ifacol.2022.04.225
CLC Classification Number
TP [Automation Technology; Computer Technology];
Discipline Classification Code
0812;
Abstract
Owing to advances in data sensing and collection technologies, more production data from additive manufacturing (AM) systems is available, and advanced data analytics techniques are increasingly employed to improve energy management. Current supervised learning-based analytical methods, however, typically require extracting and learning valuable information from a significant amount of data during training, making it difficult to trade off latency against the computing resources needed to deploy the analytical models. As such, this paper develops a method utilizing the knowledge distillation (KD) technique to predict AM energy consumption from product geometry information, reducing the computational burden while retaining model performance. Through a teacher-student architecture, layer-by-layer images of products and energy consumption datasets are used to train a teacher model, from which knowledge is extracted and used to build a student model that predicts energy consumption. A case study using real-world data from a selective laser sintering (SLS) system demonstrates the feasibility and effectiveness of the proposed approach. Distilled and independently trained student models were compared in terms of root mean square error (RMSE) and training time. The distilled student model performed better (RMSE of 14.3947 kWh/kg) and required a shorter training time (34 s) than the complex teacher model.
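The teacher–student distillation scheme described in the abstract can be sketched in a few lines. The example below is purely illustrative and is not the paper's implementation: simple numpy linear models stand in for the paper's image-based networks, the synthetic features stand in for product geometry information, and the student is trained on a mixed loss combining the ground-truth energy targets with the teacher's predictions (the regression analogue of soft targets), with a hypothetical mixing weight `alpha`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in data: 5 geometry-derived features, target = energy use.
X = rng.normal(size=(200, 5))
true_w = np.array([3.0, -2.0, 1.5, 0.5, -1.0])
y = X @ true_w + rng.normal(scale=0.5, size=200)

# "Teacher": here just a ridge-regression closed-form fit standing in for
# the larger model trained on layer-by-layer images in the paper.
teacher_w = np.linalg.solve(X.T @ X + 1e-3 * np.eye(5), X.T @ y)
teacher_pred = X @ teacher_w

# "Student": gradient descent on a distillation loss that mixes the
# ground-truth targets with the teacher's predictions (soft targets).
alpha = 0.5  # hypothetical weight: ground truth vs. teacher predictions
w = np.zeros(5)
lr = 0.01
for _ in range(1000):
    pred = X @ w
    # Gradient of 0.5 * mean(alpha*(pred-y)^2 + (1-alpha)*(pred-teacher_pred)^2)
    grad = (alpha * (pred - y) + (1 - alpha) * (pred - teacher_pred)) @ X / len(y)
    w -= lr * grad

rmse = np.sqrt(np.mean((X @ w - y) ** 2))
```

In this toy setting the student recovers nearly the teacher's accuracy at a fraction of the training cost, which mirrors the trade-off the paper reports for the SLS case study.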
Pages: 390-395
Number of pages: 6
Related Papers
50 in total
  • [41] FROM ELICITATION TO STRUCTURING OF ADDITIVE MANUFACTURING KNOWLEDGE
    Grandvallet, Christelle
    Pourroy, Franck
    Prudhomme, Guy
    Vignat, Frederic
    [J]. DS87-6: PROCEEDINGS OF THE 21ST INTERNATIONAL CONFERENCE ON ENGINEERING DESIGN (ICED 17) VOL 6: DESIGN INFORMATION AND KNOWLEDGE, 2017, : 141 - 150
  • [42] A categorical framework for formalising knowledge in additive manufacturing
    Qi, Qunfen
    Pagani, Luca
    Scott, Paul J.
    Jiang, Xiangqian
    [J]. 15TH CIRP CONFERENCE ON COMPUTER AIDED TOLERANCING, CIRP CAT 2018, 2018, 75 : 87 - 91
  • [43] Knowledge Management: An Overview of Roadmaps for Additive Manufacturing
    Fernandes, V.
    Matos, F.
    Godina, R.
    [J]. QUALITY INNOVATION AND SUSTAINABILITY, ICQIS 2022, 2023, : 63 - 75
  • [45] A state-of-the-art review on energy consumption and quality characteristics in metal additive manufacturing processes
    Majeed, Arfan
    Ahmed, Altaf
    Lv, Jingxiang
    Peng, Tao
    Muzamil, Muhammad
    [J]. JOURNAL OF THE BRAZILIAN SOCIETY OF MECHANICAL SCIENCES AND ENGINEERING, 2020, 42 (05)
  • [46] ENERGY CONSUMPTION AND CARBON EMISSIONS OF ADDITIVE MANUFACTURING USING SMART MATERIALS: A SUPPLY CHAIN PERSPECTIVE
    Han, Muyue
    Zhao, Jing
    Li, Lin
    [J]. PROCEEDINGS OF ASME 2023 18TH INTERNATIONAL MANUFACTURING SCIENCE AND ENGINEERING CONFERENCE, MSEC2023, VOL 1, 2023,
  • [47] Knowledge Distillation with a Precise Teacher and Prediction with Abstention
    Xu, Yi
    Pu, Jian
    Zhao, Hui
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 9000 - 9006
  • [48] Ensembled CTR Prediction via Knowledge Distillation
    Zhu, Jieming
    Liu, Jinyang
    Li, Weiqi
    Lai, Jincai
    He, Xiuqiang
    Chen, Liang
    Zheng, Zibin
    [J]. CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 2941 - 2948
  • [49] Lightweight Spectrum Prediction Based on Knowledge Distillation
    Cheng, Runmeng
    Zhang, Jianzhao
    Deng, Junquan
    Zhu, Yanping
    [J]. RADIOENGINEERING, 2023, 32 (04) : 469 - 478
  • [50] Communication Traffic Prediction with Continual Knowledge Distillation
    Li, Hang
    Wang, Ju
    Hu, Chengming
    Chen, Xi
    Liu, Xue
    Jang, Seowoo
    Dudek, Gregory
    [J]. IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 5481 - 5486