A new hyper-parameter optimization method for machine learning in fault classification

Cited by: 0
|
Authors
Xingchen Ye
Liang Gao
Xinyu Li
Long Wen
Affiliations
[1] Huazhong University of Science and Technology,The State Key Laboratory of Digital Manufacturing Equipment & Technology
[2] China University of Geosciences,School of Mechanical Engineering and Electronic Information
Source
Applied Intelligence | 2023, Vol. 53
Keywords
Hyper-parameter optimization; Fault classification; Dimension reduction; Partial dependencies
DOI
Not available
Abstract
Accurate bearing fault classification is essential for the safe and stable operation of rotating machinery. The success of Machine Learning (ML) in fault classification depends mainly on efficient features and well-chosen pre-defined hyper-parameters. Various hyper-parameter optimization (HPO) methods have been proposed to tune ML algorithms' hyper-parameters in low-dimensional spaces, but they ignore the hyper-parameters of Feature Engineering (FE). Because both FE and the ML algorithm contain many hyper-parameters, the overall hyper-parameter dimension is high. This paper proposes a new HPO method for high-dimensional spaces based on dimension reduction and partial dependencies. First, the whole hyper-parameter space is separated into two subspaces, one for FE and one for the ML algorithm, to reduce time consumption. Second, because the relationship between hyper-parameters and performance is nonlinear, partial dependencies are used to recognize the sensitive intervals of the hyper-parameters. HPO is then conducted within these intervals to reach higher accuracy. The proposed method is verified on three OpenML datasets and the CWRU bearing dataset. The results show that it automatically constructs efficient domain features and outperforms traditional HPO methods and well-known ML algorithms, while also being very time efficient.
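The interval-narrowing idea in the abstract can be sketched as follows. This is an illustrative toy, not the paper's actual pipeline: the quadratic objective, the two stand-in hyper-parameters, the bin count, and the search ranges are all assumptions. It estimates each parameter's partial dependence by averaging coarse-search scores over the other parameter, keeps the best-scoring interval per parameter, and re-searches inside the reduced box.

```python
import random

def objective(c, gamma):
    # Stand-in for cross-validated accuracy of an ML pipeline
    # (hypothetical surface peaking near c = 1.0, gamma = 0.1).
    return 1.0 - (c - 1.0) ** 2 - (gamma - 0.1) ** 2

def partial_dependence(samples, scores, idx, n_bins, lo, hi):
    """Mean score per bin of parameter `idx`, marginalising the others."""
    width = (hi - lo) / n_bins
    sums, counts = [0.0] * n_bins, [0] * n_bins
    for s, sc in zip(samples, scores):
        b = min(int((s[idx] - lo) / width), n_bins - 1)
        sums[b] += sc
        counts[b] += 1
    return [sums[b] / counts[b] if counts[b] else float("-inf")
            for b in range(n_bins)]

def narrowed_search(n_coarse=200, n_fine=200, lo=0.0, hi=2.0,
                    n_bins=5, seed=0):
    rng = random.Random(seed)
    # Stage 1: coarse random search over the full 2-D space.
    samples = [(rng.uniform(lo, hi), rng.uniform(lo, hi))
               for _ in range(n_coarse)]
    scores = [objective(c, g) for c, g in samples]
    # Stage 2: per parameter, keep the interval whose partial
    # dependence (bin-averaged score) is highest.
    width = (hi - lo) / n_bins
    bounds = []
    for idx in range(2):
        pd = partial_dependence(samples, scores, idx, n_bins, lo, hi)
        best = max(range(n_bins), key=lambda b: pd[b])
        bounds.append((lo + best * width, lo + (best + 1) * width))
    # Stage 3: fine random search inside the narrowed box.
    best_score = float("-inf")
    for _ in range(n_fine):
        c = rng.uniform(*bounds[0])
        g = rng.uniform(*bounds[1])
        best_score = max(best_score, objective(c, g))
    return bounds, best_score
```

With these settings the narrowed box reliably contains the optimum, so the fine search spends its budget where the coarse partial dependencies indicate sensitivity rather than over the whole space.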
Pages: 14182–14200
Page count: 18
Related papers
50 records in total
  • [31] Modified Grid Searches for Hyper-Parameter Optimization
    Lopez, David
    Alaiz, Carlos M.
    Dorronsoro, Jose R.
    HYBRID ARTIFICIAL INTELLIGENT SYSTEMS, HAIS 2020, 2020, 12344 : 221 - 232
  • [32] Hybrid Hyper-parameter Optimization for Collaborative Filtering
    Szabo, Peter
    Genge, Bela
    2020 22ND INTERNATIONAL SYMPOSIUM ON SYMBOLIC AND NUMERIC ALGORITHMS FOR SCIENTIFIC COMPUTING (SYNASC 2020), 2020, : 210 - 217
  • [33] A review of automatic selection methods for machine learning algorithms and hyper-parameter values
    Luo, Gang
    NETWORK MODELING AND ANALYSIS IN HEALTH INFORMATICS AND BIOINFORMATICS, 2016, 5 (01):
  • [34] Hippo: Sharing Computations in Hyper-Parameter Optimization
    Shin, Ahnjae
    Jeong, Joo Seong
    Kim, Do Yoon
    Jung, Soyoung
    Chun, Byung-Gon
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2022, 15 (05): : 1038 - 1052
  • [35] Research about pruning hyper-parameter optimization method based on transfer learning in geographic information system
    Zhang X.
    Li Y.
    Li Z.
    Arabian Journal of Geosciences, 2021, 14 (5)
  • [36] Efficient Federated Learning with Adaptive Client-Side Hyper-Parameter Optimization
    Kundroo, Majid
    Kim, Taehong
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023, : 973 - 974
  • [37] Hyper-parameter optimization of deep learning model for prediction of Parkinson's disease
    Kaur, Sukhpal
    Aggarwal, Himanshu
    Rani, Rinkle
    MACHINE VISION AND APPLICATIONS, 2020, 31 (05)
  • [38] A GPU Scheduling Framework to Accelerate Hyper-Parameter Optimization in Deep Learning Clusters
    Son, Jaewon
    Yoo, Yonghyuk
    Kim, Khu-rai
    Kim, Youngjae
    Lee, Kwonyong
    Park, Sungyong
    ELECTRONICS, 2021, 10 (03) : 1 - 15
  • [39] Hyper-parameter optimization of deep learning model for prediction of Parkinson’s disease
    Sukhpal Kaur
    Himanshu Aggarwal
    Rinkle Rani
    Machine Vision and Applications, 2020, 31
  • [40] Hyper-parameter Tuning for Quantum Support Vector Machine
    Demirtas, Fadime
    Tanyildizi, Erkan
    ADVANCES IN ELECTRICAL AND COMPUTER ENGINEERING, 2022, 22 (04) : 47 - 54