A new hyper-parameter optimization method for machine learning in fault classification

Cited: 0
Authors
Xingchen Ye
Liang Gao
Xinyu Li
Long Wen
Affiliations
[1] Huazhong University of Science and Technology,The State Key Laboratory of Digital Manufacturing Equipment & Technology
[2] China University of Geosciences,School of Mechanical Engineering and Electronic Information
Source
Applied Intelligence | 2023, Vol. 53
Keywords
Hyper-parameter optimization; Fault classification; Dimension reduction; Partial dependencies
DOI: Not available
Abstract
Accurate bearing fault classification is essential for the safe and stable operation of rotating machinery. The success of Machine Learning (ML) in fault classification depends mainly on efficient features and well-chosen pre-defined hyper-parameters. Various hyper-parameter optimization (HPO) methods have been proposed to tune the hyper-parameters of ML algorithms in low dimensions, but they ignore the hyper-parameters of Feature Engineering (FE). Because both FE and the ML algorithm contain many hyper-parameters, the overall hyper-parameter dimension is high. This paper proposes a new HPO method for high dimensions based on dimension reduction and partial dependencies. First, the whole hyper-parameter space is separated into two subspaces, one for FE and one for the ML algorithm, to reduce time consumption. Second, since the relationship between the hyper-parameters and model performance is nonlinear, the sensitive intervals of the hyper-parameters are identified by partial dependencies. HPO is then conducted within these intervals to obtain higher accuracy. The proposed method is verified on three OpenML datasets and the CWRU bearing dataset. The results show that it can automatically construct efficient domain features and that it outperforms traditional HPO methods and well-known ML algorithms. The proposed method is also highly time-efficient.
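The abstract outlines a two-stage idea: split the hyper-parameter space into an FE subspace and an ML-algorithm subspace, use partial dependencies to locate the sensitive interval of each hyper-parameter, and then search only within those intervals. The following sketch is a minimal illustration of that idea, not the authors' implementation: the PCA-plus-random-forest pipeline, the toy grids, and the one-dimensional sweep used as a crude hyper-parameter partial dependence are all assumptions made for the example.

```python
# Illustrative sketch (assumed pipeline and grids, not the paper's method):
# 1) coarsely sweep each hyper-parameter with the others held at defaults
#    (a crude 1-D "partial dependence" of CV accuracy on that hyper-parameter),
# 2) keep the interval around the best-scoring grid point,
# 3) run a finer random search restricted to those sensitive intervals.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA                    # stand-in for feature engineering
from sklearn.ensemble import RandomForestClassifier     # stand-in for the ML algorithm
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

X, y = load_breast_cancer(return_X_y=True)

def cv_accuracy(params):
    """5-fold CV accuracy of a PCA + random-forest pipeline for one setting."""
    pipe = Pipeline([
        ("fe", PCA(n_components=params["n_components"])),
        ("clf", RandomForestClassifier(n_estimators=params["n_estimators"],
                                       max_depth=params["max_depth"],
                                       random_state=0)),
    ])
    return cross_val_score(pipe, X, y, cv=5).mean()

defaults = {"n_components": 10, "n_estimators": 100, "max_depth": 5}
coarse_grids = {
    "n_components": [2, 5, 10, 20, 30],      # FE subspace
    "n_estimators": [50, 100, 200, 400],     # ML-algorithm subspace
    "max_depth":    [2, 5, 10, 20],
}

def sensitive_interval(name):
    """Sweep one hyper-parameter, others fixed at defaults; bracket the best value."""
    scores = [cv_accuracy(dict(defaults, **{name: v})) for v in coarse_grids[name]]
    best = int(np.argmax(scores))
    grid = coarse_grids[name]
    return grid[max(best - 1, 0)], grid[min(best + 1, len(grid) - 1)]

intervals = {name: sensitive_interval(name) for name in coarse_grids}

# Fine random search restricted to the sensitive intervals.
rng = np.random.default_rng(0)
best_params, best_score = None, -np.inf
for _ in range(20):
    trial = {name: int(rng.integers(lo, hi + 1)) for name, (lo, hi) in intervals.items()}
    score = cv_accuracy(trial)
    if score > best_score:
        best_params, best_score = trial, score

print("sensitive intervals:", intervals)
print("best params:", best_params, "CV accuracy: %.3f" % best_score)
```

The restriction to sensitive intervals is what keeps the fine search cheap: the coarse sweeps cost only a handful of cross-validation runs per hyper-parameter, and the subsequent search no longer has to cover the full high-dimensional space.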
Pages: 14182 - 14200
Page count: 18
Related papers
50 records in total
  • [1] A new hyper-parameter optimization method for machine learning in fault classification
    Ye, Xingchen
    Gao, Liang
    Li, Xinyu
    Wen, Long
    APPLIED INTELLIGENCE, 2023, 53 (11) : 14182 - 14200
  • [2] An efficient hyper-parameter optimization method for supervised learning
    Shi, Ying
    Qi, Hui
    Qi, Xiaobo
    Mu, Xiaofang
    APPLIED SOFT COMPUTING, 2022, 126
  • [3] Federated learning with hyper-parameter optimization
    Kundroo, Majid
    Kim, Taehong
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2023, 35 (09)
  • [4] Classification complexity assessment for hyper-parameter optimization
    Cai, Ziyun
    Long, Yang
    Shao, Ling
    PATTERN RECOGNITION LETTERS, 2019, 125 : 396 - 403
  • [5] A study on depth classification of defects by machine learning based on hyper-parameter search
    Chen, Haoze
    Zhang, Zhijie
    Yin, Wuliang
    Zhao, Chenyang
    Wang, Fengxiang
    Li, Yanfeng
    MEASUREMENT, 2022, 189
  • [6] Hyper-Parameter Optimization Using MARS Surrogate for Machine-Learning Algorithms
    Li, Yangyang
    Liu, Guangyuan
    Lu, Gao
    Jiao, Licheng
    Marturi, Naresh
    Shang, Ronghua
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2020, 4 (03): : 287 - 297
  • [7] CNN hyper-parameter optimization for environmental sound classification
    Inik, Ozkan
    APPLIED ACOUSTICS, 2023, 202
  • [8] Cultural Events Classification using Hyper-parameter Optimization of Deep Learning Technique
    Feng Zhipeng
    Gani, Hamdan
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (05) : 603 - 609
  • [9] Quantum Inspired High Dimensional Hyper-Parameter Optimization of Machine Learning Model
    Li, Yangyang
    Lu, Gao
    Zhou, Linhao
    Jiao, Licheng
    2017 INTERNATIONAL SMART CITIES CONFERENCE (ISC2), 2017,
  • [10] A New Baseline for Automated Hyper-Parameter Optimization
    Geitle, Marius
    Olsson, Roland
    MACHINE LEARNING, OPTIMIZATION, AND DATA SCIENCE, 2019, 11943 : 521 - 530