Integration of Bayesian optimization into hyperparameter tuning of the particle swarm optimization algorithm to enhance neural networks in bearing failure classification

Cited by: 0
Authors
Soares, Ricardo Cardoso [1 ]
Silva, Julio Cesar [2 ]
de Lucena, Jose Anselmo [2 ]
Filho, Abel Cavalcante Lima [2 ]
de Souza Ramos, Jorge Gabriel Gomes [4 ]
Brito, Alisson V. [2 ,3 ]
Affiliations
[1] Department of Industry, IFPI, Praça da Liberdade, 1597, Teresina, Piauí, 64000-040, Brazil
[2] PPGEM, UFPB, Campus I Lot. Cidade Universitária, João Pessoa, Paraíba, 58051-900, Brazil
[3] Center of Informatics, UFPB, Campus I Lot. Cidade Universitária, João Pessoa, Paraíba, 58051-900, Brazil
[4] Department of Physics, UFPB, Campus I Lot. Cidade Universitária, João Pessoa, Paraíba, 58051-900, Brazil
Keywords
Particle swarm optimization (PSO);
DOI
10.1016/j.measurement.2024.115829
Abstract
Bearings are a primary source of defects in induction motors (IM), requiring effective diagnostic methods. Neural networks (NNs) are useful for this purpose, but optimizing their parameters is challenging. This study presents an alternative approach that modifies particle swarm optimization (PSO) by integrating Bayesian optimization to dynamically tune the PSO hyperparameters, enhancing the NN's ability to detect defects in IM bearings. Unlike traditional methods with empirically determined hyperparameters, this approach adapts to varying data conditions for better performance. The method was tested using vibration and current signals of different durations (2 s, 1 s, 0.5 s, 0.25 s) and torque ranges (0 Nm to 22 Nm) from a laboratory-generated dataset, and the results were compared with those obtained using other optimizers. The accuracy achieved was 92.57% for vibration signals at 0.25 s and 97.23% across torque ranges; for current signals, the accuracy was 91.29% for 0.25 s samples and 97.5% across torque ranges. © 2024 Elsevier Ltd
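The abstract describes a two-level search: an outer Bayesian-optimization loop proposes PSO hyperparameters (typically the inertia weight w and the acceleration coefficients c1 and c2), and each proposal is scored by running the inner PSO and measuring the quality of its result. The paper's own implementation is not reproduced here; the following is a minimal sketch of that structure, assuming scikit-optimize's gp_minimize for the outer Gaussian-process loop, illustrative hyperparameter ranges, and a toy Rastrigin function standing in for the NN's validation error on the bearing signals.

```python
# Sketch of BO-over-PSO (not the authors' code): an outer Bayesian-optimization
# loop tunes the PSO hyperparameters (w, c1, c2); the inner PSO minimizes a
# fitness function. The Rastrigin test function below is a stand-in for the
# real objective, i.e. the NN's misclassification rate on vibration/current data.
import numpy as np
from skopt import gp_minimize
from skopt.space import Real

rng = np.random.default_rng(0)

def fitness(x):
    # Multimodal toy objective; replace with NN validation error in practice.
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def pso(w, c1, c2, dim=5, n_particles=20, n_iters=50):
    """Plain global-best PSO; returns the best fitness found."""
    pos = rng.uniform(-5.12, 5.12, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -5.12, 5.12)
        vals = np.array([fitness(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return pbest_val.min()

# Outer loop: GP-based Bayesian optimization over the PSO hyperparameters.
# The ranges below are illustrative, not taken from the paper.
space = [Real(0.1, 0.9, name="w"),
         Real(0.5, 2.5, name="c1"),
         Real(0.5, 2.5, name="c2")]
res = gp_minimize(lambda h: pso(*h), space, n_calls=25, random_state=0)
print("best (w, c1, c2):", res.x, "-> fitness:", res.fun)
```

Treating a full inner PSO run as a black-box objective is what keeps the coupling simple: the Gaussian-process surrogate only ever sees (w, c1, c2) → best-fitness pairs, so the same outer loop applies whether the inner fitness is a test function, as here, or an NN trained and validated on vibration or current samples.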
Related papers (50 records in total)
  • [1] Hyperparameter Tuning for Deep Neural Networks Based Optimization Algorithm
    Vidyabharathi, D.
    Mohanraj, V.
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2023, 36 (03) : 2559 - 2573
  • [2] Hyperparameter Optimization for Convolutional Neural Networks using the Salp Swarm Algorithm
    Abdulsaed, E.H.
    Alabbas, M.
    Khudeyer, R.S.
    Informatica (Slovenia), 2023, 47 (09) : 133 - 144
  • [3] Neural network hyperparameter optimization based on improved particle swarm optimization
    Xie, X.
    He, W.
    Zhu, Y.
    Yu, J.
    High Technology Letters, 2023, 29 (04) : 427 - 433
  • [4] A Bayesian particle swarm optimization algorithm
    Heng, Xingchen
    Qin, Zheng
    Wang, Xianhui
    Shao, Liping
    CHINESE JOURNAL OF ELECTRONICS, 2006, 15 (4A) : 937 - 940
  • [5] Efficient Hyperparameter Optimization for Convolution Neural Networks in Deep Learning: A Distributed Particle Swarm Optimization Approach
    Guo, Yu
    Li, Jian-Yu
    Zhan, Zhi-Hui
    CYBERNETICS AND SYSTEMS, 2020, 52 (01) : 36 - 57
  • [6] Hyperparameter Optimization for Convolutional Neural Networks with Genetic Algorithms and Bayesian Optimization
    Puentes G., David E.
    Barrios H., Carlos J.
    Navaux, Philippe O. A.
    2022 IEEE LATIN AMERICAN CONFERENCE ON COMPUTATIONAL INTELLIGENCE (LA-CCI), 2022, : 131 - 135
  • [7] An effective algorithm for hyperparameter optimization of neural networks
    Diaz, G. I.
    Fokoue-Nkoutche, A.
    Nannicini, G.
    Samulowitz, H.
    IBM JOURNAL OF RESEARCH AND DEVELOPMENT, 2017, 61 (4-5)
  • [8] Basic Enhancement Strategies When Using Bayesian Optimization for Hyperparameter Tuning of Deep Neural Networks
    Cho, Hyunghun
    Kim, Yongjin
    Lee, Eunjung
    Choi, Daeyoung
    Lee, Yongjae
    Rhee, Wonjong
    IEEE ACCESS, 2020, 8 : 52588 - 52608