Tool wear state recognition based on gradient boosting decision tree and hybrid classification RBM

Cited: 36
Authors
Li, Guofa [1 ,2 ]
Wang, Yanbo [1 ,2 ]
He, Jialong [1 ,2 ]
Hao, Qingbo [3 ]
Yang, Haiji [1 ,2 ]
Wei, Jingfeng [1 ,2 ]
Affiliations
[1] Jilin Univ, Minist Educ, Key Lab CNC Equipment Reliabil, Changchun, Jilin, Peoples R China
[2] Jilin Univ, Sch Mech & Aerosp Engn, Changchun 130022, Peoples R China
[3] Aviat Univ Air Force, Off Acad Affairs, Changchun 130000, Peoples R China
Source
INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY | 2020, Vol. 110, Issue 1-2
Funding
National Natural Science Foundation of China;
Keywords
Tool wear state recognition; Hybrid classification RBM; Contrastive divergence; RMSspectral; Gradient boosting decision tree; SUPPORT VECTOR MACHINE; FEATURE-SELECTION; DESCENT; MODEL; SVM;
D O I
10.1007/s00170-020-05890-x
CLC classification
TP [Automation technology; computer technology];
Discipline code
0812;
Abstract
Machined surface quality and dimensional accuracy are significantly affected by tool wear in the machining process, so tool wear state (TWS) recognition is highly desirable for realizing automated machining. To improve the accuracy of TWS recognition, this research develops a TWS recognition scheme based on an indirect measurement method that selects signal features strongly correlated with tool wear. Firstly, three time-domain features are proposed: a dynamic time warping feature and two entropy features. Time-domain, frequency-domain, and time-frequency-domain features of the vibration and force signals are extracted to form a feature set. Secondly, gradient boosting decision tree (GBDT) is adopted to select the optimal feature subset. Lastly, contrastive divergence (CD) and RMSspectral are used to train a hybrid classification RBM (H-ClassRBM), and the trained H-ClassRBM is used for TWS recognition. The PHM challenge 2010 data set is used to validate the proposed scheme. Experimental results show that the proposed features have better monotonicity and correlation than the classical features. Compared with CD combined with Adadelta, Adagrad, and stochastic gradient descent with momentum, the H-ClassRBM trained by CD and RMSspectral improves recognition accuracy by 1%, 2%, and 2%, respectively. Compared with a feedforward neural network, a probabilistic neural network, a Gaussian kernel support vector machine, and the H-ClassRBM alone, the proposed TWS recognition scheme improves recognition accuracy by 37%, 51%, 9%, and 8%, respectively. Therefore, the proposed TWS recognition scheme is beneficial in improving the recognition accuracy of TWS and provides an effective guide for decision-making in the machining process.
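The GBDT feature-selection step described in the abstract can be sketched as follows: train a gradient boosting classifier on the extracted signal features, rank features by importance, and keep the top-ranked subset. This is a minimal illustration on synthetic data, assuming scikit-learn's `GradientBoostingClassifier`; the subset size `k` and the data are illustrative choices, not the paper's actual settings.

```python
# Sketch of GBDT-based feature selection: rank extracted signal
# features by GBDT importance and retain the top-k subset.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n_samples, n_features = 200, 20
# Synthetic stand-in for the vibration/force feature set
X = rng.normal(size=(n_samples, n_features))
# Binary tool-wear-state labels driven by a few informative features
y = (X[:, 0] + 0.5 * X[:, 3] - X[:, 7] > 0).astype(int)

gbdt = GradientBoostingClassifier(n_estimators=50, random_state=0)
gbdt.fit(X, y)

k = 5  # size of the selected feature subset (illustrative)
top_k = np.argsort(gbdt.feature_importances_)[::-1][:k]
X_selected = X[:, top_k]  # reduced feature set passed to the classifier
```

In the paper's pipeline, `X_selected` would then be fed to the H-ClassRBM classifier in place of the full feature set.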
Pages: 511-522
Page count: 12
Related papers
50 records in total
  • [41] Strip steel surface defect recognition based on Boosting optimized decision tree
    Yang, Shui-Shan
    He, Yong-Hui
    Zhao, Wan-Sheng
    Hongwai yu Jiguang Gongcheng/Infrared and Laser Engineering, 2010, 39 (05): : 954 - 958
  • [42] Decision Tree Application to Classification Problems with Boosting Algorithm
    Zhao, Long
    Lee, Sanghyuk
    Jeong, Seon-Phil
    ELECTRONICS, 2021, 10 (16)
  • [43] Channel State Information Indoor Fingerprint Localization Algorithm Based on Locally Linear Embedding and Gradient Boosting Decision Tree
    Li Xinchun
    Zhao Zhongting
    Yu Hongshi
    LASER & OPTOELECTRONICS PROGRESS, 2022, 59 (02)
  • [44] Distributed Gradient Boosting Decision Tree Algorithm for High-dimensional and Multi- classification Problems
    Jiang J.-W.
    Fu F.-C.
    Shao Y.-X.
    Cui B.
    Ruan Jian Xue Bao/Journal of Software, 2019, 30 (03): : 784 - 798
  • [45] Identifying Transportation Modes Using Gradient Boosting Decision Tree
    Fu, Xin
    Wang, Dong
    Zhang, Hengcai
    INTELLIGENT COMPUTING THEORIES AND APPLICATION, PT II, 2018, 10955 : 543 - 549
  • [46] Tracking-by-Segmentation with Online Gradient Boosting Decision Tree
    Son, Jeany
    Jung, Ilchae
    Park, Kayoung
    Han, Bohyung
    2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015, : 3056 - 3064
  • [47] HarpGBDT: Optimizing Gradient Boosting Decision Tree for Parallel Efficiency
    Peng, Bo
    Chen, Langshi
    Li, Jiayu
    Jiang, Miao
    Akkas, Selahattin
    Smirnov, Egor
    Israfilov, Ruslan
    Khekhnev, Sergey
    Nikolaev, Andrey
    Qiu, Judy
    2019 IEEE INTERNATIONAL CONFERENCE ON CLUSTER COMPUTING (CLUSTER), 2019, : 182 - 192
  • [48] Unbiased Gradient Boosting Decision Tree with Unbiased Feature Importance
    Zhang, Zheyu
    Zhang, Tianping
    Li, Jian
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 4629 - 4637
  • [49] Exploiting GPUs for Efficient Gradient Boosting Decision Tree Training
    Wen, Zeyi
    Shi, Jiashuai
    He, Bingsheng
    Chen, Jian
    Ramamohanarao, Kotagiri
    Li, Qinbin
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2019, 30 (12) : 2706 - 2717
  • [50] An Elastic Gradient Boosting Decision Tree for Concept Drift Learning
    Wang, Kun
    Liu, Anjin
    Lu, Jie
    Zhang, Guangquan
    Xiong, Li
    AI 2020: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 12576 : 420 - 432