A transformer-based end-to-end data-driven model for multisensor time series monitoring of machine tool condition

Cited by: 3
Authors
Agung', Oroko Joanes [1 ]
James, Kimotho [2 ]
Samuel, Kabini [1 ]
Evan, Murimi [1 ]
Affiliations
[1] Jomo Kenyatta Univ Agr & Technol, Mech Engn Dept, Nairobi, Kenya
[2] Jomo Kenyatta Univ Agr & Technol, Mechatron Engn Dept, Nairobi, Kenya
Keywords
deep learning; end-to-end model; tool condition monitoring; transformer; neural network; wear; system; health; prediction
DOI: 10.1002/eng2.12598
Chinese Library Classification: TP39 (computer applications)
Subject classification codes: 081203; 0835
Abstract
Online determination of a cutter's health status is crucial for attaining condition-based automated tool change in computer numerically controlled (CNC) machining. Because direct condition measurement is impractical, data-based modeling of monitoring signals provides a viable practical route. However, the highly noisy and redundant nature of the associated data negatively impacts model accuracy and typically calls for additional preprocessing before modeling. Additionally, the long sequential data exhibit widely varying condition distributions across different cutters, even cutters from the same batch machining under similar parameters, posing a challenge to model generalization. An end-to-end model has thus been developed that works directly on unprocessed data to establish globally sensitive features from varying distributions for online tool wear estimation in CNC machining. The model comprises three main functional blocks. First, a data denoising and feature selection block processes raw multisensor data automatically, dispensing with the scaling or preprocessing of inputs that is conventionally done. Each sensor channel's independence is preserved during initial processing, ensuring that complementary information from different sensors is utilized while existing redundancies are minimized. The weighted denoised data are then passed through a transformer encoder block to determine global dependencies in the time-series sequence, regardless of time-step position. The learned features are finally fed to an upper supervised learning block for association with the monitored wear condition. The developed model works directly on raw noisy data irrespective of scaling differences, saving preprocessing computational cost. The global associations extracted over long sequences by the transformer encoder allow the model to generalize to varying wear distributions. The parallel processing structure across all channels ensures complementary information is utilized, minimizing unforeseen model bias. The model's performance, evaluated on experimental milling data and compared against other models reported on the same dataset, attains results comparable to the state of the art.
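The abstract gives no implementation details, so the following is only a minimal NumPy sketch of the three-block idea it describes: independent per-channel denoising of raw multisensor data, global self-attention over all time steps, and a pooled supervised head estimating wear. All layer sizes, weights, and the moving-average filter are invented here for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_denoise(x, kernel):
    # Block 1 (sketch): each sensor channel is smoothed independently,
    # preserving channel independence so complementary information survives.
    return np.stack([np.convolve(x[:, c], kernel, mode="same")
                     for c in range(x.shape[1])], axis=1)

def self_attention(x, d_k):
    # Block 2 (sketch): single-head scaled dot-product attention, the core of
    # a transformer encoder; every time step attends to every other step,
    # capturing global dependencies regardless of position.
    d = x.shape[1]
    Wq, Wk, Wv = (rng.standard_normal((d, d_k)) / np.sqrt(d) for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    weights = softmax(q @ k.T / np.sqrt(d_k), axis=-1)
    return weights @ v

# Toy input: 128 time steps from 3 sensor channels (e.g. force, vibration, AE)
x = rng.standard_normal((128, 3))
denoised = channel_denoise(x, np.ones(5) / 5)            # block 1
features = self_attention(denoised, d_k=8)               # block 2
w_head = rng.standard_normal(8) / np.sqrt(8)
wear_estimate = float(features.mean(axis=0) @ w_head)    # block 3: pooled head
print(denoised.shape, features.shape)
```

In a trained model the filter, attention weights, and head would be learned end to end from labeled wear data rather than drawn at random; the sketch only shows how the three blocks compose on raw, unscaled input.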
Pages: 16