Quantifying predictive uncertainty in damage classification for nondestructive evaluation using Bayesian approximation and deep learning

Times Cited: 2
Authors
Li, Zi [1 ]
Deng, Yiming [1 ]
Affiliations
[1] Michigan State Univ, Dept Elect & Comp Engn, E Lansing, MI 48824 USA
Keywords
uncertainty quantification; magnetic flux leakage; inverse NDE; Bayesian approximation; deep learning; FLUX LEAKAGE SIGNALS; QUANTIFICATION; RECONSTRUCTION; PROPAGATION; CRACKS; SIMULATION; INVERSION; PARAMETER; ENSEMBLE; MODEL;
DOI
10.1088/1361-6420/ad2f63
CLC Number
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
Magnetic flux leakage (MFL) is a widely used nondestructive evaluation (NDE) method for inspecting pipelines to prevent potential long-term failures. However, during field testing, uncertainties can affect the accuracy of the inspection and the decision-making process regarding damage conditions. It is therefore essential to identify and quantify these uncertainties to ensure the reliability of the inspection. This study focuses on the uncertainties that arise during the inverse NDE process due to the dynamic magnetization process, which is affected by the relative motion of the MFL sensor and the material being tested. Specifically, the study investigates the uncertainties caused by sensing liftoff, which can affect the output signal of the sensing system. Because of the complexity of describing the forward uncertainty propagation process, this study compares two typical machine learning (ML)-based approximate Bayesian inference methods, a convolutional neural network and a deep ensemble, to address the input uncertainty from the MFL response data. In addition, an autoencoder is applied to tackle the lack of experimental data for model training by augmenting the dataset; the autoencoder is constructed from a pre-trained model via transfer learning. Prior knowledge learned from a large set of simulated MFL signals is used to fine-tune the autoencoder, which enhances the subsequent learning process on experimental MFL data with faster generalization. The augmented data from the fine-tuned autoencoder are further applied to ML-based defect size classification. The study analyzes prediction accuracy and uncertainty with calibration, which evaluates the prediction performance and reveals the relation between liftoff uncertainty and prediction accuracy. Furthermore, to strengthen the trustworthiness of the prediction results, an uncertainty-guided decision-making process is applied to provide valuable insight into the reliability of the final predictions. Overall, the proposed uncertainty quantification framework offers valuable insights into the assessment of reliability in MFL-based decision-making and inverse problems.
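To make the abstract's workflow concrete, the following is a minimal sketch (not the authors' code) of ML-based approximate Bayesian inference for defect-size classification, using Monte Carlo dropout to draw stochastic predictions and predictive entropy for an uncertainty-guided decision rule. The network architecture, input shape (1-D MFL scans of 256 samples), number of classes, and the entropy threshold are all illustrative assumptions; the paper's deep-ensemble variant would instead average over independently trained models.

```python
# Minimal sketch of MC-dropout uncertainty for defect-size classification.
# All layer sizes, input shapes, and thresholds are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MFLClassifier(nn.Module):
    """Small 1-D CNN with dropout; dropout is kept active at test time for MC sampling."""

    def __init__(self, n_classes: int = 4, p_drop: float = 0.3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.Dropout(p_drop),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.Dropout(p_drop),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.head(self.features(x).squeeze(-1))


@torch.no_grad()
def mc_dropout_predict(model, x, n_samples: int = 50):
    """Average softmax over stochastic forward passes; return mean probabilities and predictive entropy."""
    model.train()  # keep dropout stochastic: each pass is an approximate posterior sample
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    mean_probs = probs.mean(dim=0)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy


if __name__ == "__main__":
    model = MFLClassifier()
    x = torch.randn(8, 1, 256)            # 8 hypothetical MFL scans, 256 samples each
    mean_probs, entropy = mc_dropout_predict(model, x)
    confident = entropy < 0.5             # illustrative threshold for trusting a prediction
    print("predicted class:   ", mean_probs.argmax(dim=-1).tolist())
    print("predictive entropy:", entropy.round(decimals=3).tolist())
    print("accept prediction: ", confident.tolist())
```

In this sketch, predictions whose entropy exceeds the threshold would be flagged for manual review rather than acted on, which mirrors the uncertainty-guided decision-making described in the abstract.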
Pages: 28
Related Papers
50 records in total
  • [1] Fast Predictive Uncertainty for Classification with Bayesian Deep Networks
    Hobbhahn, Marius
    Kristiadi, Agustinus
    Hennig, Philipp
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, VOL 180, 2022, 180 : 822 - 832
  • [2] Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
    Gal, Yarin
    Ghahramani, Zoubin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [3] Evaluation of Image Classification for Quantifying Mitochondrial Morphology Using Deep Learning
    Tsutsumi, Kaori
    Tokunaga, Keima
    Saito, Shun
    Sasase, Tatsuya
    Sugimori, Hiroyuki
    ENDOCRINE METABOLIC & IMMUNE DISORDERS-DRUG TARGETS, 2023, 23 (02) : 214 - 221
  • [4] Quantifying uncertainty in deep learning approaches to radio galaxy classification
    Mohan, Devina
    Scaife, Anna M. M.
    Porter, Fiona
    Walmsley, Mike
    Bowles, Micah
    MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, 2022, 511 (03) : 3722 - 3740
  • [5] Quantifying Predictive Uncertainty in Medical Image Analysis with Deep Kernel Learning
    Wu, Zhiliang
    Yang, Yinchong
    Gu, Jindong
    Tresp, Volker
    2021 IEEE 9TH INTERNATIONAL CONFERENCE ON HEALTHCARE INFORMATICS (ICHI 2021), 2021, : 63 - 72
  • [6] Bayesian Deep Learning for Hyperspectral Image Classification With Low Uncertainty
    He, Xin
    Chen, Yushi
    Huang, Lingbo
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [7] Towards quantifying the uncertainty in in silico predictions using Bayesian learning
    Allen, Timothy E. H.
    Middleton, Alistair M.
    Goodman, Jonathan M.
    Russell, Paul J.
    Kukic, Predrag
    Gutsell, Steve
    COMPUTATIONAL TOXICOLOGY, 2022, 23
  • [8] Quantifying Uncertainty in Discrete-Continuous and Skewed Data with Bayesian Deep Learning
    Vandal, Thomas
    Kodra, Evan
    Dy, Jennifer
    Ganguly, Sangram
    Nemani, Ramakrishna
    Ganguly, Auroop R.
    KDD'18: PROCEEDINGS OF THE 24TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2018, : 2377 - 2386
  • [9] Quantifying uncertainty in Bayesian Networks structural learning
    Barth, Vitor O.
    Caetano, Henrique O.
    Maciel, Carlos D.
    Aiello, Marco
    IEEE CONFERENCE ON EVOLVING AND ADAPTIVE INTELLIGENT SYSTEMS 2024, IEEE EAIS 2024, 2024, : 200 - 207