Equivariant Bootstrapping for Uncertainty Quantification in Imaging Inverse Problems

Times Cited: 0
Authors
Tachella, Julian [1 ,2 ]
Pereyra, Marcelo [3 ,4 ]
Affiliations
[1] CNRS, Lab Phys, Paris, France
[2] ENS Lyon, Lyon, France
[3] Heriot Watt Univ, Edinburgh, Midlothian, Scotland
[4] Maxwell Inst Math Sci, Edinburgh, Midlothian, Scotland
Funding
UK Research and Innovation (UKRI)
Keywords
PRIORS; NETWORK;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Scientific imaging problems are often severely ill-posed and hence have significant intrinsic uncertainty. Accurately quantifying the uncertainty in the solutions to such problems is therefore critical for the rigorous interpretation of experimental results as well as for reliably using the reconstructed images as scientific evidence. Unfortunately, existing imaging methods are unable to quantify the uncertainty in the reconstructed images in a way that is robust to experiment replications. This paper presents a new uncertainty quantification methodology based on an equivariant formulation of the parametric bootstrap algorithm that leverages symmetries and invariance properties commonly encountered in imaging problems. Additionally, the proposed methodology is general and can be easily applied with any image reconstruction technique, including unsupervised training strategies that can be trained from observed data alone, thus enabling uncertainty quantification in situations where there is no ground truth data available. We demonstrate the proposed approach with a series of experiments and comparisons with alternative state-of-the-art uncertainty quantification strategies. In all our experiments, the proposed equivariant bootstrap delivers remarkably accurate high-dimensional confidence regions and outperforms the competing approaches in terms of estimation accuracy, uncertainty quantification accuracy, and computing time. These empirical findings are supported by a detailed theoretical analysis of equivariant bootstrap for linear estimators.
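To make the idea in the abstract concrete, the following is a minimal illustrative sketch of an equivariant parametric bootstrap for a linear inverse problem y = A x + noise, using cyclic shifts as the symmetry group and a Tikhonov-regularized estimator as a stand-in for an arbitrary reconstruction method. The operator, noise level, group choice, and estimator here are all illustrative assumptions, not the paper's exact setup: each bootstrap round transforms the current estimate by a random group element, re-simulates data through the forward model, reconstructs, and maps the result back with the inverse transform.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, sigma = 64, 32, 0.05

# Toy piecewise-constant signal and a random linear forward operator
x_true = np.zeros(n)
x_true[20:30] = 1.0
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + sigma * rng.standard_normal(m)

def reconstruct(y, lam=0.1):
    """Tikhonov-regularized least squares (stand-in for any estimator f)."""
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

x_hat = reconstruct(y)

# Equivariant bootstrap: apply a random group element T_g (cyclic shift)
# to the estimate, simulate new data, reconstruct, and invert T_g.
B = 200
samples = np.empty((B, n))
for b in range(B):
    s = rng.integers(n)                               # random group element
    x_g = np.roll(x_hat, s)                           # T_g x_hat
    y_b = A @ x_g + sigma * rng.standard_normal(m)    # simulated measurement
    samples[b] = np.roll(reconstruct(y_b), -s)        # T_g^{-1} f(y_b)

# Pixel-wise 95% bootstrap confidence intervals around the estimate
lo, hi = np.percentile(samples, [2.5, 97.5], axis=0)
```

Without the transform step this reduces to the standard parametric bootstrap; the random group action is what lets the resampling explore directions the estimator would otherwise systematically miss.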
Pages: 15